CN114639005A - Multi-classifier fused crop automatic classification method and system and storage medium - Google Patents
- Publication number: CN114639005A (application CN202210548609.7A)
- Authority: CN (China)
- Prior art keywords: classification, data, crop, time sequence, precision
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06V20/188—Vegetation (terrestrial scenes)
- G06F18/10—Pattern recognition: pre-processing; data cleansing
- G06F18/2415—Classification techniques based on parametric or probabilistic models
- G06F18/25—Fusion techniques
- G06F18/254—Fusion of classification results related to the same input data
- G06F18/259—Fusion by voting
- G06V10/62—Extraction of image or video features relating to a temporal dimension
- G06V10/809—Fusion of classification results where the classifiers operate on the same input data
- G06V10/811—Fusion of classification results where the classifiers operate on different input data (multi-modal recognition)
Abstract
The invention discloses a multi-classifier fused crop automatic classification method, system and storage medium. The method comprises the following steps: acquiring and preprocessing data to obtain preprocessed high-resolution data and a multispectral time-series reflectance; calculating an NDVI time series from the multispectral time-series reflectance; selecting the multispectral image data participating in classification, based on the calculated NDVI time series combined with crop type training samples; inputting the high-resolution data, the NDVI time series and the selected multispectral image data into a classifier set to obtain an initial classification result set; and executing the OCA-MV algorithm on the initial classification result set to complete decision fusion classification.
Description
Technical Field
The invention relates to the technical field of crop classification, in particular to a decision-level crop automatic classification method (OCA-MV) that combines the Overall Classification Accuracy (OCA) index with the traditional Majority Vote (MV) mechanism; specifically, it relates to a multi-classifier fused crop automatic classification method, system and storage medium.
Background
Existing remote-sensing-based crop automatic classification methods usually rely on a single classifier and a single data source, whereas multispectral data, high-resolution data and vegetation index time series offer different, complementary advantages for distinguishing crop types. Multispectral data carry the spectral information needed to separate crop types, but their limited spatial resolution means their classification accuracy suffers from the mixed-pixel effect; increasing the spatial resolution (high-resolution data) is the most effective way to alleviate that effect. A vegetation index time series reflects the phenological differences between crop types and further mitigates the spectral confusion (the same object exhibiting different spectra, and different objects exhibiting the same spectrum) that degrades classification accuracy when single-date optical imagery (multispectral or high-resolution data) is used for crop type identification. At the same time, different classifiers often perform differently when building classification models for different regions and different crop targets, so a single classifier rarely delivers stable classification accuracy.
Decision-level fusion of multi-source remote sensing data classifies the imagery from each input data source and then combines the classification results according to decision rules to produce an optimal decision. Decision-level fusion can effectively combine the strengths of multi-source remote sensing data and multiple classifiers for crop classification and improve classification accuracy. However, the traditional decision-level fusion strategy of Majority Vote (MV) ignores the influence of classifier performance differences on the decision result, and the Adaptive Majority Vote (AMV) method, which adds to MV an adaptive weight computed from each classifier's overall accuracy, still fails to reflect how differently individual classifiers perform on individual classes. Neither fusion algorithm therefore fully captures classifier performance differences, which limits the accuracy of the classification result.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a multi-classifier fused crop automatic classification method, system and storage medium for improving the accuracy of automatic remote-sensing crop classification.
According to an aspect of the present specification, there is provided a multi-classifier fused crop automatic classification method, including:
acquiring and preprocessing data to obtain preprocessed high-resolution data and a multispectral time-series reflectance;
calculating an NDVI time series from the multispectral time-series reflectance;
selecting the multispectral image data participating in classification, based on the calculated NDVI time series combined with crop type training samples;
inputting the high-resolution data, the NDVI time series and the selected multispectral image data into a classifier set to obtain an initial classification result set;
and executing the OCA-MV algorithm on the initial classification result set to complete decision fusion classification.
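The NDVI step above follows the standard definition NDVI = (NIR - Red) / (NIR + Red), applied to each acquisition date in the reflectance time series. A minimal sketch (the function name and array layout are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def ndvi_time_series(red_stack, nir_stack):
    """Compute an NDVI time series from co-registered red and near-infrared
    reflectance stacks of shape (time, height, width)."""
    red = np.asarray(red_stack, dtype=np.float64)
    nir = np.asarray(nir_stack, dtype=np.float64)
    denom = nir + red
    ndvi = np.zeros_like(denom)
    # Avoid division by zero over pixels with no signal (e.g. water, shadow).
    np.divide(nir - red, denom, out=ndvi, where=denom != 0)
    return ndvi

# Toy check: uniform reflectance of 0.1 (red) and 0.5 (NIR) on 3 dates.
red = np.full((3, 2, 2), 0.1)
nir = np.full((3, 2, 2), 0.5)
print(ndvi_time_series(red, nir)[0, 0, 0])  # (0.5 - 0.1) / (0.5 + 0.1)
```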
In the above technical scheme, high-resolution data, an NDVI time series and multispectral remote sensing data are combined with a classifier set covering multiple classifiers to build an automatic crop classification model that performs the initial classification of crops, effectively combining the spectral detection capability of multispectral remote sensing data, the spatial-structure detection capability of high-resolution remote sensing data, and the ability of long time-series remote sensing data to detect phenological differences between crops. The scheme then performs decision fusion classification on the initial classification results by combining the overall classification accuracy index with the traditional majority vote mechanism, using the overall classification accuracy index to characterize how differently individual classifiers perform on individual crop types.
By fusing the strengths of different classifiers in automatic crop identification, and by letting the overall classification accuracy index capture their per-class performance differences, the scheme effectively improves automatic crop classification accuracy. It addresses the mixed-pixel and spectral-confusion problems and the poor stability of single-classifier, single-data-source remote-sensing crop classification, as well as the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
As a further technical solution, the data acquisition and preprocessing further comprises:
collecting high-resolution data when more than a preset percentage of the crops in the work area are in their growing period, and collecting raw multispectral image data covering all periods within one year;
and preprocessing the collected high-resolution data and raw multispectral image data to obtain the preprocessed high-resolution data and multispectral time-series reflectance.
Specifically, the high-resolution data collection time is set within the growing period of most crops in the work area. A threshold may be set, and high-resolution data collection begins when the proportion of crops in the work area that are in the growing season exceeds that threshold.
Specifically, the acquisition window of the raw multispectral image data is set to one year, so as to provide long time-series multi-source remote sensing data.
Further, preprocessing the collected high-resolution data and raw multispectral imagery comprises sequentially applying radiometric correction, atmospheric correction, orthorectification and clipping, yielding the preprocessed high-resolution data and multispectral time-series reflectance.
As a further technical solution, selecting the multispectral reflectance data participating in classification further comprises:
acquiring training samples of each crop type and, combined with the calculated NDVI time series, deriving the mean NDVI time-series curve of the training area of each crop type;
and, taking these mean curves as the reference, comparing the NDVI time-series curves of the different crop types and selecting the multispectral reflectance data acquired at the time when the curves differ most as the multispectral image data participating in classification.
On the basis of the calculated NDVI time series, the technical scheme thus derives a mean NDVI curve per crop-type training area and selects the multispectral reflectance data corresponding to the date of maximum inter-curve difference, so that the spectral detection capability of multispectral remote sensing data is fully exploited during automatic crop classification, while avoiding the data redundancy and low computational efficiency of using the raw multispectral image stack directly.
During the initial classification, the NDVI time series and the selected multispectral image data are input into the automatic crop classification model, which guarantees computational efficiency while fully exploiting the advantages of long time-series and multispectral remote sensing data.
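The date-selection rule above can be sketched with a hypothetical helper, under the assumption that inter-class difference is measured as the spread between the highest and lowest class-mean NDVI at each date (the patent does not prescribe this exact measure):

```python
import numpy as np

def select_classification_date(ndvi_ts, class_masks):
    """Pick the acquisition date at which the crop types' mean NDVI curves
    differ the most.

    ndvi_ts: NDVI array of shape (time, height, width).
    class_masks: dict mapping crop type -> boolean mask of shape
                 (height, width) marking that type's training pixels.
    Returns the index of the date with the largest inter-class spread.
    """
    # One mean NDVI curve per crop type, stacked to shape (n_types, time).
    curves = np.stack([ndvi_ts[:, m].mean(axis=1) for m in class_masks.values()])
    spread = curves.max(axis=0) - curves.min(axis=0)  # per-date class spread
    return int(spread.argmax())

# Toy example: two dates, two crop types; the types separate only on date 1.
ndvi_ts = np.array([[[0.2, 0.3]], [[0.2, 0.8]]])   # (time=2, 1, 2)
masks = {"wheat": np.array([[True, False]]),
         "barley": np.array([[False, True]])}
print(select_classification_date(ndvi_ts, masks))  # 1
```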
As a further technical solution, the method further comprises: constructing the automatic crop classification model from the 3 input data sources and 7 supervised classification algorithms, and outputting the initial classification result set.
Further, the 7 supervised classification algorithms comprise the minimum distance, maximum likelihood, Mahalanobis distance, spectral angle mapper, spectral information divergence, support vector machine and random forest algorithms.
Further, the constructed automatic crop classification model can be trained, and its accuracy evaluated, by obtaining crop type samples. Obtaining crop type samples further comprises: collecting crop-type spatial distribution data for the work area, or conducting a field survey, to select crop type samples within the work area; then randomly dividing the crop sample areas into training samples and verification samples at a 1:1 ratio by stratified random sampling, to be used for training the automatic classification model and for evaluating the accuracy of the classification results, respectively.
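The stratified 1:1 split described above can be sketched as follows (the sample representation and function name are assumptions for illustration):

```python
import random
from collections import defaultdict

def stratified_split(samples, ratio=0.5, seed=0):
    """Randomly split labelled sample areas into training and verification
    sets, stratified by crop type: each type is shuffled and split
    independently, so both sets preserve the class proportions.

    samples: list of (sample_id, crop_type) pairs.
    """
    rng = random.Random(seed)
    by_type = defaultdict(list)
    for sample in samples:
        by_type[sample[1]].append(sample)
    train, verify = [], []
    for items in by_type.values():
        rng.shuffle(items)
        cut = int(len(items) * ratio)  # 1:1 when ratio = 0.5
        train.extend(items[:cut])
        verify.extend(items[cut:])
    return train, verify

samples = [(i, "wheat") for i in range(4)] + [(i, "barley") for i in range(4)]
train, verify = stratified_split(samples)
print(len(train), len(verify))  # 4 4
```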
As a further technical solution, accuracy evaluation is performed on the initial classification result set with the confusion matrix method, based on the crop verification samples, and accuracy evaluation indices are calculated.
The Overall Accuracy (OA) and the Kappa coefficient are calculated as accuracy evaluation indices and used to evaluate the initial classification results, where OA is the ratio of the number of pixels correctly classified by a classifier to the total number of pixels to be classified.
Further, the overall accuracy OA, producer's accuracy PA, user's accuracy UA and Kappa coefficient may all be calculated as accuracy evaluation indices, where PA is, for a given crop type, the ratio of correctly classified pixels to all pixels of that type in the verification data set, and UA is the ratio of correctly classified pixels of a crop type to all pixels classified as that type.
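The four indices follow directly from the confusion matrix; a compact sketch, with reference classes as rows and predicted classes as columns (an assumed orientation):

```python
import numpy as np

def accuracy_metrics(confusion):
    """Overall accuracy (OA), producer's accuracy (PA), user's accuracy (UA)
    and the Kappa coefficient from a confusion matrix whose rows are the
    reference (verification) classes and columns the predicted classes."""
    cm = np.asarray(confusion, dtype=np.float64)
    total = cm.sum()
    oa = np.trace(cm) / total          # correct pixels / all pixels
    pa = np.diag(cm) / cm.sum(axis=1)  # correct / reference pixels, per class
    ua = np.diag(cm) / cm.sum(axis=0)  # correct / predicted pixels, per class
    # Expected chance agreement, used by the Kappa coefficient.
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, pa, ua, kappa

oa, pa, ua, kappa = accuracy_metrics([[40, 10], [5, 45]])
print(round(oa, 2), round(kappa, 2))  # 0.85 0.7
```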
As a further technical solution, executing the OCA-MV algorithm on the initial classification result set to complete decision fusion classification further comprises:
constructing an overall classification accuracy index set;
calculating the crop-type membership probability of each pixel and, based on it, computing the maximum class probability pixel by pixel to obtain the decision classification result corresponding to a single overall classification accuracy index;
performing this membership-probability calculation and pixel-by-pixel maximum-class-probability calculation for every index in the overall classification accuracy index set, one by one, to obtain a decision classification result set;
performing accuracy evaluation on the decision classification result set and calculating the overall classification accuracy of each result;
and obtaining the final decision classification result based on the overall classification accuracy.
In this technical scheme, the overall classification accuracies computed during the accuracy evaluation of the initial classification result set are used to construct the overall classification accuracy index set, which characterizes how differently individual classifiers perform on individual crop types. An overall classification accuracy index is computed for each data source, each classifier and each crop type, and the probability that each pixel belongs to the kth crop type is computed by assigning values pixel by pixel according to each pixel's classified category. The maximum class probability is then computed from these pixel membership probabilities to yield the decision classification result for a single index; repeating the membership-probability and maximum-class-probability calculations for all indices yields the decision classification result set. Finally, the classification result with the highest overall classification accuracy is output as the final multi-classifier fused crop classification result from long time-series, multi-source remote sensing data.
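The core per-pixel fusion step can be sketched as an accuracy-weighted vote (a minimal illustration of the idea; the full OCA-MV additionally iterates over an index set of per-source, per-classifier, per-type accuracies and keeps the most accurate fused result, which is omitted here):

```python
import numpy as np

def oca_weighted_vote(label_maps, oca_weights, n_types):
    """Fuse initial classification maps by an OCA-weighted majority vote:
    each map's vote for a crop type is weighted by that map's overall
    classification accuracy, and the type with the largest accumulated
    probability wins pixel by pixel.

    label_maps: list of integer label arrays of shape (height, width),
                one per initial classification result.
    oca_weights: overall classification accuracy of each map.
    """
    prob = np.zeros((n_types,) + label_maps[0].shape)
    for labels, oca in zip(label_maps, oca_weights):
        for k in range(n_types):
            prob[k] += oca * (labels == k)  # accumulate weighted votes for type k
    return prob.argmax(axis=0)              # per-pixel maximum class probability

# Toy example: two moderately accurate maps outvote one strong map.
maps = [np.array([[0]]), np.array([[1]]), np.array([[1]])]
fused = oca_weighted_vote(maps, [0.9, 0.6, 0.5], n_types=2)
print(fused)  # [[1]]  (accumulated 1.1 for type 1 vs 0.9 for type 0)
```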
According to an aspect of the present specification, there is provided a multi-classifier fused crop automatic classification system, the system comprising:
the data acquisition and preprocessing module, configured to acquire and preprocess data to obtain the high-resolution data and multispectral time-series reflectance;
the calculation module, configured to calculate the NDVI time series from the multispectral time-series reflectance;
the multispectral image selection module, configured to select the multispectral image data participating in classification, based on the calculated NDVI time series combined with crop type training samples;
the initial classification module, configured to input the high-resolution data, the NDVI time series and the selected multispectral image data into the classifier set to obtain the initial classification result set;
and the decision fusion classification module, configured to execute the OCA-MV algorithm on the initial classification result set to complete decision fusion classification.
In this technical scheme, the data acquisition and preprocessing module obtains the raw data and preprocesses it into high-resolution data and multispectral time-series reflectance. The calculation module derives the NDVI time series, which is fed to the initial classification module so that the ability of long time-series remote sensing data to detect crop phenological differences is fully used during initial classification. The multispectral image selection module, with the NDVI time series as the reference standard, selects the multispectral image data participating in classification from all acquisition periods, exploiting the spectral detection capability of multispectral remote sensing data while keeping computation efficient. The initial classification module performs the initial classification of the input source data, and the decision fusion classification module executes on the initial classification result set an optimization algorithm that combines the overall classification accuracy index with the traditional majority vote mechanism, yielding the final crop classification result.
By fusing the strengths of different classifiers in automatic crop identification, and by letting the overall classification accuracy index capture their per-class performance differences, the system effectively improves automatic crop classification accuracy; it addresses the mixed-pixel and spectral-confusion problems and the poor stability of single-classifier, single-data-source remote-sensing crop classification, as well as the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
As a further technical solution, the decision fusion classification module further includes:
the index set construction module, configured to construct the overall classification accuracy index set;
the membership probability calculation module, configured to calculate the crop-type membership probability of each pixel for each index in the overall classification accuracy index set;
the maximum class probability calculation module, configured to compute the maximum class probability pixel by pixel from the pixel membership probabilities, yielding the decision classification result corresponding to each index;
the accuracy evaluation module, configured to evaluate the accuracy of the decision classification results corresponding to all indices and calculate their overall classification accuracies;
and the output module, configured to output the final decision classification result based on the overall classification accuracy.
In this technical scheme, the index set construction module builds the overall classification accuracy index set from the overall classification accuracies computed during the accuracy evaluation of the initial classification result set, characterizing how differently individual classifiers perform on individual crop types. For each index in the set, the membership probability calculation module computes each pixel's crop-type membership probability, and the maximum class probability calculation module computes the maximum class probability pixel by pixel to obtain the decision classification result for that index; processing all indices through these two modules yields the decision classification result set. The accuracy evaluation module then evaluates this set and calculates the overall classification accuracies, and the output module outputs the final decision classification result as the final multi-classifier fused crop classification result from long time-series, multi-source remote sensing data.
According to an aspect of the present specification, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method.
Compared with the prior art, the invention has the following beneficial effects:
(1) The method combines high-resolution data, an NDVI time series and multispectral remote sensing data with a classifier set covering multiple classifiers to build an automatic crop classification model that performs the initial classification of crops, effectively combining the spectral detection capability of multispectral remote sensing data, the spatial-structure detection capability of high-resolution remote sensing data, and the ability of long time-series remote sensing data to detect crop phenological differences. It then performs decision fusion classification on the initial results by combining the overall classification accuracy index with the traditional majority vote mechanism, using that index to characterize the per-class performance differences of the classifiers. By fusing the strengths of the different classifiers, the method effectively improves automatic crop classification accuracy and addresses the mixed-pixel and spectral-confusion problems, the poor stability of single-classifier, single-data-source remote-sensing classification, and the inability of the traditional majority-vote fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
(2) The invention effectively integrates the strengths of different classifiers for automatic crop identification, so the final automatic crop classification accuracy is superior to that of any single classifier.
(3) The invention uses the overall classification accuracy index to characterize how differently individual classifiers perform on individual crop types, overcoming the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences; its automatic crop classification accuracy is markedly superior to both.
Drawings
Fig. 1(a) is a schematic flow chart of a multi-classifier fused crop automatic classification method according to an embodiment of the present invention before primary classification.
Fig. 1(b) is a schematic diagram of the primary classification and precision evaluation flow of the multi-classifier fused crop automatic classification method according to the embodiment of the invention.
Fig. 1(c) is a schematic flow chart of the OCA-MV algorithm according to an embodiment of the present invention.
Fig. 2 compares the NDVI time-series curves of different crop types according to an embodiment of the present invention.
Fig. 3 shows the RF automatic crop classification result based on Gaofen-6 (GF6) data according to an embodiment of the present invention.
Fig. 4 shows the SVM automatic crop classification result based on Sentinel-2 data (S2) according to an embodiment of the present invention.
Fig. 5 shows the SVM automatic crop classification result based on the Sentinel-2 NDVI time series (S2-NDVI) according to an embodiment of the present invention.
Fig. 6 is a diagram illustrating the MV algorithm crop automatic classification results of the GF6+ S2+ S2-NDVI classification result set according to the embodiment of the present invention.
Fig. 7 is a diagram illustrating the AMV algorithm crop automatic classification results of the GF6+ S2+ S2-NDVI classification result set according to the embodiment of the present invention.
FIG. 8 is a schematic diagram of the OCA-MV algorithm crop automatic classification result of the GF6+ S2+ S2-NDVI classification result set according to the embodiment of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without any inventive step, are within the scope of the present invention.
The invention provides a multi-classifier fused crop automatic classification method, which combines high-resolution (high-score) data, an NDVI time series and multispectral remote sensing data with a classifier set covering multiple classifier types to construct an automatic crop classification model and obtain primary crop classifications, effectively uniting the spectral detection capability of multispectral remote sensing data, the spatial-structure detection capability of high-resolution remote sensing data, and the ability of long-time-series remote sensing data to detect phenological differences between crops. The method then performs decision fusion classification on the primary classification results by combining overall classification accuracy indices with the traditional majority voting mechanism, using those indices to characterize the differing classification performance of different classifiers on different crop types. It alleviates the mixed-pixel problem, the same-object-different-spectrum (and different-object-same-spectrum) problem, and the poor stability of remote sensing crop classification based on a single classifier and a single data source, while overcoming the inability of the traditional majority voting decision-level fusion strategy and the adaptive majority voting method to fully reflect classifier performance differences.
As an embodiment, the multi-classifier fused crop automatic classification method, as shown in fig. 1(a) -1 (c), includes the following steps:
Step 1: acquire high-spatial-resolution remote sensing data (high-score data) and long-time-series multispectral remote sensing data, and preprocess them to obtain the high-score data and the multispectral time-series reflectance.
The band count and spatial resolution of the multispectral data should be no worse than those of Landsat-8, and the spatial resolution of the high-score data should be 2.5 meters or finer.
Furthermore, in the data acquisition step, the high-score data should be acquired while most crops in the working area are in their growing period; the time-series multispectral remote sensing data are preferably collected within one year from a single sensor, completely covering the study area over the whole period with good image quality.
The high-resolution image used in this example is Gaofen-6 (GF-6) data of a typical crop planting area in Xiangride Town, Dulan County, Qinghai Province, acquired on 22 August 2021, together with cloud-free, good-quality Sentinel-2 data of the same area from 12 dates across the year (2021.2.9, 2021.6.4, 2021.6.29, 2021.7.2, 2021.7.22, 2021.7.29, 2021.8.26, 2021.9.7, 2021.9.22, 2021.9.30, 2021.10.12, 2021.10.17).
In the data preprocessing step, the collected high-score data and multispectral original images undergo, in sequence, radiometric correction, atmospheric correction, orthorectification and clipping, yielding the preprocessed high-score data and multispectral time-series reflectance.
The preprocessed GF-6 reflectance image is 3488 × 2788 pixels and the preprocessed Sentinel-2 data 872 × 697 pixels. Land use types in the area include construction land, bare land and arable land; the arable land contains five crop types: highland barley, quinoa, wheat, wolfberry and rape.
Step two: calculating to obtain a vegetation index time sequence (namely, an NDVI time sequence) by utilizing the multispectral time sequence reflectivity, comparing NDVI time sequence curves of different crop types by combining with crop training samples, analyzing the phenological characteristic difference, and selecting multispectral reflectivity data corresponding to the moment when the NDVI of each type of crop is greatly different in the growing period to participate in classification.
Calculating the NDVI time series further comprises:
calculating a normalized difference vegetation index (NDVI) from the acquired multispectral time-series reflectance data to obtain the NDVI time series (S2-NDVI), with the calculation formula:

NDVI = (ρ_NIR - ρ_R) / (ρ_NIR + ρ_R)

where ρ_NIR is the near-infrared band reflectance of the multispectral data and ρ_R is its red-band reflectance.
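A minimal sketch of this calculation, assuming the two bands arrive as NumPy reflectance arrays (the array names and the zero-denominator guard are illustrative, not part of the patent):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from reflectance bands.

    nir, red: reflectance arrays of identical shape (one acquisition date).
    Returns values in [-1, 1]; pixels where both bands are 0 yield 0.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# An NDVI time series is ndvi() applied per acquisition date, e.g.:
# series = np.stack([ndvi(img[nir_band], img[red_band]) for img in dates])
```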
Obtaining a crop type training sample further comprises:
collecting spatial distribution data of crop types in the working area, or conducting a field survey, to select crop type samples; then randomly splitting the crop sample areas 1:1 into training and verification samples by stratified random sampling, used respectively for training the automatic classification model and evaluating the accuracy of the classification results.
In one implementation, a field survey of Xiangride Town was carried out on 24 August 2021, and sample areas of each class were collected by field parcel: 8 for wolfberry, 14 for quinoa, 5 for highland barley, 18 for wheat, 13 for rape, 8 for construction land and 8 for bare land; these were then split into training and verification samples.
The step of participating in classification of the multispectral reflectance data selection further comprises:
the NDVI time sequence and training samples of various crops are utilized to obtain NDVI mean value time sequence curves of training areas of different crop types, the difference of phenological characteristics of various crops is analyzed, and multispectral reflectivity data corresponding to the moment when the NDVI of various crops is greatly different are selected to participate in classification by comparing the difference of the NDVI time sequence curves of various crops.
The NDVI mean timing curves obtained for the training samples for each crop type in S2-NDVI are shown in fig. 2.
Comparing the NDVI time-series curves of the different crops shows that the mean training-sample NDVI values of the crop types differ most on 2021.7.29, 2021.8.26 and 2021.9.7, which favors distinguishing the crop types.
Considering the GF-6 data collection time (2021.8.22) and the field validation time (2021.8.24), the Sentinel-2 data collected on 2021.8.26 was selected as the multispectral reflectance data participating in classification.
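One plausible way to automate this date selection, assuming the per-date class-mean NDVI values of Fig. 2 are available as an array (the scoring rule here, the spread of class means per date, is an illustrative assumption, not the patent's criterion):

```python
import numpy as np

def rank_dates_by_separability(ndvi_means: np.ndarray) -> np.ndarray:
    """Rank acquisition dates by how widely the class-mean NDVI values spread.

    ndvi_means: array of shape (n_dates, n_classes) holding the mean NDVI of
    each crop type's training samples on each date.
    Returns date indices sorted from most to least separable, scoring each
    date by the standard deviation of the class means on that date.
    """
    spread = ndvi_means.std(axis=1)   # per-date spread across crop types
    return np.argsort(spread)[::-1]   # most separable dates first

# Example: 4 dates x 3 crop types; date 2 separates the classes the most.
means = np.array([[0.2, 0.25, 0.22],
                  [0.3, 0.50, 0.40],
                  [0.1, 0.60, 0.90],
                  [0.4, 0.45, 0.42]])
```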
Step 3: perform primary crop type classification on the high-score data reflectance, the NDVI time-series data and the multispectral reflectance data selected in Step 2, using the classifier set together with the crop training samples, to obtain a primary classification result set, and evaluate its accuracy.
wherein the primary classification of crop types further comprises:
performing primary crop type classification on the high-score data reflectance (GF6), the NDVI time-series data (S2-NDVI) and the multispectral reflectance data selected in Step 2 (S2) with multiple classifier types to obtain a classification result set. Suitable classification algorithms span multiple families of supervised classifiers, including the Minimum Distance (MD), Maximum Likelihood (ML), Mahalanobis Distance (MAD), Spectral Angle Mapper (SAM), Spectral Information Divergence (SID), Support Vector Machine (SVM) and Random Forest (RF) algorithms.
3 kinds of data (GF 6, S2 and S2-NDVI) and 7 kinds of classification algorithms are used for establishing a crop automatic classification model to obtain a classification result set (GF 6-MD, GF6-ML, GF6-MAD, GF6-SAM, GF6-SID, GF6-SVM, GF6-RF, S2-MD, S2-ML, S2-MAD, S2-SAM, S2-SID, S2-SVM, S2-RF, S2-NDVI-MD, S2-NDVI-ML, S2-NDVI-MAD, S2-NDVI-SAM, S2-NDVI-SID, S2-NDVI-SVM, S2-NDVI-RF).
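A sketch of how such a result set can be assembled. The scikit-learn stand-ins are assumptions: NearestCentroid approximates minimum distance, and the ML, MAD, SAM and SID classifiers are omitted for brevity:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestCentroid
from sklearn.svm import SVC

def primary_classification(sources, train_xy):
    """Classify each data source with each classifier.

    sources:  {name: pixel-feature array of shape (n_pixels, n_features)}
    train_xy: {name: (X_train, y_train)} training samples per source.
    Returns {"<source>-<clf>": label array}, the primary result set.
    """
    classifiers = {
        "MD": lambda: NearestCentroid(),  # minimum-distance stand-in
        "SVM": lambda: SVC(),
        "RF": lambda: RandomForestClassifier(n_estimators=100, random_state=0),
    }
    results = {}
    for src, pixels in sources.items():
        X, y = train_xy[src]
        for cname, make in classifiers.items():
            clf = make().fit(X, y)
            results[f"{src}-{cname}"] = clf.predict(pixels)
    return results
```

Running this over the three data types with all seven algorithms would yield the 21-member result set named above.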
The precision evaluation of the classification result set further comprises the following steps:
based on the verification samples, the accuracy of the obtained primary classification result set is evaluated with the confusion-matrix method, computing accuracy indices such as overall accuracy (OA), producer's accuracy (PA), user's accuracy (UA) and the Kappa coefficient. OA is the ratio of the number of correctly classified pixels to the total number of classified pixels. PA is the ratio of the number of correctly classified pixels of a crop type to the number of all pixels of that type in the verification data set; UA is the ratio of the number of correctly classified pixels of a crop type to the number of all pixels classified as that type.
The Kappa coefficient is computed as:

Kappa = (p_o - p_e) / (1 - p_e)

where p_o is the proportion of correctly classified pixels in the confusion matrix (the observed agreement) and p_e is the proportion of agreement attributable to chance.

Let OA_{i,j} denote the overall classification accuracy obtained when the i-th data type is classified with the j-th classifier, Kappa_{i,j} the corresponding Kappa coefficient, and UA_{i,j,k} the user's accuracy of the k-th crop type when the i-th data type is classified with the j-th classifier.
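The evaluation indices above can be sketched from a confusion matrix as follows (a minimal illustration, not the patent's implementation):

```python
import numpy as np

def accuracy_metrics(cm: np.ndarray):
    """OA, per-class PA/UA and Kappa from a confusion matrix.

    cm[i, j] = number of verification pixels of true class i labelled j.
    PA (producer's accuracy) is recall per true class; UA (user's
    accuracy) is precision per predicted class;
    Kappa = (p_o - p_e) / (1 - p_e).
    """
    cm = cm.astype(np.float64)
    total = cm.sum()
    p_o = np.trace(cm) / total                        # observed agreement (OA)
    p_e = (cm.sum(0) * cm.sum(1)).sum() / total ** 2  # chance agreement
    pa = np.diag(cm) / cm.sum(axis=1)                 # per true class
    ua = np.diag(cm) / cm.sum(axis=0)                 # per predicted class
    kappa = (p_o - p_e) / (1 - p_e)
    return p_o, pa, ua, kappa
```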
The accuracy evaluation results of the primary classification result set are shown in table 1.
TABLE 1 Single data Source Single classifier accuracy evaluation results (Unit:%)
The accuracy evaluation shows that, for the GF6 data, the RF algorithm obtained the highest overall accuracy (74.48%); its classification result is shown in Fig. 3. In terms of single-crop classification accuracy (the mean of PA and UA), the SAM algorithm was highest for wolfberry and wheat (74.78% and 73.11%), the ML algorithm highest for quinoa (71.09%), and the MAD algorithm highest for highland barley and rape (70.88% and 84.77%).
For the S2 data, the SVM algorithm obtained the highest overall accuracy (80.88%); its classification result is shown in Fig. 4. In terms of single-crop classification accuracy, the ML algorithm was highest for wolfberry and wheat (73.95% and 83.56%), the SID algorithm highest for quinoa (65.99%), the SVM algorithm highest for highland barley (74.19%), and the MAD algorithm highest for rape (73.64%).
For the S2-NDVI data, the SVM algorithm obtained the highest overall accuracy (81.07%); its classification result is shown in Fig. 5. In terms of single-crop classification accuracy, the RF algorithm was highest for wolfberry (78.96%), the SVM algorithm highest for quinoa and wheat (78.31% and 80.68%), the MAD algorithm highest for highland barley (81.58%), and the ML algorithm highest for rape (60.56%).
These accuracy results show that classification accuracy varies greatly across classifiers on the same data, and that classifier performance also varies greatly across crop types; a single data source with a single classifier therefore struggles to deliver stable automatic crop classification accuracy.
Step four: and executing an OCA-MV algorithm on the initial classification result set to complete decision fusion classification.
In this embodiment, the OCA-MV algorithm is executed on each of the following classification result sets: GF6 (7 results), S2 (7 results), S2-NDVI (7 results), GF6+S2 (14 results), GF6+S2-NDVI (14 results), S2+S2-NDVI (14 results) and GF6+S2+S2-NDVI (21 results), to obtain decision fusion classification results; the conventional MV and AMV algorithms are executed on the same sets, and the classification accuracies of the decision fusion algorithms are compared.
Step 4.1: construct the overall classification accuracy index set.
For each data source, each classifier and each crop type in the classification results, a set W of overall classification accuracy indices is constructed to characterize the differing classification performance of different classifiers on different crop types. The set comprises eight overall classification accuracy indices, computed from the OA, Kappa and UA values obtained in the accuracy evaluation of the primary classification results.
Step 4.2: calculate the crop type membership probability of each pixel.

Compute W for each data source i, classifier j and crop type k, assign the index values pixel by pixel according to each pixel's classification category, and compute the probability P_k(x, y) that the pixel at (x, y) belongs to crop type k:

P_k(x, y) = Σ_{i=1}^{n} Σ_{j=1}^{m} W_{i,j,k} · δ(C_{i,j}(x, y), k)

where n is the number of data sources, m is the number of classifiers, x and y are the row and column indices of the classified image, C_{i,j}(x, y) is the class assigned to pixel (x, y) by the j-th classifier on the i-th data source, δ(·, ·) equals 1 when its two arguments are equal and 0 otherwise, and P_k(x, y) is the probability that pixel (x, y) belongs to the k-th crop type.
Step 4.3: according to the majority voting mechanism, assign each pixel the crop type k* = argmax_{k ∈ {1, …, K}} P_k(x, y), where K is the number of crop type categories. This completes the decision fusion of the primary classification results and yields the multi-data-source, multi-classifier decision fusion crop classification result CL.
Step 4.4: for each of the eight overall classification accuracy indices in W, repeat Steps 4.2 and 4.3 to obtain a decision classification result set {CL_1, …, CL_8}.
Step 4.5: evaluate the accuracy of the decision classification result set.
Based on the verification samples, the accuracy of each decision classification result obtained in Step 4.4 is evaluated with the confusion-matrix method, and its overall classification accuracy is computed.
Step 4.6: obtain the final decision classification result.

Take the decision classification result with the highest overall classification accuracy as the final output: the automatic crop classification result fused from long-time-series multi-source remote sensing data and multiple classifiers.
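Steps 4.2 and 4.3 amount to an accuracy-weighted majority vote. The sketch below treats the weights W as precomputed inputs per (data source, classifier) member and crop type, since the source does not reproduce the eight index formulas:

```python
import numpy as np

def oca_mv_fuse(labels: np.ndarray, weights: np.ndarray,
                n_classes: int) -> np.ndarray:
    """Decision fusion by accuracy-weighted majority vote (Steps 4.2-4.3).

    labels:  (n_members, n_pixels) class maps from the primary result set,
             one row per (data source, classifier) pair, classes 0..K-1.
    weights: (n_members, n_classes) accuracy index W for each member and
             crop type (assumed precomputed).
    Returns the fused class map of shape (n_pixels,).
    """
    n_members, n_pixels = labels.shape
    votes = np.zeros((n_pixels, n_classes))
    for m in range(n_members):
        # each member votes for its predicted class per pixel, scaled by
        # its accuracy index for that class
        votes[np.arange(n_pixels), labels[m]] += weights[m, labels[m]]
    return votes.argmax(axis=1)
```

Running this once per index in W and keeping the result with the best overall accuracy mirrors Steps 4.4 to 4.6.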
In this embodiment, the decision classification results of the OCA-MV, MV and AMV algorithms for each classification result set are evaluated for accuracy; the results are shown in Table 2.
TABLE 2 evaluation result of decision classification precision of multiple data sources and multiple classifiers
The accuracy evaluation results show that multi-classifier decision fusion outperforms the single-data-source single-classifier results, and that the proposed OCA-MV algorithm outperforms the traditional MV and AMV algorithms.
The results of automatic crop classification using MV, AMV and OCA-MV algorithms combined with GF6 data, S2 data and S2-NDVI data are shown in FIGS. 6, 7 and 8, respectively.
Decision fusion classification results of the GF6, S2 and S2-NDVI classification result sets show that the OCA-MV method provided by the invention obtains the highest overall classification precision, OA is 77.29%, 82.14% and 82.00% respectively, the classification precision is obviously superior to that of the traditional decision classification algorithm (MV and AMV), and compared with the classification precision of single data source single classifiers (GF 6-RF, S2-SVM and S2-NDVI-SVM), the classification precision is respectively improved by 2.81%, 1.26% and 0.93%, and the superiority of the multi-classifier decision fusion crop automatic classification method is reflected.
Decision fusion classification results of the GF6+ S2, GF6+ S2-NDVI, S2+ S2-NDVI and GF6+ S2+ S2-NDVI classification result sets show that the OCA-MV method provided by the invention obtains the highest overall classification precision, OA is 82.70%, 82.36%, 84.20% and 85.81% respectively, the classification precision is obviously superior to that of the traditional decision classification algorithm (MV and AMV), and the overall classification precision is superior to that of a single-data-source multi-classifier, so that the superiority of the automatic crop classification method based on long-time-sequence multi-source remote sensing data provided by the invention and the advancement of the OCA-MV algorithm provided by the invention are embodied.
According to an aspect of the present specification, there is provided a multi-classifier fused crop automatic classification system, the system comprising: the data acquisition and preprocessing module is used for acquiring and preprocessing data to acquire high-fraction data and multispectral time sequence reflectivity; the calculation module is used for calculating and obtaining an NDVI time sequence based on the multispectral time sequence reflectivity; the multispectral image selection module is used for selecting multispectral image data participating in classification based on the NDVI time sequence obtained through calculation and in combination with the crop type training sample; the primary classification module is used for inputting the high-score data, the NDVI time sequence and the multispectral image data participating in classification into the classifier set to obtain a primary classification result set; and the decision fusion classification module is used for executing an OCA-MV algorithm on the initial classification result set to complete decision fusion classification.
The system acquires original data through a data acquisition and preprocessing module, and preprocesses the original data to obtain high-fraction data and multispectral time sequence reflectivity; the NDVI time sequence is obtained through the calculation module and is input into the primary classification module, so that the detection capability of long-time-sequence remote sensing data on the phenological difference characteristics of crops is fully utilized during primary classification of the crops; through a multispectral image selection module, with the NDVI time sequence as a reference standard, selecting multispectral image data participating in classification from all periods, and fully utilizing the spectral detection capability of multispectral remote sensing data on the premise of considering the calculation efficiency; performing primary classification on input source data through a primary classification module; and executing an optimization algorithm which gives consideration to the overall classification precision index and the traditional majority vote mechanism on the primary classification result set through a decision fusion classification module to obtain a final crop classification result. 
By fusing the strengths of different classifiers in automatic crop identification, and using the overall classification accuracy indices to characterize the classifiers' differing performance on different crop types, the system effectively improves automatic crop classification accuracy; it alleviates the mixed-pixel problem, the same-object-different-spectrum (and different-object-same-spectrum) problem, and the poor stability of remote sensing crop classification based on a single classifier and a single data source, and it overcomes the inability of the traditional majority voting decision-level fusion strategy and the adaptive majority voting method to fully reflect classifier performance differences.
The decision fusion classification module further comprises: the index set construction module is used for constructing a total classification precision index set; the attribution probability calculating module is used for calculating the attribution probability of the pixel crop type aiming at each index in the overall classification precision index set; the class probability maximum value calculating module is used for calculating the class probability maximum value pixel by pixel based on the pixel crop type attribution probability to obtain a decision classification result corresponding to each index; the precision evaluation module is used for carrying out precision evaluation on the decision classification result sets corresponding to all the indexes and calculating the overall classification precision; and the output module is used for outputting a final decision classification result based on the overall classification precision.
Constructing a total classification precision index set based on the total classification precision calculated in the precision evaluation of the primary classification result set through an index set construction module so as to represent the classification performance difference of different classifiers on different crop types; calculating the attribute probability of the pixel crop type aiming at each index in the overall classification precision index set through an attribute probability calculating module, and calculating the maximum value of the class probability one by one on the basis of the attribute probability of the pixel crop type through a class probability maximum value calculating module to obtain a decision classification result corresponding to each index; calculating all indexes through an attribution probability calculating module and a class probability maximum value calculating module to obtain a decision classification result set; then, performing precision evaluation on the decision classification result set through a precision evaluation module, and calculating the overall classification precision; and finally, outputting a final decision classification result through an output module to serve as a final long-time-sequence multi-source remote sensing data multi-classifier fused crop automatic classification result.
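The module chain described above (data acquisition and preprocessing, NDVI calculation, multispectral image selection, primary classification, decision fusion) can be sketched as a simple function pipeline; the step callables named in the comment are hypothetical:

```python
from typing import Callable, Sequence

def run_pipeline(raw, steps: Sequence[Callable]):
    """Chain the system's modules: each step consumes the previous output.

    In this system, steps would be, in order: preprocess,
    compute_ndvi_series, select_multispectral_dates, primary_classify
    and decision_fuse (hypothetical callables mirroring the modules).
    """
    out = raw
    for step in steps:
        out = step(out)
    return out
```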
According to an aspect of the present specification, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method.
In the description herein, references to the description of the terms "one embodiment," "certain embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and these modifications or substitutions do not depart from the essence of the corresponding technical solutions.
Claims (9)
1. The automatic crop classification method based on multi-classifier fusion is characterized by comprising the following steps:
data acquisition and pretreatment are carried out, and high-fraction data and multispectral time sequence reflectivity after pretreatment are obtained;
calculating based on the multispectral time sequence reflectivity to obtain an NDVI time sequence;
based on the NDVI time sequence obtained by calculation, combining with a crop type training sample, and selecting multispectral image data participating in classification;
inputting high-score data, an NDVI time sequence and multispectral image data participating in classification into a classifier set to obtain an initial classification result set;
and executing an OCA-MV algorithm on the initial classification result set to complete decision fusion classification.
2. The method of claim 1, wherein the data collection and pre-processing further comprises:
collecting high-grade data when crops exceeding a preset percentage in a working area are all in a growing period and multispectral original image data of all periods in one year;
and preprocessing the collected high-fraction data and multispectral original image data to obtain preprocessed high-fraction data and multispectral time sequence reflectivity.
3. The method of claim 1, wherein selecting multispectral reflectance data for classification further comprises:
acquiring training samples of various types of crops, and combining the NDVI time sequences obtained by calculation to obtain NDVI mean value time sequence curves of training areas of different types of crops;
and comparing the NDVI time sequence curves of different types of crops by taking the NDVI mean value time sequence curves of different types of crop training areas as a reference, and selecting the multispectral reflectivity data corresponding to the time with the maximum difference of the NDVI time sequence curves of the different types of crops as multispectral image data participating in classification.
4. The method of claim 1, further comprising: and (3) constructing an automatic crop classification model based on 3 input data and 7 supervision and classification algorithms, and outputting a primary classification result set.
5. The method according to claim 4, wherein the precision evaluation is performed on the primary classification result set by using a confusion matrix method based on the crop verification samples, and the precision evaluation index is calculated.
6. The method of claim 1, wherein the OCA-MV algorithm is performed on the primary classification result set to complete decision fusion classification, and further comprising:
constructing a total classification precision index set;
calculating pixel crop type attribution probability, and calculating the maximum value of the class probability pixel by pixel based on the pixel crop type attribution probability to obtain a decision classification result corresponding to a single overall classification precision index;
pixel crop type attribution probability calculation and pixel-by-pixel calculation of the maximum class probability are carried out on all the overall classification precision indexes in the overall classification precision index set one by one, and a decision classification result set is obtained;
performing precision evaluation on the decision classification result set, and calculating the overall classification precision;
and obtaining a final decision classification result based on the overall classification precision.
7. A multi-classifier fused crop automatic classification system, the system comprising:
the data acquisition and preprocessing module is used for acquiring and preprocessing data to acquire high-fraction data and multispectral time sequence reflectivity;
the calculation module is used for calculating and obtaining an NDVI time sequence based on the multispectral time sequence reflectivity;
the multispectral image selection module is used for selecting multispectral image data participating in classification based on the NDVI time sequence obtained through calculation and in combination with a crop type training sample;
the primary classification module is used for inputting the high-score data, the NDVI time sequence and the multispectral image data participating in classification into the classifier set to obtain a primary classification result set;
and the decision fusion classification module is used for executing an OCA-MV algorithm on the primary classification result set to complete decision fusion classification.
8. The multi-classifier fused crop automatic classification system according to claim 7, wherein the decision fusion classification module further comprises:
an index set construction module, configured to construct the overall classification precision index set;
an attribution probability calculation module, configured to calculate, for each index in the overall classification precision index set, the pixel crop-type attribution probability;
a maximum class probability calculation module, configured to calculate the maximum class probability pixel by pixel based on the pixel crop-type attribution probability, to obtain a decision classification result corresponding to each index;
a precision evaluation module, configured to perform precision evaluation on the decision classification results corresponding to all the indexes, and to calculate the overall classification precision; and
an output module, configured to output a final decision classification result based on the overall classification precision.
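The precision evaluation module's overall classification precision is conventionally the fraction of correctly classified validation pixels, i.e. the trace of the confusion matrix divided by its total. A minimal sketch under that assumption (function name illustrative):

```python
import numpy as np

def overall_accuracy(y_true, y_pred, n_classes):
    """Overall classification precision from a confusion matrix:
    correctly classified validation pixels / all validation pixels."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                    # rows: reference class, cols: predicted class
    return cm.trace() / cm.sum()

# Six validation pixels, three crop classes; four are classified correctly
oa = overall_accuracy([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0], n_classes=3)
```

In the decision fusion loop, this value is computed once per candidate decision result, and the output module keeps the result whose overall precision is highest.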
9. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210548609.7A CN114639005B (en) | 2022-05-20 | 2022-05-20 | Multi-classifier fused crop automatic classification method and system and storage medium |
NL2034106A NL2034106B1 (en) | 2022-05-20 | 2023-02-07 | Automatic crop classification method and system based on multi-classifier fusion, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210548609.7A CN114639005B (en) | 2022-05-20 | 2022-05-20 | Multi-classifier fused crop automatic classification method and system and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114639005A true CN114639005A (en) | 2022-06-17 |
CN114639005B CN114639005B (en) | 2022-10-21 |
Family
ID=81953079
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210548609.7A Active CN114639005B (en) | 2022-05-20 | 2022-05-20 | Multi-classifier fused crop automatic classification method and system and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114639005B (en) |
NL (1) | NL2034106B1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103258212A (en) * | 2013-04-03 | 2013-08-21 | 中国科学院东北地理与农业生态研究所 | Semi-supervised integrated remote-sensing image classification method based on attractor propagation clustering |
CN103489005A (en) * | 2013-09-30 | 2014-01-01 | 河海大学 | High-resolution remote sensing image classifying method based on fusion of multiple classifiers |
CN106355030A (en) * | 2016-09-20 | 2017-01-25 | 浙江大学 | Fault detection method based on analytic hierarchy process and weighted vote decision fusion |
CN108399423A (en) * | 2018-02-01 | 2018-08-14 | 南京大学 | A kind of multidate-Combining Multiple Classifiers of classification of remote-sensing images |
CN108629494A (en) * | 2018-04-19 | 2018-10-09 | 三峡大学 | Arid grade appraisal procedure and system |
CN109711446A (en) * | 2018-12-18 | 2019-05-03 | 中国科学院深圳先进技术研究院 | A kind of terrain classification method and device based on multispectral image and SAR image |
- 2022-05-20: CN application CN202210548609.7A, granted as CN114639005B (active)
- 2023-02-07: NL application NL2034106A, granted as NL2034106B1 (active)
Non-Patent Citations (1)
Title |
---|
NIU Ming'ang et al.: "Comparative Study of Multi-Classifier Fusion and Single-Classifier Image Classification", Mine Surveying * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115063690A (en) * | 2022-06-24 | 2022-09-16 | 电子科技大学 | Vegetation classification method based on NDVI (normalized difference vegetation index) time sequence characteristics |
CN115063690B (en) * | 2022-06-24 | 2024-06-07 | 电子科技大学 | Vegetation classification method based on NDVI time sequence characteristics |
Also Published As
Publication number | Publication date |
---|---|
NL2034106B1 (en) | 2024-06-19 |
NL2034106A (en) | 2023-11-27 |
CN114639005B (en) | 2022-10-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Hu et al. | A phenology-based spectral and temporal feature selection method for crop mapping from satellite time series | |
Feng et al. | Crop type identification and mapping using machine learning algorithms and sentinel-2 time series data | |
Clark | Comparison of simulated hyperspectral HyspIRI and multispectral Landsat 8 and Sentinel-2 imagery for multi-seasonal, regional land-cover mapping | |
Luo et al. | Comparison of machine learning algorithms for mapping mango plantations based on Gaofen-1 imagery | |
CN113657158B (en) | Google EARTH ENGINE-based large-scale soybean planting area extraction algorithm | |
CN116543316B (en) | Method for identifying turf in paddy field by utilizing multi-time-phase high-resolution satellite image | |
Sun et al. | Mapping tropical dry forest age using airborne waveform LiDAR and hyperspectral metrics | |
CN111126511A (en) | Vegetation index fusion-based LAI quantitative model establishment method | |
Yuan et al. | Research on rice leaf area index estimation based on fusion of texture and spectral information | |
CN114639005B (en) | Multi-classifier fused crop automatic classification method and system and storage medium | |
CN113901966B (en) | Crop classification method fusing multi-source geographic information data | |
Chellasamy et al. | A multievidence approach for crop discrimination using multitemporal worldview-2 imagery | |
Shen et al. | High-throughput phenotyping of individual plant height in an oilseed rape population based on Mask-RCNN and UAV images | |
CN116258844A (en) | Rapid and accurate identification method for phenotype character of cotton leaf | |
CN115063610A (en) | Soybean planting area identification method based on Sentinel-1 and 2 images and area measurement method thereof | |
Li et al. | Crop region extraction of remote sensing images based on fuzzy ARTMAP and adaptive boost | |
CN113205565A (en) | Forest biomass remote sensing mapping method based on multi-source data | |
Kozoderov et al. | Models of pattern recognition and forest state estimation based on hyperspectral remote sensing data | |
Cheng et al. | Crop aboveground biomass monitoring model based on UAV spectral index reconstruction and Bayesian model averaging: A case study of film-mulched wheat and maize | |
Ding et al. | Extraction of soybean planting areas based on multi-temporal Sentinel-1/2 data | |
Gao et al. | Optimal feature selection and crop extraction using random forest based on GF-6 WFV data | |
Wang et al. | RETRACTED: Theoretical research on rice and wheat lodging detection based on artificial intelligence technology and a template matching algorithm | |
Chadwick | Enhancing post-harvest regeneration monitoring with digital aerial photogrammetry and deep learning | |
Jepkosgei | Evaluating the factors influencing farmers’ choices of maize-based cropping patterns and assessing the potential of desis hyperspectral satellite data to discriminate the cropping patterns. | |
Okada et al. | High-Throughput Phenotyping of Soybean Biomass: Conventional Trait Estimation and Novel Latent Feature Extraction Using UAV Remote Sensing and Deep Learning Models |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||