CN114639005A - Multi-classifier fused crop automatic classification method and system and storage medium - Google Patents

Multi-classifier fused crop automatic classification method and system and storage medium Download PDF

Info

Publication number
CN114639005A
Authority
CN
China
Prior art keywords
classification
data
crop
time sequence
precision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210548609.7A
Other languages
Chinese (zh)
Other versions
CN114639005B (en)
Inventor
帅爽
张志�
马梓程
王亚毛
彭艳鹏
熊忠招
谢菲
王旭
龚元夫
陈思
肖瑶
谢翠容
岳常海
孙天成
孟丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hubei Institute Of Land Surveying And Mapping
China University of Geosciences
Original Assignee
Hubei Institute Of Land Surveying And Mapping
China University of Geosciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hubei Institute Of Land Surveying And Mapping, China University of Geosciences filed Critical Hubei Institute Of Land Surveying And Mapping
Priority to CN202210548609.7A priority Critical patent/CN114639005B/en
Publication of CN114639005A publication Critical patent/CN114639005A/en
Application granted granted Critical
Publication of CN114639005B publication Critical patent/CN114639005B/en
Priority to NL2034106A priority patent/NL2034106B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/10Pre-processing; Data cleansing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/62Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/809Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/809Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
    • G06V10/811Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data the classifiers operating on different input data, e.g. multi-modal recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/259Fusion by voting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The invention discloses a multi-classifier fused crop automatic classification method, system and storage medium. The method comprises the following steps: carrying out data acquisition and preprocessing to obtain preprocessed high-score (high-spatial-resolution, GF) data and multispectral time-series reflectance; calculating an NDVI time series from the multispectral time-series reflectance; selecting, based on the calculated NDVI time series and in combination with crop type training samples, the multispectral image data participating in classification; inputting the high-score data, the NDVI time series and the selected multispectral image data into a classifier set to obtain a primary classification result set; and executing an OCA-MV algorithm on the primary classification result set to complete decision fusion classification.

Description

Multi-classifier fused crop automatic classification method and system and storage medium
Technical Field
The invention relates to the technical field of crop classification, in particular to a decision-level crop automatic classification method (OCA-MV) that combines an Overall Classification Accuracy (OCA) index with the traditional Majority Vote (MV) mechanism, and specifically to a multi-classifier fused crop automatic classification method, system and storage medium.
Background
Existing remote-sensing-based crop automatic classification methods are usually built on a single classifier and a single data source, whereas multispectral data, high-score (i.e. high-spatial-resolution, GF) data and vegetation-index time series offer different, complementary advantages for distinguishing crop types. Multispectral data carry the spectral information needed to distinguish crop types, but their limited spatial resolution means their classification accuracy suffers from the mixed-pixel effect; increasing the spatial resolution (high-score data) is the most effective way to alleviate that effect. A vegetation-index time series reflects the phenological differences between crop types and further mitigates the impact of the "same object, different spectra" and "different objects, same spectrum" phenomena on classification accuracy when single-date optical imagery (multispectral or high-score data) is used for crop identification. Meanwhile, different classifiers often show performance differences when classification models are built for different regions and different crop objects, so stable classification accuracy is difficult to obtain with any single classifier.
Decision-level fusion of multi-source remote sensing data classifies the imagery from each input data source and then combines the classification results according to decision rules to produce an optimal decision. Decision-level fusion can effectively combine the advantages of multi-source remote sensing data and multiple classifiers for crop classification and thereby improve classification accuracy. However, the traditional Majority Vote (MV) decision-level fusion strategy ignores the influence of classifier performance differences on the decision result, and the Adaptive Majority Vote (AMV) method, although it adds an adaptive weight computed from each classifier's overall accuracy on top of MV, still fails to reflect the differing performance of different classifiers on different classes. Neither decision fusion algorithm therefore fully reflects classifier performance differences, which limits the accuracy of the classification result.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a multi-classifier fused crop automatic classification method, system and storage medium for improving the accuracy of automatic remote-sensing crop classification.
According to an aspect of the present specification, there is provided a multi-classifier fused crop automatic classification method, comprising:
carrying out data acquisition and preprocessing to obtain preprocessed high-score data and multispectral time-series reflectance;
calculating an NDVI time series from the multispectral time-series reflectance;
selecting, based on the calculated NDVI time series and in combination with crop type training samples, the multispectral image data participating in classification;
inputting the high-score data, the NDVI time series and the selected multispectral image data into a classifier set to obtain a primary classification result set;
and executing an OCA-MV algorithm on the primary classification result set to complete decision fusion classification.
In this technical scheme, high-score data, an NDVI time series and multispectral remote sensing data are combined with a classifier set covering multiple classifiers to construct a crop automatic classification model and realize the primary classification of crops, effectively combining the spectral detection capability of multispectral remote sensing data, the spatial-structure detection capability of high-score remote sensing data and the ability of long-time-series remote sensing data to detect crop phenological differences. The scheme then performs decision fusion classification on the primary classification result by combining the overall classification accuracy index with the traditional majority vote mechanism, using the overall classification accuracy index to characterize the differing classification performance of different classifiers on different crop types.
By fusing the strengths of different classifiers for automatic crop identification, with the overall classification accuracy index representing their per-class performance differences, the scheme effectively improves automatic crop classification accuracy. It addresses the mixed-pixel effect, the "same object, different spectra / different objects, same spectrum" problem and the poor stability of a single classifier in remote-sensing crop classification based on a single classifier and single data source, and it also addresses the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
As a further technical solution, the data acquisition and preprocessing further comprises:
collecting high-score data at a time when more than a preset percentage of the crops in the working area are in their growing period, together with multispectral original image data covering all periods of one year;
and preprocessing the collected high-score data and multispectral original image data to obtain the preprocessed high-score data and the multispectral time-series reflectance.
Specifically, the high-score data acquisition time is set to fall within the growing period of most crops in the working area. A threshold may be set, and high-score data acquisition may begin when the proportion of crops in the working area that are in the growing season exceeds that threshold.
Specifically, the acquisition period of the multispectral original image data is set to one year so as to provide long-time-series multi-source remote sensing data.
Further, preprocessing the collected high-score data and multispectral original images comprises: sequentially performing radiometric correction, atmospheric correction, orthorectification and clipping on the collected high-score data and multispectral original images to obtain the preprocessed high-score data and multispectral time-series reflectance.
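As a minimal illustration of the radiometric-calibration and clipping portions of this preprocessing chain (atmospheric correction and orthorectification are normally performed with dedicated remote-sensing software and are not sketched here), the following Python snippet uses hypothetical gain/offset coefficients and a rectangular clip window:

```python
import numpy as np

def radiometric_calibration(dn, gain, offset):
    """Convert raw digital numbers to radiance using hypothetical per-band gain/offset."""
    return dn.astype(np.float32) * gain + offset

def clip_to_working_area(image, row_slice, col_slice):
    """Clip a (bands, rows, cols) array to the working area."""
    return image[:, row_slice, col_slice]

# Synthetic example: a 4-band scene with assumed calibration coefficients.
dn = np.random.randint(0, 1024, size=(4, 200, 200))
gain = np.array([0.06, 0.05, 0.04, 0.07]).reshape(-1, 1, 1)
offset = np.zeros((4, 1, 1), dtype=np.float32)
radiance = radiometric_calibration(dn, gain, offset)
subset = clip_to_working_area(radiance, slice(20, 180), slice(20, 180))
```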
As a further technical solution, selecting the multispectral reflectance data participating in classification further comprises:
acquiring training samples of each crop type and, in combination with the calculated NDVI time series, obtaining the NDVI mean time-series curve of the training area of each crop type;
and comparing the NDVI time-series curves of the different crop types, taking their NDVI mean time-series curves as the reference, and selecting the multispectral reflectance data acquired at the time when the curves differ most as the multispectral image data participating in classification.
On the basis of the calculated NDVI time series, this technical scheme further obtains the NDVI mean time-series curves of the training areas of the different crop types and then selects, from all periods, the multispectral reflectance data corresponding to the time at which those curves differ most as the multispectral image data participating in classification. The spectral detection capability of multispectral remote sensing data is thus fully used in automatic crop classification, while avoiding the data redundancy and low computational efficiency that result from using the original multispectral remote sensing images directly, as illustrated by the sketch after this paragraph.
During the primary classification of crops, the NDVI time series and the selected multispectral image data are input into the crop automatic classification model, ensuring computational efficiency while making full use of the advantages of the long-time-series and multispectral remote sensing data.
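A minimal Python sketch of this selection step is given below. It assumes the NDVI time series is stored as an array of shape (dates, rows, cols) and that the training areas are boolean pixel masks per crop type; the spread between the per-class NDVI means stands in for the curve-difference criterion described above.

```python
import numpy as np

def select_classification_date(ndvi_series, class_masks):
    """Return the index of the acquisition date at which per-class mean NDVI differs most.

    ndvi_series : ndarray, shape (n_dates, rows, cols)
    class_masks : dict, crop name -> boolean mask of training pixels, shape (rows, cols)
    """
    # NDVI mean time-series curve of each crop training area: shape (n_dates, n_classes)
    class_means = np.stack(
        [ndvi_series[:, mask].mean(axis=1) for mask in class_masks.values()], axis=1
    )
    # Inter-class spread at each date (largest minus smallest class mean)
    spread = class_means.max(axis=1) - class_means.min(axis=1)
    return int(np.argmax(spread))

# Synthetic illustration: 12 acquisition dates, 50 x 50 pixels, three crop types.
rng = np.random.default_rng(0)
ndvi = rng.uniform(0.1, 0.9, size=(12, 50, 50))
masks = {c: rng.random((50, 50)) < 0.1 for c in ("wheat", "quinoa", "rape")}
best_date = select_classification_date(ndvi, masks)
```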
As a further technical solution, the method further comprises: constructing a crop automatic classification model based on 3 kinds of input data and 7 supervised classification algorithms, and outputting a primary classification result set.
Further, the 7 supervised classification algorithms comprise the minimum distance algorithm, the maximum likelihood algorithm, the Mahalanobis distance algorithm, the spectral angle mapper algorithm, the spectral information divergence algorithm, the support vector machine algorithm and the random forest algorithm.
Further, the constructed crop automatic classification model can be trained, and its accuracy evaluated, by obtaining crop type samples. Obtaining the crop type samples further comprises: collecting crop type spatial distribution data in the working area, or carrying out a field investigation, selecting crop type sample areas in the working area, and randomly dividing the crop sample areas into training samples and verification samples at a ratio of 1:1 by stratified random sampling; the training samples are used to train the automatic classification model and the verification samples to evaluate the accuracy of the classification results, as sketched below.
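A sketch of the 1:1 stratified random split described above; representing each sample area by an identifier and a single crop label is an assumption made for illustration.

```python
import random
from collections import defaultdict

def stratified_half_split(sample_ids, labels, seed=42):
    """Split sample areas 1:1 into training and verification sets, per crop type."""
    by_class = defaultdict(list)
    for sid, lab in zip(sample_ids, labels):
        by_class[lab].append(sid)
    rng = random.Random(seed)
    train, verify = [], []
    for ids in by_class.values():
        rng.shuffle(ids)
        half = len(ids) // 2
        train.extend(ids[:half])   # training half of this crop type
        verify.extend(ids[half:])  # verification half of this crop type
    return train, verify

# Example: 8 wolfberry and 14 quinoa sample areas, split class by class.
ids = list(range(22))
labs = ["wolfberry"] * 8 + ["quinoa"] * 14
train_ids, verify_ids = stratified_half_split(ids, labs)
```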
As a further technical solution, based on the crop verification samples, the accuracy of the primary classification result set is evaluated with the confusion matrix method, and accuracy evaluation indices are calculated.
The overall classification accuracy (OA) and the Kappa coefficient are calculated as accuracy evaluation indices to evaluate the primary classification results. OA is the ratio of the number of pixels correctly classified by the classifier to the total number of pixels to be classified.
Further, the overall classification accuracy OA, the producer's accuracy PA, the user's accuracy UA and the Kappa coefficient may all be calculated as accuracy evaluation indices. PA is the ratio of the number of correctly classified pixels of a given crop type to the total number of pixels of that crop type in the verification data set; UA is the ratio of the number of correctly classified pixels of a given crop type to the total number of pixels classified as that crop type.
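These indices can be computed directly from the confusion matrix. The sketch below assumes the matrix rows are the reference (verification) classes and the columns the predicted classes:

```python
import numpy as np

def accuracy_metrics(cm):
    """Compute OA, per-class PA and UA, and the Kappa coefficient from a confusion matrix.

    cm[i, j] = number of verification pixels of reference class i assigned to class j.
    """
    cm = cm.astype(float)
    total = cm.sum()
    oa = np.trace(cm) / total                                   # overall accuracy
    pa = np.diag(cm) / cm.sum(axis=1)                           # producer's accuracy per class
    ua = np.diag(cm) / cm.sum(axis=0)                           # user's accuracy per class
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total ** 2   # chance agreement
    kappa = (oa - pe) / (1 - pe)
    return oa, pa, ua, kappa

# Example with a small 3-class confusion matrix.
cm = np.array([[50, 5, 2],
               [4, 60, 6],
               [3, 7, 40]])
oa, pa, ua, kappa = accuracy_metrics(cm)
```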
As a further technical solution, executing the OCA-MV algorithm on the primary classification result set to complete decision fusion classification further comprises:
constructing an overall classification accuracy index set;
calculating the crop-type attribution probability of each pixel and, based on it, calculating the class probability maximum pixel by pixel to obtain the decision classification result corresponding to a single overall classification accuracy index;
performing the pixel crop-type attribution probability calculation and the pixel-by-pixel class probability maximum calculation for every overall classification accuracy index in the set, one by one, to obtain a decision classification result set;
performing accuracy evaluation on the decision classification result set and calculating the overall classification accuracy of each result;
and obtaining the final decision classification result based on the overall classification accuracy.
In this technical scheme, the overall classification accuracy index set is constructed from the overall classification accuracies calculated during the accuracy evaluation of the primary classification result set, so as to characterize the differing classification performance of different classifiers on different crop types. An overall classification accuracy index is calculated for each data source, each classifier and each crop type, and the probability that each pixel belongs to the k-th crop type is calculated by assigning values pixel by pixel according to the pixel's classified category. The class probability maximum is then calculated from the pixel crop-type attribution probabilities to obtain the decision classification result corresponding to a single overall classification accuracy index, and a decision classification result set is obtained by repeating the attribution probability calculation and the pixel-by-pixel class probability maximum calculation for each index. Finally, the classification result corresponding to the maximum overall classification accuracy is output as the final long-time-series, multi-source remote sensing data, multi-classifier fused automatic crop classification result.
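To make the decision fusion step concrete, the following Python sketch treats a single overall classification accuracy index as a table of per-(classifier, class) weights and loops over the candidate tables; it is an illustrative rendering of the idea rather than the exact patented formulas, which are given in the detailed description below.

```python
import numpy as np

def fuse_with_weights(label_maps, weights, n_classes):
    """Weighted majority vote over a set of classification maps.

    label_maps : list of (rows, cols) integer class maps (the primary results)
    weights[m][k] : accuracy-derived weight of classifier m for class k
    """
    prob = np.zeros((n_classes,) + label_maps[0].shape)
    for m, labels in enumerate(label_maps):
        for k in range(n_classes):
            prob[k] += weights[m][k] * (labels == k)   # weight accumulates where map m votes class k
    return prob.argmax(axis=0)                          # pixel-by-pixel class probability maximum

def oca_mv(label_maps, weight_tables, ref_labels, ref_mask, n_classes):
    """Repeat the fusion for every candidate weight table and keep the most accurate result."""
    best_oa, best_map = -1.0, None
    for weights in weight_tables:                        # one table per overall-accuracy index
        fused = fuse_with_weights(label_maps, weights, n_classes)
        oa = (fused[ref_mask] == ref_labels[ref_mask]).mean()   # OA on verification pixels
        if oa > best_oa:
            best_oa, best_map = oa, fused
    return best_map, best_oa
```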
According to an aspect of the present specification, there is provided a multi-classifier fused crop automatic classification system, the system comprising:
a data acquisition and preprocessing module, used for acquiring and preprocessing data to obtain high-score data and multispectral time-series reflectance;
a calculation module, used for calculating an NDVI time series from the multispectral time-series reflectance;
a multispectral image selection module, used for selecting the multispectral image data participating in classification based on the calculated NDVI time series in combination with the crop type training samples;
a primary classification module, used for inputting the high-score data, the NDVI time series and the selected multispectral image data into the classifier set to obtain a primary classification result set;
and a decision fusion classification module, used for executing the OCA-MV algorithm on the primary classification result set to complete decision fusion classification.
In this technical scheme, the raw data are obtained and preprocessed by the data acquisition and preprocessing module to yield the high-score data and multispectral time-series reflectance. The NDVI time series produced by the calculation module is input to the primary classification module, so that the ability of long-time-series remote sensing data to detect crop phenological differences is fully used during primary classification. The multispectral image selection module, taking the NDVI time series as the reference, selects the multispectral image data participating in classification from all periods, making full use of the spectral detection capability of multispectral remote sensing data while keeping computation efficient. The primary classification module performs the primary classification of the input source data, and the decision fusion classification module applies to the primary classification result set an optimization algorithm that combines the overall classification accuracy index with the traditional majority vote mechanism to obtain the final crop classification result. By fusing the strengths of different classifiers for automatic crop identification, with the overall classification accuracy index representing their per-class performance differences, the system effectively improves automatic crop classification accuracy, addresses the mixed-pixel effect, the "same object, different spectra / different objects, same spectrum" problem and the poor stability of a single classifier in remote-sensing crop classification based on a single classifier and single data source, and overcomes the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
As a further technical solution, the decision fusion classification module further comprises:
an index set construction module, used for constructing the overall classification accuracy index set;
an attribution probability calculation module, used for calculating the pixel crop-type attribution probability for each index in the overall classification accuracy index set;
a class probability maximum calculation module, used for calculating the class probability maximum pixel by pixel from the pixel crop-type attribution probabilities to obtain the decision classification result corresponding to each index;
an accuracy evaluation module, used for evaluating the accuracy of the decision classification results corresponding to all the indices and calculating their overall classification accuracies;
and an output module, used for outputting the final decision classification result based on the overall classification accuracy.
In this technical scheme, the index set construction module builds the overall classification accuracy index set from the overall classification accuracies calculated during the accuracy evaluation of the primary classification result set, so as to characterize the differing classification performance of different classifiers on different crop types. For each index in the set, the attribution probability calculation module calculates the pixel crop-type attribution probability, and the class probability maximum calculation module calculates the class probability maximum pixel by pixel to obtain the decision classification result corresponding to that index; processing all the indices through these two modules yields the decision classification result set. The accuracy evaluation module then evaluates this set and calculates the overall classification accuracy of each result, and finally the output module outputs the final decision classification result as the long-time-series, multi-source remote sensing data, multi-classifier fused automatic crop classification result.
According to an aspect of the present specification, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method.
Compared with the prior art, the invention has the following beneficial effects:
(1) The method combines high-score data, an NDVI time series and multispectral remote sensing data with a classifier set covering multiple classifiers to construct a crop automatic classification model and realize the primary classification of crops, effectively combining the spectral detection capability of multispectral remote sensing data, the spatial-structure detection capability of high-score remote sensing data and the ability of long-time-series remote sensing data to detect crop phenological differences. It then performs decision fusion classification on the primary classification result by combining the overall classification accuracy index with the traditional majority vote mechanism, using the index to characterize the differing classification performance of different classifiers on different crop types. By fusing the strengths of different classifiers for automatic crop identification, with the overall classification accuracy index representing their per-class performance differences, the method effectively improves automatic crop classification accuracy, addresses the mixed-pixel effect, the "same object, different spectra / different objects, same spectrum" problem and the poor stability of a single classifier in remote-sensing crop classification based on a single classifier and single data source, and overcomes the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
(2) The invention effectively integrates the strengths of different classifiers for automatic crop identification, so that the final automatic crop classification accuracy is better than that of any single classifier.
(3) The invention uses the overall classification accuracy index to characterize the differing classification performance of different classifiers on different crop types, overcoming the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences; the resulting automatic crop classification accuracy is markedly better than that of those two methods.
Drawings
Fig. 1(a) is a schematic flow chart of a multi-classifier fused crop automatic classification method according to an embodiment of the present invention before primary classification.
Fig. 1(b) is a schematic diagram of the primary classification and precision evaluation flow of the multi-classifier fused crop automatic classification method according to the embodiment of the invention.
Fig. 1(c) is a schematic flow chart of the OCA-MV algorithm according to an embodiment of the present invention.
Fig. 2 is a comparison of the NDVI time-series curves of different crop types according to an embodiment of the present invention.
Fig. 3 shows the RF crop automatic classification result based on Gaofen-6 (GF6) data according to an embodiment of the present invention.
Fig. 4 shows the SVM crop automatic classification result based on Sentinel-2 data (S2) according to an embodiment of the present invention.
Fig. 5 shows the SVM crop automatic classification result based on the Sentinel-2 NDVI time-series data (S2-NDVI) according to an embodiment of the present invention.
Fig. 6 is a diagram illustrating the MV algorithm crop automatic classification results of the GF6+ S2+ S2-NDVI classification result set according to the embodiment of the present invention.
Fig. 7 is a diagram illustrating the AMV algorithm crop automatic classification results of the GF6+ S2+ S2-NDVI classification result set according to the embodiment of the present invention.
FIG. 8 is a schematic diagram of the OCA-MV algorithm crop automatic classification result of the GF6+ S2+ S2-NDVI classification result set according to the embodiment of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without any inventive step, are within the scope of the present invention.
The invention provides a multi-classifier fused crop automatic classification method which combines high-score data, an NDVI time series and multispectral remote sensing data with a classifier set covering multiple classifiers to construct a crop automatic classification model and realize the primary classification of crops, effectively combining the spectral detection capability of multispectral remote sensing data, the spatial-structure detection capability of high-score remote sensing data and the ability of long-time-series remote sensing data to detect crop phenological differences. The method then performs decision fusion classification on the primary classification result by combining the overall classification accuracy index with the traditional majority vote mechanism, using the index to characterize the differing classification performance of different classifiers on different crop types. It addresses the mixed-pixel effect, the "same object, different spectra / different objects, same spectrum" problem and the poor stability of a single classifier in remote-sensing crop classification based on a single classifier and single data source, and overcomes the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
As an embodiment, the multi-classifier fused crop automatic classification method, as shown in fig. 1(a) -1 (c), includes the following steps:
the method comprises the following steps: and acquiring high-spatial resolution remote sensing data (high-fraction data) and long-time-sequence multispectral remote sensing data, and preprocessing to obtain the high-fraction data and the multispectral time-sequence reflectivity.
The number of bands and the spatial resolution of the multispectral data should be no lower than those of Landsat-8, and the spatial resolution of the high-score data should be no coarser than 2.5 m.
Furthermore, in the data acquisition step, the high-score data acquisition time falls within the growing period of most crops in the working area; the time-series multispectral remote sensing data are preferably collected within one year, acquired by the same sensor, completely covering the study area for every period and of good image quality.
The high-score image used in this example is Gaofen-6 (GF-6) data of a typical crop planting area in Xiangride Town, Dulan County, Qinghai Province, acquired on 22 August 2021, together with cloud-free, good-quality Sentinel-2 data of the same area for 12 periods throughout the year (2021.2.9, 2021.6.4, 2021.6.29, 2021.7.2, 2021.7.22, 2021.7.29, 2021.8.26, 2021.9.7, 2021.9.22, 2021.9.30, 2021.10.12, 2021.10.17).
In the data preprocessing step, the collected high-score data and multispectral original images are subjected in sequence to radiometric correction, atmospheric correction, orthorectification and clipping to obtain the preprocessed high-score data and multispectral time-series reflectance.
The preprocessed GF-6 reflectance image is 3488 × 2788 pixels and the preprocessed Sentinel-2 data 872 × 697 pixels. The land-use types of the area comprise construction land, bare land and cultivated land, and the cultivated-land crop types comprise five crops: highland barley, quinoa, wheat, wolfberry and rape.
Step two: the vegetation-index time series (the NDVI time series) is calculated from the multispectral time-series reflectance; the NDVI time-series curves of the different crop types are compared in combination with the crop training samples to analyse their phenological differences, and the multispectral reflectance data corresponding to the times at which the NDVI of the crop types differs most during the growing period are selected to participate in classification.
Calculating the NDVI time series further comprises:
calculating the normalized difference vegetation index (NDVI) from the acquired multispectral time-series reflectance data to obtain the NDVI time series (S2-NDVI), with the calculation formula

NDVI = (ρ_NIR - ρ_Red) / (ρ_NIR + ρ_Red)

where ρ_NIR is the near-infrared band reflectance and ρ_Red is the red band reflectance of the multispectral data.
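A direct numpy rendering of this formula for a reflectance stack; the band order used here (band 3 = near-infrared, band 2 = red) is an assumption for illustration.

```python
import numpy as np

def ndvi_time_series(reflectance):
    """Compute NDVI per date from multispectral time-series reflectance.

    reflectance : ndarray, shape (n_dates, n_bands, rows, cols); band 3 is assumed
    to be near-infrared and band 2 red (hypothetical band order).
    """
    nir = reflectance[:, 3].astype(np.float32)
    red = reflectance[:, 2].astype(np.float32)
    return (nir - red) / (nir + red + 1e-10)   # small epsilon avoids division by zero

# Example with synthetic reflectance: 12 dates, 4 bands, 100 x 100 pixels.
refl = np.random.rand(12, 4, 100, 100).astype(np.float32)
ndvi = ndvi_time_series(refl)   # shape (12, 100, 100)
```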
Obtaining a crop type training sample further comprises:
collecting crop type space distribution data in a working area, or carrying out field investigation, selecting crop type samples in the working area, randomly dividing the crop sample area into training samples and verification samples in a layered random sampling mode according to the proportion of 1:1, and respectively using the training samples and the verification samples for training an automatic classification model and evaluating the precision of classification results.
As an implementation, a field investigation of Xiangride Town was carried out on 24 August 2021, and sample areas of each crop were collected with the field block as the unit: 8 training sample areas of wolfberry, 14 of quinoa, 5 of highland barley, 18 of wheat, 13 of rape, 8 of construction land, 8 of bare land, and 8 verification sample areas of rape were obtained.
The step of participating in classification of the multispectral reflectance data selection further comprises:
the NDVI time sequence and training samples of various crops are utilized to obtain NDVI mean value time sequence curves of training areas of different crop types, the difference of phenological characteristics of various crops is analyzed, and multispectral reflectivity data corresponding to the moment when the NDVI of various crops is greatly different are selected to participate in classification by comparing the difference of the NDVI time sequence curves of various crops.
The NDVI mean timing curves obtained for the training samples for each crop type in S2-NDVI are shown in fig. 2.
The NDVI mean value difference of training samples of various crop types in NDVI values of stages 2021.7.29, 2021.8.26 and 2021.9.7 is found by comparing NDVI time sequence curves of different crops, and the NDVI time sequence curves are favorable for distinguishing various types of crops.
Sentinel-2 data collected 2021.8.26 was selected to participate in classification as multispectral reflectance data, taking into account GF-6 data collection time (2021.8.22) and field validation time (2021.8.24).
Step three: using the classifier set and the crop training samples, primary crop-type classification is performed on the high-score data reflectance, the NDVI time-series data and the multispectral reflectance data selected in step two to obtain a primary classification result set, and its accuracy is evaluated.
The primary classification of crop types further comprises:
performing primary crop-type classification on the high-score data reflectance (GF6), the NDVI time-series data (S2-NDVI) and the selected multispectral reflectance data (S2) with multiple classifiers to obtain a classification result set. Various supervised classification algorithms are suitable, including the Minimum Distance (MD), Maximum Likelihood (ML), Mahalanobis Distance (MAD), Spectral Angle Mapper (SAM), Spectral Information Divergence (SID), Support Vector Machine (SVM) and Random Forest (RF) algorithms.
A crop automatic classification model is established with the 3 kinds of data (GF6, S2 and S2-NDVI) and the 7 classification algorithms, giving a classification result set (GF6-MD, GF6-ML, GF6-MAD, GF6-SAM, GF6-SID, GF6-SVM, GF6-RF, S2-MD, S2-ML, S2-MAD, S2-SAM, S2-SID, S2-SVM, S2-RF, S2-NDVI-MD, S2-NDVI-ML, S2-NDVI-MAD, S2-NDVI-SAM, S2-NDVI-SID, S2-NDVI-SVM, S2-NDVI-RF), as sketched below.
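The loop below sketches how such a result set might be assembled in Python; only three of the seven classifiers are shown via scikit-learn stand-ins (NearestCentroid for minimum distance, SVC and RandomForestClassifier), since the spectral-similarity classifiers (SAM, SID, etc.) are usually provided by remote-sensing software rather than general machine-learning libraries.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestCentroid   # stand-in for the minimum distance classifier
from sklearn.svm import SVC

def primary_classification(data_sources, train_X, train_y):
    """Build a primary classification result set over all (data source, classifier) pairs.

    data_sources : dict, name -> ndarray of per-pixel features, shape (n_pixels, n_features)
    train_X, train_y : dicts keyed by the same names, holding training features and labels
    """
    classifiers = {
        "MD": NearestCentroid(),
        "SVM": SVC(kernel="rbf"),
        "RF": RandomForestClassifier(n_estimators=200),
    }
    results = {}
    for dname, pixels in data_sources.items():
        for cname, clf in classifiers.items():
            model = clf.fit(train_X[dname], train_y[dname])   # re-fit for each data source
            results[f"{dname}-{cname}"] = model.predict(pixels)
    return results   # e.g. keys "GF6-RF", "S2-SVM", "S2-NDVI-MD", ...
```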
The accuracy evaluation of the classification result set further comprises:
based on the verification samples, evaluating the accuracy of the primary classification result set with the confusion matrix method and calculating the accuracy evaluation indices: overall accuracy (OA), producer's accuracy (PA), user's accuracy (UA) and the Kappa coefficient. OA is the ratio of the number of pixels correctly classified by the classifier to the total number of pixels to be classified; PA is the ratio of the number of correctly classified pixels of a given crop type to the total number of pixels of that type in the verification data set; UA is the ratio of the number of correctly classified pixels of a given crop type to the total number of pixels classified as that type.
The Kappa coefficient is calculated as

Kappa = (p_o - p_e) / (1 - p_e)

where p_o is the proportion of correctly classified pixels in the confusion matrix (the observed agreement) and p_e is the proportion of agreement expected by chance, i.e. attributable to incidental factors.
Let OA_ij denote the overall classification accuracy of crop classification obtained with the i-th kind of data and the j-th classifier, Kappa_ij the corresponding Kappa coefficient, and UA_ijk the user's accuracy of the k-th crop type when the i-th kind of data is classified with the j-th classifier.
The accuracy evaluation results of the primary classification result set are shown in table 1.
Table 1. Single-data-source, single-classifier accuracy evaluation results (unit: %). (The table body is provided as an image in the original publication.)
According to the accuracy evaluation results, the RF algorithm obtained the highest overall accuracy (74.48%) for the GF6 data; the classification result is shown in Fig. 3. In terms of per-class accuracy (the mean of PA and UA), the SAM algorithm was highest for wolfberry and wheat (74.78% and 73.11%), the ML algorithm highest for quinoa (71.09%), and the MAD algorithm highest for highland barley and wheat (70.88% and 84.77%).
For the S2 data, the SVM algorithm obtained the highest overall accuracy (80.88%); the classification result is shown in Fig. 4. In terms of per-class accuracy, the ML algorithm was highest for wolfberry and wheat (73.95% and 83.56%), the SID algorithm highest for quinoa (65.99%), the SVM algorithm highest for highland barley (74.19%), and the MAD algorithm highest for rape (73.64%).
For the S2-NDVI data, the SVM algorithm obtained the highest overall accuracy (81.07%); the classification result is shown in Fig. 5. In terms of per-class accuracy, the RF algorithm was highest for wolfberry (78.96%), the SVM algorithm highest for quinoa and wheat (78.31% and 80.68%), the MAD algorithm highest for highland barley (81.58%), and the ML algorithm highest for rape (60.56%).
These accuracy evaluation results show that different classifiers differ considerably in accuracy when classifying different data, and also differ considerably in performance across crop types, so a single data source and a single classifier can hardly deliver stable automatic crop classification accuracy.
Step four: the OCA-MV algorithm is executed on the primary classification result set to complete decision fusion classification.
In this embodiment, the OCA-MV algorithm is executed on the GF6 classification result set (7 classification results), the S2 classification result set (7 results), the S2-NDVI classification result set (7 results), the combined GF6 + S2 result set (14 results), the combined GF6 + S2-NDVI result set (14 results), the combined S2 + S2-NDVI result set (14 results) and the combined result set of all three kinds of data (21 results) to obtain the decision fusion classification results; the conventional MV and AMV algorithms are executed at the same time, and the results and classification accuracies of the decision fusion algorithms are compared.
Step 4.1: construct the overall classification accuracy index set.
An overall classification accuracy index set, OCA, is constructed for each data source from each crop-type class in the classification result of each classifier, and is used to characterize the differing classification performance of different classifiers on different crop types. The set comprises 8 overall classification accuracy indices; their calculation formulas are given as embedded images in the original publication and are not reproduced here.
Step 4.2: calculate the pixel crop-type attribution probability.
For each data source, each classifier and each crop type, the corresponding overall classification accuracy index is calculated and assigned pixel by pixel according to the pixel's classified category, and the probability P_k(x, y) that each pixel belongs to the k-th crop type is then calculated (the formula is given as an embedded image in the original publication), where n is the number of data sources, m is the number of classifiers, x and y are the row and column indices of the classified image, and P_k(x, y) is the probability that pixel (x, y) falls into the k-th crop type.
Step 4.3: calculate the class probability maximum pixel by pixel, P_max(x, y) = max over k = 1, ..., K of P_k(x, y). According to the majority vote mechanism, each pixel is assigned to the crop type corresponding to P_max(x, y), where K is the number of crop type categories. This completes the decision fusion of the primary classification results and yields the multi-data-source, multi-classifier decision-fusion crop classification result CL.
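A worked single-pixel illustration of steps 4.2 and 4.3, with assumed weight values rather than the patented index formulas: two data sources and two classifiers vote on one pixel, the index value of each (data source, classifier) pair is accumulated toward the class it assigns, and the class with the maximum accumulated probability wins.

```python
# Hypothetical single-pixel example: votes are the classes assigned by each
# (data source, classifier) pair; weights are their assumed index values.
votes   = [1, 1, 2, 1]
weights = [0.74, 0.81, 0.81, 0.69]

prob = {}
for k, w in zip(votes, weights):
    prob[k] = prob.get(k, 0.0) + w      # accumulate weight toward the voted class
pixel_class = max(prob, key=prob.get)    # class probability maximum -> class 1 here (2.24 vs 0.81)
```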
Step 4.4: for each of the 8 overall classification accuracy indices in the set (OCA_1, ..., OCA_8), steps 4.2 and 4.3 are repeated, yielding a decision classification result set {CL_1, ..., CL_8}.
Step 4.5: evaluate the accuracy of the decision classification result set.
Based on the verification samples, the confusion matrix method is used to evaluate the accuracy of the decision classification result set {CL_1, ..., CL_8} obtained in step 4.4, and the overall classification accuracy of each result is calculated.
Step 4.6: obtain the final decision classification result.
The maximum overall classification accuracy is identified and its corresponding classification result CL is output, finally giving the long-time-series, multi-source remote sensing data, multi-classifier fused automatic crop classification result.
In this embodiment, the accuracy of the decision classification results of the OCA-MV, MV and AMV algorithms is evaluated for each classification result set; the results are shown in Table 2.
Table 2. Multi-data-source, multi-classifier decision classification accuracy evaluation results. (The table body is provided as an image in the original publication.)
The accuracy evaluation results show that the multi-classifier crop decision classification accuracy is better than that of the single-data-source, single-classifier results, and that the crop classification accuracy of the proposed OCA-MV algorithm is better than that of the traditional MV and AMV algorithms.
The results of automatic crop classification using the MV, AMV and OCA-MV algorithms on the combined GF6, S2 and S2-NDVI data are shown in Figs. 6, 7 and 8, respectively.
The decision fusion classification results for the GF6, S2 and S2-NDVI classification result sets show that the proposed OCA-MV method obtains the highest overall accuracy (OA of 77.29%, 82.14% and 82.00%, respectively), clearly better than the traditional decision classification algorithms (MV and AMV); compared with the best single-data-source, single-classifier results (GF6-RF, S2-SVM and S2-NDVI-SVM), the accuracy is improved by 2.81%, 1.26% and 0.93%, respectively, reflecting the advantage of multi-classifier decision-fusion automatic crop classification.
The decision fusion classification results for the GF6 + S2, GF6 + S2-NDVI, S2 + S2-NDVI and GF6 + S2 + S2-NDVI classification result sets show that the proposed OCA-MV method obtains the highest overall accuracy (OA of 82.70%, 82.36%, 84.20% and 85.81%, respectively), clearly better than the traditional decision classification algorithms (MV and AMV) and better than the single-data-source multi-classifier results, demonstrating the advantage of the proposed automatic crop classification method based on long-time-series, multi-source remote sensing data and the advancement of the proposed OCA-MV algorithm.
According to an aspect of the present specification, there is provided a multi-classifier fused crop automatic classification system, the system comprising: a data acquisition and preprocessing module, used for acquiring and preprocessing data to obtain high-score data and multispectral time-series reflectance; a calculation module, used for calculating an NDVI time series from the multispectral time-series reflectance; a multispectral image selection module, used for selecting the multispectral image data participating in classification based on the calculated NDVI time series in combination with the crop type training samples; a primary classification module, used for inputting the high-score data, the NDVI time series and the selected multispectral image data into the classifier set to obtain a primary classification result set; and a decision fusion classification module, used for executing the OCA-MV algorithm on the primary classification result set to complete decision fusion classification.
The system obtains the raw data through the data acquisition and preprocessing module and preprocesses them to yield the high-score data and multispectral time-series reflectance. The NDVI time series produced by the calculation module is input to the primary classification module, so that the ability of long-time-series remote sensing data to detect crop phenological differences is fully used during primary classification. The multispectral image selection module, taking the NDVI time series as the reference, selects the multispectral image data participating in classification from all periods, making full use of the spectral detection capability of multispectral remote sensing data while keeping computation efficient. The primary classification module performs the primary classification of the input source data, and the decision fusion classification module applies to the primary classification result set an optimization algorithm that combines the overall classification accuracy index with the traditional majority vote mechanism to obtain the final crop classification result. By fusing the strengths of different classifiers for automatic crop identification, with the overall classification accuracy index representing their per-class performance differences, the system effectively improves automatic crop classification accuracy, addresses the mixed-pixel effect, the "same object, different spectra / different objects, same spectrum" problem and the poor stability of a single classifier in remote-sensing crop classification based on a single classifier and single data source, and overcomes the inability of the traditional majority-vote decision-level fusion strategy and the adaptive majority vote method to comprehensively reflect classifier performance differences.
The decision fusion classification module further comprises: an index set construction module, used for constructing the overall classification accuracy index set; an attribution probability calculation module, used for calculating the pixel crop-type attribution probability for each index in the set; a class probability maximum calculation module, used for calculating the class probability maximum pixel by pixel from the pixel crop-type attribution probabilities to obtain the decision classification result corresponding to each index; an accuracy evaluation module, used for evaluating the accuracy of the decision classification results corresponding to all the indices and calculating their overall classification accuracies; and an output module, used for outputting the final decision classification result based on the overall classification accuracy.
The index set construction module builds the overall classification accuracy index set from the overall classification accuracies calculated during the accuracy evaluation of the primary classification result set, so as to characterize the differing classification performance of different classifiers on different crop types. For each index in the set, the attribution probability calculation module calculates the pixel crop-type attribution probability, and the class probability maximum calculation module calculates the class probability maximum pixel by pixel to obtain the decision classification result corresponding to that index; processing all the indices through these two modules yields the decision classification result set. The accuracy evaluation module then evaluates this set and calculates the overall classification accuracy of each result, and finally the output module outputs the final decision classification result as the long-time-series, multi-source remote sensing data, multi-classifier fused automatic crop classification result.
According to an aspect of the present specification, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method.
In the description herein, references to the description of the terms "one embodiment," "certain embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and these modifications or substitutions do not depart from the essence of the corresponding technical solutions.
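For completeness, the primary classification step and its confusion-matrix accuracy evaluation described above can be sketched as follows, again in Python. The four scikit-learn classifiers shown here are placeholders chosen for illustration only; the patent's classifier set consists of seven supervised classification algorithms and is not limited to these, and the function name primary_classification is an assumption of this sketch.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import confusion_matrix

    def primary_classification(features, train_idx, train_labels, valid_idx, valid_labels):
        # features : (n_pixels, n_features) stacked inputs, e.g. high-resolution bands,
        #            the NDVI time series and the selected multispectral bands
        classifier_set = {
            "random_forest": RandomForestClassifier(n_estimators=200),
            "svm": SVC(),
            "knn": KNeighborsClassifier(),
            "gradient_boosting": GradientBoostingClassifier(),
        }
        primary_maps, accuracies = {}, {}
        for name, clf in classifier_set.items():
            clf.fit(features[train_idx], train_labels)
            pred = clf.predict(features)                      # primary classification result
            cm = confusion_matrix(valid_labels, pred[valid_idx])
            accuracies[name] = np.trace(cm) / cm.sum()        # overall classification accuracy
            primary_maps[name] = pred
        return primary_maps, accuracies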

Claims (9)

1. An automatic crop classification method based on multi-classifier fusion, characterized by comprising the following steps:
performing data acquisition and preprocessing to obtain preprocessed high-resolution data and multispectral time-series reflectance;
calculating an NDVI time series based on the multispectral time-series reflectance;
selecting multispectral image data to participate in classification based on the calculated NDVI time series in combination with crop type training samples;
inputting the high-resolution data, the NDVI time series and the selected multispectral image data into a classifier set to obtain a primary classification result set;
and executing an OCA-MV algorithm, which combines an overall classification accuracy index with a majority vote mechanism, on the primary classification result set to complete decision fusion classification.
2. The method of claim 1, wherein the data acquisition and preprocessing further comprises:
collecting high-resolution data at a time when more than a preset percentage of the crops in the work area are in their growing period, together with multispectral original image data of all acquisition dates within one year;
and preprocessing the collected high-resolution data and multispectral original image data to obtain the preprocessed high-resolution data and multispectral time-series reflectance.
3. The method of claim 1, wherein selecting the multispectral image data to participate in classification further comprises:
acquiring training samples of each crop type and, in combination with the calculated NDVI time series, obtaining mean NDVI time-series curves for the training areas of the different crop types;
and comparing the NDVI time-series curves of the different crop types, with the mean NDVI time-series curves of the training areas as the reference, and selecting the multispectral reflectance data corresponding to the time at which the difference between the curves of the different crop types is largest as the multispectral image data participating in classification.
4. The method of claim 1, further comprising: constructing an automatic crop classification model based on the 3 types of input data and 7 supervised classification algorithms, and outputting a primary classification result set.
5. The method according to claim 4, wherein accuracy evaluation is performed on the primary classification result set by a confusion matrix method based on crop verification samples, and accuracy evaluation indices are calculated.
6. The method of claim 1, wherein executing the OCA-MV algorithm on the primary classification result set to complete decision fusion classification further comprises:
constructing an overall classification accuracy index set;
calculating the crop type attribution probability of each pixel and, based on these attribution probabilities, taking the maximum class probability pixel by pixel to obtain the decision classification result corresponding to a single overall classification accuracy index;
performing the attribution probability calculation and the pixel-by-pixel maximum class probability calculation for every overall classification accuracy index in the index set, one index at a time, to obtain a decision classification result set;
performing accuracy evaluation on the decision classification result set, and calculating the overall classification accuracy;
and obtaining a final decision classification result based on the overall classification accuracy.
7. An automatic crop classification system based on multi-classifier fusion, the system comprising:
a data acquisition and preprocessing module for acquiring and preprocessing data to obtain high-resolution data and multispectral time-series reflectance;
a calculation module for calculating an NDVI time series based on the multispectral time-series reflectance;
a multispectral image selection module for selecting multispectral image data to participate in classification based on the calculated NDVI time series in combination with crop type training samples;
a primary classification module for inputting the high-resolution data, the NDVI time series and the selected multispectral image data into a classifier set to obtain a primary classification result set;
and a decision fusion classification module for executing the OCA-MV algorithm on the primary classification result set to complete decision fusion classification.
8. The automatic crop classification system based on multi-classifier fusion according to claim 7, wherein the decision fusion classification module further comprises:
an index set construction module for constructing an overall classification accuracy index set;
an attribution probability calculation module for calculating, for each index in the overall classification accuracy index set, the probability that each pixel belongs to each crop type;
a class probability maximum calculation module for taking, pixel by pixel, the maximum of the class attribution probabilities to obtain the decision classification result corresponding to each index;
an accuracy evaluation module for performing accuracy evaluation on the decision classification results corresponding to all indices and calculating the overall classification accuracy;
and an output module for outputting a final decision classification result based on the overall classification accuracy.
9. A computer-readable storage medium having stored thereon a computer program, characterized in that,
the computer program, when executed by a processor, implements the method of any one of claims 1 to 6.
CN202210548609.7A 2022-05-20 2022-05-20 Multi-classifier fused crop automatic classification method and system and storage medium Active CN114639005B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210548609.7A CN114639005B (en) 2022-05-20 2022-05-20 Multi-classifier fused crop automatic classification method and system and storage medium
NL2034106A NL2034106B1 (en) 2022-05-20 2023-02-07 Automatic crop classification method and system based on multi-classifier fusion, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210548609.7A CN114639005B (en) 2022-05-20 2022-05-20 Multi-classifier fused crop automatic classification method and system and storage medium

Publications (2)

Publication Number Publication Date
CN114639005A true CN114639005A (en) 2022-06-17
CN114639005B CN114639005B (en) 2022-10-21

Family

ID=81953079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210548609.7A Active CN114639005B (en) 2022-05-20 2022-05-20 Multi-classifier fused crop automatic classification method and system and storage medium

Country Status (2)

Country Link
CN (1) CN114639005B (en)
NL (1) NL2034106B1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103258212A (en) * 2013-04-03 2013-08-21 Northeast Institute of Geography and Agroecology, Chinese Academy of Sciences Semi-supervised integrated remote-sensing image classification method based on attractor propagation clustering
CN103489005A (en) * 2013-09-30 2014-01-01 Hohai University High-resolution remote sensing image classifying method based on fusion of multiple classifiers
CN106355030A (en) * 2016-09-20 2017-01-25 Zhejiang University Fault detection method based on analytic hierarchy process and weighted vote decision fusion
CN108399423A (en) * 2018-02-01 2018-08-14 Nanjing University A kind of multidate-Combining Multiple Classifiers of classification of remote-sensing images
CN108629494A (en) * 2018-04-19 2018-10-09 China Three Gorges University Arid grade appraisal procedure and system
CN109711446A (en) * 2018-12-18 2019-05-03 Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences A kind of terrain classification method and device based on multispectral image and SAR image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NIU Ming'ang et al.: "Comparative study of multi-classifier fusion and single-classifier image classification", Mine Surveying (矿山测量) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115063690A (en) * 2022-06-24 2022-09-16 University of Electronic Science and Technology of China Vegetation classification method based on NDVI (normalized difference vegetation index) time sequence characteristics
CN115063690B (en) * 2022-06-24 2024-06-07 University of Electronic Science and Technology of China Vegetation classification method based on NDVI time sequence characteristics

Also Published As

Publication number Publication date
NL2034106B1 (en) 2024-06-19
NL2034106A (en) 2023-11-27
CN114639005B (en) 2022-10-21

Similar Documents

Publication Publication Date Title
Hu et al. A phenology-based spectral and temporal feature selection method for crop mapping from satellite time series
Feng et al. Crop type identification and mapping using machine learning algorithms and sentinel-2 time series data
Clark Comparison of simulated hyperspectral HyspIRI and multispectral Landsat 8 and Sentinel-2 imagery for multi-seasonal, regional land-cover mapping
Luo et al. Comparison of machine learning algorithms for mapping mango plantations based on Gaofen-1 imagery
CN113657158B (en) Google EARTH ENGINE-based large-scale soybean planting area extraction algorithm
CN116543316B (en) Method for identifying turf in paddy field by utilizing multi-time-phase high-resolution satellite image
Sun et al. Mapping tropical dry forest age using airborne waveform LiDAR and hyperspectral metrics
CN111126511A (en) Vegetation index fusion-based LAI quantitative model establishment method
Yuan et al. Research on rice leaf area index estimation based on fusion of texture and spectral information
CN114639005B (en) Multi-classifier fused crop automatic classification method and system and storage medium
CN113901966B (en) Crop classification method fusing multi-source geographic information data
Chellasamy et al. A multievidence approach for crop discrimination using multitemporal worldview-2 imagery
Shen et al. High-throughput phenotyping of individual plant height in an oilseed rape population based on Mask-RCNN and UAV images
CN116258844A (en) Rapid and accurate identification method for phenotype character of cotton leaf
CN115063610A (en) Soybean planting area identification method based on Sentinel-1 and 2 images and area measurement method thereof
Li et al. Crop region extraction of remote sensing images based on fuzzy ARTMAP and adaptive boost
CN113205565A (en) Forest biomass remote sensing mapping method based on multi-source data
Kozoderov et al. Models of pattern recognition and forest state estimation based on hyperspectral remote sensing data
Cheng et al. Crop aboveground biomass monitoring model based on UAV spectral index reconstruction and Bayesian model averaging: A case study of film-mulched wheat and maize
Ding et al. Extraction of soybean planting areas based on multi-temporal Sentinel-1/2 data
Gao et al. Optimal feature selection and crop extraction using random forest based on GF-6 WFV data
Wang et al. RETRACTED: Theoretical research on rice and wheat lodging detection based on artificial intelligence technology and a template matching algorithm
Chadwick Enhancing post-harvest regeneration monitoring with digital aerial photogrammetry and deep learning
Jepkosgei Evaluating the factors influencing farmers’ choices of maize-based cropping patterns and assessing the potential of desis hyperspectral satellite data to discriminate the cropping patterns.
Okada et al. High-Throughput Phenotyping of Soybean Biomass: Conventional Trait Estimation and Novel Latent Feature Extraction Using UAV Remote Sensing and Deep Learning Models

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant