CN115271217A - Wheat yield prediction method based on multi-source remote sensing data of unmanned aerial vehicle - Google Patents
Wheat yield prediction method based on multi-source remote sensing data of unmanned aerial vehicle
- Publication number
- CN115271217A (application number CN202210913918.XA)
- Authority
- CN
- China
- Prior art keywords
- wheat
- data
- yield
- aerial vehicle
- unmanned aerial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/58—Extraction of image or video features relating to hyperspectral data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
Abstract
The invention belongs to the technical field of wheat yield prediction, and particularly relates to a wheat yield prediction method based on multi-source remote sensing data of an unmanned aerial vehicle.
Description
Technical Field
The invention relates to the technical field of wheat yield prediction, in particular to a wheat yield prediction method based on multi-source remote sensing data of an unmanned aerial vehicle.
Background
Grain is a strategic commodity vital to the national economy and people's livelihood, and grain security is closely tied to economic development and social harmony. Water shortages, land degradation, frequent natural disasters and increasingly severe agricultural pollution seriously constrain the stability and quality of grain production. Winter wheat is one of the main grain crops in China; predicting its yield accurately and in a timely manner provides strong support for agricultural decision-making and operation management, and is an urgent need for developing precision agriculture and pursuing sustainable development.
At present, wheat yield is generally estimated from satellite remote sensing, which is valuable for macro-scale decision-making. However, satellite imagery suffers from long revisit periods, low spatial resolution, mixed pixels and weather limitations, so it offers little practical help for the day-to-day operation and management of agricultural producers. The present invention therefore provides a wheat yield prediction method based on multi-source remote sensing data from an unmanned aerial vehicle.
Disclosure of Invention
The invention is made in view of the problems described above in existing wheat yield prediction approaches.
Accordingly, the object of the invention is to provide a wheat yield prediction method based on multi-source remote sensing data of an unmanned aerial vehicle that addresses these problems.
To solve the above technical problem, according to an aspect of the present invention, the present invention provides the following technical solutions:
a wheat yield prediction method based on multi-source remote sensing data of an unmanned aerial vehicle comprises the following specific steps:
step one: setting up a ground measurement experiment, and sampling and weighing the wheat at maturity to obtain wheat yield data;
step two: acquiring UAV multi-source remote sensing image data during each key growth stage of the wheat, using a multi-rotor UAV remote sensing platform that simultaneously carries a multispectral camera, a thermal imager and two RGB cameras, wherein the multispectral camera acquires multispectral images, the thermal imager acquires thermal infrared images, and the two RGB cameras acquire RGB images from different angles;
step three: extracting spectral information from the multispectral data, performing radiometric calibration on the wheat multispectral images to obtain reflectance data of the wheat canopy, and deriving seventeen vegetation indices from the reflectance data;
step four: extracting structural information from the RGB and multispectral data, generating a digital elevation model (DEM) from RGB images acquired by the UAV before wheat sowing and a digital surface model (DSM) from RGB images acquired after sowing, obtaining canopy height as DSM minus DEM, then using the maximum between-class variance method with threshold segmentation to produce a binary mask of weeds and soil from the multispectral orthomosaic, removing the weeds and soil from the image with the mask, and calculating the vegetation coverage within each plot from the masked image;
step five: extracting temperature information from the thermal infrared data by computing the normalized relative canopy temperature NRCT = (T_i − T_min)/(T_max − T_min) from the thermal infrared image for yield prediction, where T_i is the temperature of the i-th pixel, T_min is the minimum temperature of the entire field and T_max is the maximum temperature of the entire field;
step six: calculating a grey-level co-occurrence matrix (GLCM) from the red, green, red-edge and near-infrared bands of the multispectral image and from the canopy height and normalized relative canopy temperature images, and deriving eight texture feature factors: mean (ME), variance (VA), homogeneity (HO), contrast (CO), dissimilarity (DI), entropy (EN), second moment (SE) and correlation (CR);
step seven: using the extracted parameter information and the measured yields of the wheat sample plots as model inputs and modeling with a deep neural network (DNN), with 70% of the samples randomly selected for training and 30% for validation, wherein the extracted multi-source parameters are first fused by the DNN and the fused information is then trained by the DNN to establish the wheat yield prediction model;
step eight: validating the wheat yield prediction model against the measured wheat yield data, evaluating model performance with R² and the root mean square error (RMSE), testing yield differences among wheat varieties with one-way analysis of variance, and testing the spatial correlation of the model's prediction errors across the sub-plots with the global Moran's I.
As a preferred scheme of the wheat yield prediction method based on multi-source UAV remote sensing data according to the invention: in the yield measurement experiment of step one, the planting area is divided into sub-plots of 12.3 m² or 13.8 m², the wheat in each sub-plot is sampled and weighed, and the sub-plots are distributed as evenly as possible across the planting area.
As a preferred scheme of the wheat yield prediction method based on multi-source UAV remote sensing data according to the invention: in step two, the UAV image acquisition covers at least the key growth stages of the wheat, including the tillering, elongation, booting, flowering, grain-filling and maturity stages.
As a preferred scheme of the wheat yield prediction method based on multi-source UAV remote sensing data according to the invention: the spectral information in step three comprises the green, red, red-edge and near-infrared band reflectance together with the ratio vegetation index, chlorophyll index, red-edge chlorophyll index, normalized difference vegetation index (NDVI), green normalized difference vegetation index, green-red vegetation index, normalized difference red edge index, simplified canopy chlorophyll content index, enhanced vegetation index, two-band enhanced vegetation index, optimized soil-adjusted vegetation index (OSAVI), modified chlorophyll absorption ratio index (MCARI), transformed chlorophyll absorption ratio index (TCARI), MCARI/OSAVI, TCARI/OSAVI and the wide dynamic range vegetation index, seventeen vegetation indices in total.
As a preferred scheme of the wheat yield prediction method based on multi-source UAV remote sensing data according to the invention: the texture feature indices in step six are computed from the normalized grey-level co-occurrence matrix P_i,j.
As a preferred scheme of the wheat yield prediction method based on multi-source UAV remote sensing data according to the invention: the spectral information, structural information, texture features and temperature information all have the weed and soil background removed, and the mean value of each parameter within each sub-plot is used as an input parameter.
As a preferred scheme of the wheat yield prediction method based on multi-source UAV remote sensing data according to the invention: in step seven, a wheat yield prediction model is built with a deep neural network from the multi-source parameters and the ground-measured yield data. The extracted multi-source parameters are first fused by the deep neural network and the fused information is then trained by the deep neural network. In the data fusion stage, the training network for the spectral information comprises four convolutional layers with 32, 64, 128 and 256 neural nodes per layer, the network for the structural and temperature information comprises two convolutional layers with 32 and 64 nodes per layer, and the network for the texture information comprises five convolutional layers with 32, 64, 128, 256 and 512 nodes per layer; the multi-source fused data is obtained after this first training stage. In the yield prediction stage, the multi-source fused data is used as the input and the predicted yield as the output, and the deep neural network comprises three convolutional layers with 512, 512 and 1024 neural nodes per layer.
As a preferred scheme of the wheat yield prediction method based on multi-source UAV remote sensing data according to the invention: in step eight, to test the applicability of the prediction model across different wheat varieties, one-way analysis of variance is used to test the yield differences among the three varieties; in addition, to test the model's ability to handle spatial differences caused by soil, irrigation conditions and other environmental factors, the global Moran's I is used to test the spatial correlation of the model's prediction errors across the sub-plots.
Compared with the prior art:
the invention utilizes an unmanned aerial vehicle remote sensing platform carrying multiple sensors to extract data such as reflectivity, vegetation index and temperature from acquired images such as multispectral images, RGB images and thermal infrared images for multi-source data fusion, and establishes a wheat yield prediction deep learning model by combining measured yield data, thereby realizing wheat yield prediction and growth monitoring.
Drawings
FIG. 1 is a schematic diagram of parameter information extraction according to the present invention;
FIG. 2 is a schematic diagram of a multi-modal deep neural network prediction model according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The invention provides a wheat yield prediction method based on multi-source remote sensing data of an unmanned aerial vehicle; referring to fig. 1-2, the method comprises the following specific steps:
step one: setting up a ground measurement experiment, and sampling and weighing the wheat at maturity to obtain wheat yield data;
wherein, in the wheat yield measurement experiment, the planting area is divided into sub-plots of 12.3 m² or 13.8 m², the wheat in each sub-plot is sampled and weighed, and the sub-plots are distributed as evenly as possible across the planting area;
step two: acquiring UAV multi-source remote sensing image data during each key growth stage of the wheat, using a multi-rotor UAV remote sensing platform that simultaneously carries a multispectral camera, a thermal imager and two RGB cameras, wherein the multispectral camera acquires multispectral images, the thermal imager acquires thermal infrared images, and the two RGB cameras acquire RGB images from different angles (nadir and 45° oblique);
wherein the UAV image acquisition covers at least the key growth stages of the wheat, including the tillering, elongation, booting, flowering, grain-filling and maturity stages;
step three: extracting spectral information from the multispectral data, performing radiometric calibration on the wheat multispectral images to obtain reflectance data of the wheat canopy, and deriving seventeen vegetation indices from the reflectance data;
wherein the spectral information includes the green (G), red (R), red-edge (RE) and near-infrared (NIR) band reflectance, the ratio vegetation index (RVI), green chlorophyll index (GCI), red-edge chlorophyll index (RECI), normalized difference vegetation index (NDVI), green normalized difference vegetation index (GNDVI), green-red vegetation index (GRVI), normalized difference red edge (NDRE), normalized difference red edge index (NDREI), simplified canopy chlorophyll content index (SCCCI), enhanced vegetation index (EVI), two-band enhanced vegetation index (EVI2), optimized soil-adjusted vegetation index (OSAVI), modified chlorophyll absorption ratio index (MCARI), transformed chlorophyll absorption ratio index (TCARI), MCARI/OSAVI, TCARI/OSAVI and the wide dynamic range vegetation index (WDRVI), seventeen vegetation indices in total;
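The patent's table of index formulas is supplied as a figure and is not reproduced in this text. As a rough illustration only, the sketch below computes a handful of the named indices from per-plot mean band reflectance using their standard published definitions; the function name and the 0.16 constant in OSAVI follow common literature forms and are not taken from the patent.

```python
# Illustrative sketch (not from the patent): a few of the seventeen vegetation
# indices, computed from mean reflectance of the green, red, red-edge and
# near-infrared bands using their standard published definitions.
import numpy as np

def vegetation_indices(g, r, re, nir):
    """Return a dict of example vegetation indices from band reflectance arrays."""
    eps = 1e-6                                   # guard against division by zero
    ndvi = (nir - r) / (nir + r + eps)           # normalized difference vegetation index
    gndvi = (nir - g) / (nir + g + eps)          # green NDVI
    ndre = (nir - re) / (nir + re + eps)         # normalized difference red edge
    rvi = nir / (r + eps)                        # ratio vegetation index
    gci = nir / (g + eps) - 1.0                  # green chlorophyll index
    reci = nir / (re + eps) - 1.0                # red-edge chlorophyll index
    osavi = 1.16 * (nir - r) / (nir + r + 0.16)  # optimized soil-adjusted vegetation index
    sccci = ndre / (ndvi + eps)                  # simplified canopy chlorophyll content index
    return {"NDVI": ndvi, "GNDVI": gndvi, "NDRE": ndre, "RVI": rvi,
            "GCI": gci, "RECI": reci, "OSAVI": osavi, "SCCCI": sccci}

# Example with per-plot mean reflectance values
print(vegetation_indices(np.array([0.08]), np.array([0.05]),
                         np.array([0.20]), np.array([0.45])))
```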
step four: extracting structural information from the RGB and multispectral data, generating a digital elevation model (DEM) from RGB images acquired by the UAV before wheat sowing and a digital surface model (DSM) from RGB images acquired after sowing, obtaining canopy height as DSM minus DEM, then using the maximum between-class variance (Otsu) method with threshold segmentation to produce a binary mask of weeds and soil from the multispectral orthomosaic, removing the weeds and soil from the image with the mask, and calculating the vegetation coverage within each plot from the masked image;
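A minimal sketch of this structural step follows, assuming the DSM and DEM rasters are already co-registered and using an NDVI layer for the Otsu threshold; the array names, the NDVI-based masking and the pixel-ratio definition of coverage are assumptions for illustration, since the patent only names the maximum between-class variance method.

```python
# Sketch (assumptions noted above): canopy height from DSM - DEM, Otsu threshold
# on an NDVI layer to mask soil/weeds, and fractional vegetation cover per
# sub-plot as vegetated pixels over total plot pixels.
import numpy as np
from skimage.filters import threshold_otsu

def canopy_height(dsm, dem):
    """Canopy height model: post-sowing surface model minus pre-sowing terrain model."""
    return dsm - dem

def vegetation_cover(ndvi, plot_mask):
    """Fraction of vegetated pixels inside one sub-plot (plot_mask is a boolean raster)."""
    t = threshold_otsu(ndvi[plot_mask])          # maximum between-class variance threshold
    vegetated = (ndvi > t) & plot_mask           # pixels kept after removing weeds/soil
    return vegetated.sum() / plot_mask.sum()
```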
step five: extracting temperature information from the thermal infrared data by computing the normalized relative canopy temperature NRCT = (T_i − T_min)/(T_max − T_min) from the thermal infrared image for yield prediction, where T_i is the temperature of the i-th pixel, T_min is the minimum temperature of the entire field and T_max is the maximum temperature of the entire field;
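The normalized relative canopy temperature, as reconstructed above from the variable definitions, can be computed directly; a minimal sketch, assuming the field-wide extremes are taken from the thermal raster itself:

```python
# Normalized relative canopy temperature: T_i is each pixel's temperature,
# T_min/T_max the field-wide minimum and maximum.
import numpy as np

def nrct(temperature):
    t_min, t_max = np.nanmin(temperature), np.nanmax(temperature)
    return (temperature - t_min) / (t_max - t_min)
```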
step six: calculating a grey-level co-occurrence matrix (GLCM) from the red, green, red-edge and near-infrared bands of the multispectral image and from the canopy height and normalized relative canopy temperature images, and deriving eight texture feature factors: mean (ME), variance (VA), homogeneity (HO), contrast (CO), dissimilarity (DI), entropy (EN), second moment (SE) and correlation (CR);
wherein the texture feature indices are computed from the normalized grey-level co-occurrence matrix P_i,j;
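The patent gives the eight texture formulas as figures, which are not reproduced here. The sketch below computes the same eight factors with scikit-image's GLCM utilities; the quantization level, distance and angle are assumptions, as the patent does not specify them.

```python
# Sketch of the texture step (parameters are assumptions, see text above):
# a grey-level co-occurrence matrix per single-band image, then the eight factors.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(band, levels=32):
    """Eight GLCM texture factors for one single-band image or image patch."""
    bins = np.linspace(band.min(), band.max(), levels)
    q = (np.digitize(band, bins) - 1).astype(np.uint8)     # quantize to `levels` grey levels
    glcm = graycomatrix(q, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                                    # normalized co-occurrence matrix P_ij
    i, _ = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    me = (i * p).sum()                                      # mean (ME)
    va = ((i - me) ** 2 * p).sum()                          # variance (VA)
    en = -(p[p > 0] * np.log(p[p > 0])).sum()               # entropy (EN)
    return {
        "ME": me, "VA": va, "EN": en,
        "HO": graycoprops(glcm, "homogeneity")[0, 0],       # homogeneity (HO)
        "CO": graycoprops(glcm, "contrast")[0, 0],          # contrast (CO)
        "DI": graycoprops(glcm, "dissimilarity")[0, 0],     # dissimilarity (DI)
        "SE": graycoprops(glcm, "ASM")[0, 0],               # second moment (SE)
        "CR": graycoprops(glcm, "correlation")[0, 0],       # correlation (CR)
    }
```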
step seven: using the extracted parameter information and the measured yields of the wheat sample plots as model inputs and modeling with a deep neural network (DNN), with 70% of the samples randomly selected for training and 30% for validation, wherein the extracted multi-source parameters are first fused by the DNN and the fused information is then trained by the DNN to establish the wheat yield prediction model;
the method comprises the steps that a wheat yield prediction model is modeled by using a Deep Neural Network (DNN) based on multi-source parameters and ground actual measurement yield data, wherein the extracted multi-source parameters are firstly fused through the Deep Neural Network (DNN), fused information is trained through the Deep Neural Network (DNN), in a data fusion stage, the training network of spectral information comprises four layers of convolution layers, the number of each layer of neural nodes is 32, 64, 128 and 256, structure and temperature information comprises two layers of convolution layers, the number of each layer of neural nodes is 32 and 64, the network of texture information comprises five layers of convolution layers, the number of each layer of neural nodes is 32, 64, 128, 256 and 512, the multi-source fusion data is obtained after training in a first stage, in a yield prediction stage, the multi-source fusion data serves as input data, the predicted yield serves as an output result, the Deep Neural Network (DNN) comprises three layers of convolution layers, and the number of each layer of neural nodes is 512, 512 and 1024;
step eight: validating the wheat yield prediction model against the measured wheat yield data, evaluating model performance with R² and the root mean square error (RMSE), testing yield differences among wheat varieties with a one-way analysis of variance (ANOVA test), and testing the spatial correlation of the model's prediction errors across the sub-plots with the global Moran's I;
wherein, to test the applicability of the prediction model across different wheat varieties, a one-way analysis of variance (ANOVA test) is used to test the yield differences among the three varieties; in addition, to test the model's ability to handle spatial differences caused by soil, irrigation conditions and other environmental factors, the global Moran's I is used to test the spatial correlation of the model's prediction errors across each sub-plot.
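A sketch of this validation step is shown below; the inverse-distance spatial weights for Moran's I are an assumption, since the patent names the statistic but not the weighting scheme.

```python
# Sketch of the validation metrics: R^2 and RMSE on held-out sub-plots,
# one-way ANOVA across varieties, and global Moran's I of prediction errors
# (inverse-distance weights between sub-plot centroids are an assumption).
import numpy as np
from scipy.stats import f_oneway

def r2_rmse(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot, np.sqrt(ss_res / len(y_true))

def global_morans_i(errors, coords):
    """Global Moran's I of prediction errors at sub-plot centroid coordinates."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.where(d > 0, 1.0 / np.maximum(d, 1e-9), 0.0)   # inverse distance, zero diagonal
    z = errors - errors.mean()
    n = len(errors)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Yield differences among (for example) three varieties, with synthetic data:
rng = np.random.default_rng(0)
f_stat, p_value = f_oneway(*(rng.normal(7.5 + k * 0.2, 0.5, 30) for k in range(3)))
```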
The spectral, structural, textural and temperature information described above all have the weed and soil background removed, and the mean value of each parameter within each sub-plot is used as an input parameter.
While the invention has been described above with reference to an embodiment, various modifications may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In particular, the various features of the disclosed embodiments of this invention can be used in any combination as long as there is no structural conflict, and the combination is not exhaustively described in this specification merely for the sake of brevity and resource savings. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (8)
1. A wheat yield prediction method based on multi-source remote sensing data of an unmanned aerial vehicle is characterized by comprising the following specific steps:
step one: setting up a ground measurement experiment, and sampling and weighing the wheat at maturity to obtain wheat yield data;
step two: acquiring UAV multi-source remote sensing image data during each key growth stage of the wheat, using a multi-rotor UAV remote sensing platform that simultaneously carries a multispectral camera, a thermal imager and two RGB cameras, wherein the multispectral camera acquires multispectral images, the thermal imager acquires thermal infrared images, and the two RGB cameras acquire RGB images from different angles;
step three: extracting spectral information from the multispectral data, performing radiometric calibration on the wheat multispectral images to obtain reflectance data of the wheat canopy, and deriving seventeen vegetation indices from the reflectance data;
step four: extracting structural information from the RGB and multispectral data, generating a digital elevation model (DEM) from RGB images acquired by the UAV before wheat sowing and a digital surface model (DSM) from RGB images acquired after sowing, obtaining canopy height as DSM minus DEM, then using the maximum between-class variance method with threshold segmentation to produce a binary mask of weeds and soil from the multispectral orthomosaic, removing the weeds and soil from the image with the mask, and calculating the vegetation coverage within each plot from the masked image;
step five: extracting temperature information from the thermal infrared data by computing the normalized relative canopy temperature NRCT = (T_i − T_min)/(T_max − T_min) from the thermal infrared image for yield prediction, where T_i is the temperature of the i-th pixel, T_min is the minimum temperature of the entire field and T_max is the maximum temperature of the entire field;
step six: calculating a grey-level co-occurrence matrix from the red, green, red-edge and near-infrared bands of the multispectral image and from the canopy height and normalized relative canopy temperature images, and deriving eight texture feature factors: mean (ME), variance (VA), homogeneity (HO), contrast (CO), dissimilarity (DI), entropy (EN), second moment (SE) and correlation (CR);
step seven: using the extracted parameter information and the measured yields of the wheat sample plots as model inputs and modeling with a deep neural network, with 70% of the samples randomly selected for training and 30% for validation, wherein the extracted multi-source parameters are first fused by the DNN and the fused information is then trained by the DNN to establish the wheat yield prediction model;
step eight: validating the wheat yield prediction model against the measured wheat yield data, evaluating model performance with R² and the root mean square error (RMSE), testing yield differences among wheat varieties with one-way analysis of variance, and testing the spatial correlation of the model's prediction errors across the sub-plots with the global Moran's I.
2. The method for predicting wheat yield based on multi-source remote sensing data of an unmanned aerial vehicle according to claim 1, wherein, in the yield measurement experiment of step one, the planting area is divided into sub-plots of 12.3 m² or 13.8 m², the wheat in each sub-plot is sampled and weighed, and the sub-plots are distributed as evenly as possible across the planting area.
3. The method for predicting the wheat yield based on the multi-source remote sensing data of the unmanned aerial vehicle according to claim 1, wherein the time for the unmanned aerial vehicle to acquire the image data in the second step at least covers a key growth period of the wheat growth, including a tillering period, an elongation period, a booting period, a flowering period, a filling period and a maturation period.
4. The method for predicting wheat yield based on multi-source remote sensing data of an unmanned aerial vehicle according to claim 1, wherein the spectral information in step three comprises the green, red, red-edge and near-infrared band reflectance together with the ratio vegetation index, chlorophyll index, red-edge chlorophyll index, normalized difference vegetation index (NDVI), green normalized difference vegetation index, green-red vegetation index, normalized difference red edge index, simplified canopy chlorophyll content index, enhanced vegetation index, two-band enhanced vegetation index, optimized soil-adjusted vegetation index (OSAVI), modified chlorophyll absorption ratio index (MCARI), transformed chlorophyll absorption ratio index (TCARI), MCARI/OSAVI, TCARI/OSAVI and the wide dynamic range vegetation index, seventeen vegetation indices in total.
5. The method for predicting wheat yield based on multi-source remote sensing data of an unmanned aerial vehicle according to claim 1, wherein the texture feature indices in step six are computed from the normalized grey-level co-occurrence matrix P_i,j.
6. The method according to claim 1, wherein the spectral information, structural information, texture features and temperature information all have the weed and soil background removed, and the mean value of each parameter within each sub-plot is used as an input parameter.
7. The method for predicting wheat yield based on multi-source remote sensing data of an unmanned aerial vehicle according to claim 1, wherein in step seven a wheat yield prediction model is built with a deep neural network from the multi-source parameters and the ground-measured yield data; the extracted multi-source parameters are first fused by the deep neural network and the fused information is then trained by the deep neural network; in the data fusion stage, the training network for the spectral information comprises four convolutional layers with 32, 64, 128 and 256 neural nodes per layer, the network for the structural and temperature information comprises two convolutional layers with 32 and 64 nodes per layer, and the network for the texture information comprises five convolutional layers with 32, 64, 128, 256 and 512 nodes per layer, the multi-source fused data being obtained after this first training stage; in the yield prediction stage, the multi-source fused data is used as the input and the predicted yield as the output, and the deep neural network comprises three convolutional layers with 512, 512 and 1024 neural nodes per layer.
8. The method for predicting wheat yield based on multi-source remote sensing data of an unmanned aerial vehicle according to claim 1, wherein in step eight, to test the applicability of the prediction model across different wheat varieties, a one-way analysis of variance is used to test the yield differences among the three varieties, and, to test the model's ability to handle spatial differences caused by soil, irrigation conditions and other environmental factors, the global Moran's I is used to test the spatial correlation of the model's prediction errors across each sub-plot.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210913918.XA CN115271217A (en) | 2022-08-01 | 2022-08-01 | Wheat yield prediction method based on multi-source remote sensing data of unmanned aerial vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210913918.XA CN115271217A (en) | 2022-08-01 | 2022-08-01 | Wheat yield prediction method based on multi-source remote sensing data of unmanned aerial vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115271217A true CN115271217A (en) | 2022-11-01 |
Family
ID=83746168
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210913918.XA Pending CN115271217A (en) | 2022-08-01 | 2022-08-01 | Wheat yield prediction method based on multi-source remote sensing data of unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115271217A (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115830442A (en) * | 2022-11-11 | 2023-03-21 | 中国科学院空天信息创新研究院 | Machine learning-based remote sensing estimation method and system for wheat tiller density |
CN115830442B (en) * | 2022-11-11 | 2023-08-04 | 中国科学院空天信息创新研究院 | Remote sensing estimation method and system for wheat stem tiller density based on machine learning |
CN116740592A (en) * | 2023-06-16 | 2023-09-12 | 安徽农业大学 | Wheat yield estimation method and device based on unmanned aerial vehicle image |
CN116740592B (en) * | 2023-06-16 | 2024-02-02 | 安徽农业大学 | Wheat yield estimation method and device based on unmanned aerial vehicle image |
CN116482041A (en) * | 2023-06-25 | 2023-07-25 | 武汉大学 | Rice heading period nondestructive rapid identification method and system based on reflection spectrum |
CN116482041B (en) * | 2023-06-25 | 2023-09-05 | 武汉大学 | Rice heading period nondestructive rapid identification method and system based on reflection spectrum |
CN117592604A (en) * | 2023-11-22 | 2024-02-23 | 河北省农林科学院旱作农业研究所 | Unmanned aerial vehicle-mounted remote sensing identification method for water-efficient wheat varieties |
CN117853947A (en) * | 2024-03-06 | 2024-04-09 | 山东同圆数字科技有限公司 | Winter wheat remote sensing image automatic analysis system |
CN117853947B (en) * | 2024-03-06 | 2024-05-10 | 山东同圆数字科技有限公司 | Winter wheat remote sensing image automatic analysis system |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |