CN116563706B - Crop yield estimation method aiming at multi-spectral image reflectivity multi-feature - Google Patents


Info

Publication number
CN116563706B
Authority
CN
China
Prior art keywords
crop
yield
yield estimation
date
reflectivity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310508864.3A
Other languages
Chinese (zh)
Other versions
CN116563706A (en)
Inventor
姚鸿勋
鲍志伟
刘劼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology
Priority to CN202310508864.3A
Publication of CN116563706A
Application granted
Publication of CN116563706B
Legal status: Active
Anticipated expiration

Classifications

    • G06V20/194 - Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
    • G06N3/0442 - Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • G06N3/049 - Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G06N3/08 - Learning methods
    • G06V10/40 - Extraction of image or video features
    • G06V10/761 - Proximity, similarity or dissimilarity measures
    • G06V10/764 - Recognition using classification, e.g. of video objects
    • G06V10/774 - Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V10/82 - Recognition using neural networks
    • Y02A40/10 - Adaptation technologies in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a crop yield estimation method for multiple features of multispectral image reflectance, and relates to the technical field of crop production. It aims to solve the problem that prior-art yield estimation methods cannot fully utilize the information of the original multispectral image, which limits the capacity of the yield estimation model and results in low accuracy of the final yield estimate. The method improves the accuracy of yield estimation.

Description

Crop yield estimation method aiming at multi-spectral image reflectivity multi-feature
Technical Field
The invention relates to the technical field of crop production, in particular to a crop yield estimation method for multiple features of multispectral image reflectance.
Background
With the gradual adoption of unmanned aerial vehicles and multispectral cameras in agriculture, studying the correlation between crop multispectral image data and crop yield, plant height, chlorophyll content, and the like has become a research hotspot. Establishing a relationship between crop multispectral image data and actual crop yield, i.e., estimating yield from crop multispectral images, is a popular research problem.
Most existing yield estimation methods use the mean reflectance of each band of a crop multispectral image combined with a limited set of vegetation index formulas. However, this approach requires manual selection of a vegetation index formula and cannot fully utilize the information in the original multispectral image, which limits the capacity of the yield estimation model and directly results in low accuracy of the final yield estimate.
Disclosure of Invention
The purpose of the invention is to address the problems that prior-art yield estimation methods cannot fully utilize the information of the original multispectral image, which limits the capacity of the yield estimation model and lowers the accuracy of the final yield estimate, by providing a crop yield estimation method for multiple features of multispectral image reflectance.
The technical solution adopted by the invention to solve the above technical problem is as follows:
A crop yield estimation method for multiple features of multispectral image reflectance, comprising the following steps:
Step one: acquiring crop multispectral image data and actual crop yield data, wherein the multispectral image data comprises images of each multispectral band over multiple growth periods or a single growth period of the crop;
Step two: obtaining the reflectance of each band image and extracting features from it, wherein the features comprise at least two of: minimum, mean, variance, standard deviation, maximum, and median;
Step three: training a yield estimation model using the features extracted in step two and the actual crop yield data;
Step four: completing crop yield estimation using the trained yield estimation model.
Further, the specific sub-steps of step three are as follows:
Step 3.1: training the yield estimation model using the features extracted in step two and the actual crop yield data;
Step 3.2: calculating the loss between the estimated yield value output by the yield estimation model and the actual yield value;
Step 3.3: calculating the accumulated loss based on the loss from step 3.2, and updating the parameters of the yield estimation model via a gradient optimization algorithm to obtain the final yield estimation model.
Further, the yield estimation model comprises an LSTM network, a multi-date attention weighting module MDA, a mean pooling layer, and a fully connected layer;
the LSTM network extracts features from the multi-feature reflectance values of the crop on multiple dates and performs implicit mining of multi-vegetation-index features, yielding multi-vegetation-index features for the crop on multiple dates;
the multi-date attention weighting module MDA performs interaction among the multi-vegetation-index features of the multiple dates and applies weighted attention to them;
the mean pooling layer further aggregates the attention-weighted multi-date features;
the fully connected layer estimates yield from the features output by the mean pooling layer.
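The LSTM-to-pooling-to-FC pipeline can be sketched in NumPy; the weights are random stand-ins for a trained network, the MDA module is omitted for brevity, and all dimensions are illustrative assumptions rather than the patented implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(xs, H):
    """Run a single-layer LSTM over the per-date feature vectors xs (T, D)
    and return one hidden state per date, shape (T, H)."""
    T, D = xs.shape
    rng = np.random.default_rng(1)
    W = rng.normal(0, 0.1, (4 * H, D))  # input weights for the i, f, g, o gates
    U = rng.normal(0, 0.1, (4 * H, H))  # recurrent weights
    b = np.zeros(4 * H)
    h, c, hs = np.zeros(H), np.zeros(H), []
    for x in xs:
        i, f, g, o = np.split(W @ x + U @ h + b, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)      # cell state update
        h = o * np.tanh(c)              # per-date hidden feature
        hs.append(h)
    return np.stack(hs)

# 4 dates x 30 reflectance features; 16 hidden units (all illustrative).
xs = np.random.default_rng(2).normal(size=(4, 30))
hs = lstm_forward(xs, H=16)             # implicit multi-date features
pooled = hs.mean(axis=0)                # mean pooling over the date axis
w_fc, b_fc = np.ones(16) / 16, 0.0      # stand-in fully connected head
yield_estimate = float(w_fc @ pooled + b_fc)
print(hs.shape)
```

In the full model the MDA module would reweight `hs` before the mean pooling step.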
Further, the multi-date attention weighting module specifically performs the following steps:
for an input crop multi-date eigenvalue vector x, the module first maps x into three different high-dimensional spaces to learn feature representations, obtaining three crop multi-date eigenvalue representations Q, K, V; the three spaces are the Query space, the Key space, and the Value space;
the representation Q after Query-space mapping and the representation K after Key-space mapping undergo a matrix-multiplication similarity calculation, and an attention weight matrix A is obtained via the softmax function;
finally, the attention weight matrix A is matrix-multiplied with the Value-space representation V to obtain a weighted representation z, and z is combined with the original input vector x through a shortcut connection to obtain the final weighted vector y.
Further, the specific mapping steps are:
the mapping process is Q = Query(x), K = Key(x), V = Value(x), where each of the Query, Key, and Value modules consists of a fully connected layer FC followed by a ReLU activation function.
Further, the attention weight matrix A obtained via the softmax function is expressed as:
A = softmax(Q · K^T)
where K^T denotes the transpose of the matrix K.
Further, the final weighted vector y is expressed as:
y = x + A · V.
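The mapping, softmax weighting, and shortcut connection just described can be sketched in NumPy; the FC weights are random stand-ins and the dimensions are illustrative assumptions:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mda(x, params):
    """Multi-date attention weighting: map x (T dates, D dims) into the
    Query/Key/Value spaces (FC + ReLU each), score date-vs-date similarity
    with softmax(Q K^T), weight V, and add the shortcut connection."""
    Q = relu(x @ params["Wq"])   # Query-space representation
    K = relu(x @ params["Wk"])   # Key-space representation
    V = relu(x @ params["Wv"])   # Value-space representation
    A = softmax(Q @ K.T)         # (T, T) attention weight matrix
    z = A @ V                    # weighted representation
    return x + z, A              # y = x + A*V via the shortcut connection

rng = np.random.default_rng(3)
x = rng.normal(size=(4, 16))     # 4 dates, 16-dim features (illustrative)
params = {k: rng.normal(0, 0.1, (16, 16)) for k in ("Wq", "Wk", "Wv")}
y, A = mda(x, params)
print(y.shape)  # (4, 16)
```

Each row of A holds one date's attention weights over every date, which is how the module models pairwise date-to-date relevance.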
Further, the bands comprise a red band, a blue band, a green band, a red-edge band, and a near-infrared band.
Furthermore, the multispectral image data is obtained by mounting a multispectral camera on an unmanned aerial vehicle.
Further, the feature extraction of the reflectance of each band image is performed with QGIS or ArcGIS.
The beneficial effects of the invention are as follows:
By utilizing multiple features of the reflectance of each band of the multispectral image, the information in the multispectral image data is used more fully and efficiently, and the problems of vegetation index formulas being difficult to select, and of inaccurate estimates caused by an ill-chosen formula, are avoided. The accuracy of yield estimation is thereby improved.
Prior-art methods include a step of manually selecting a vegetation index formula for further feature extraction, and it is difficult to manually select and design a suitable formula for a specific task. The present application instead lets the deep learning model implicitly mine vegetation-index-like combinations internally, which removes the need to select and design a suitable vegetation index and improves the overall efficiency of the estimation method. In addition, manually applying a vegetation index formula causes an irreversible loss of information. For example, with feature values a=1 and b=2, the vegetation index formula k=(a+b)/2 gives k=1.5; if k=1.5 is then used as the input to the subsequent model, information has already been lost, because the original a=1 and b=2 cannot be recovered (a=2 and b=1 would also give k=1.5). An ill-chosen vegetation index formula therefore leads to worse yield estimates. The method of the present application omits this step: the original information is fed into the subsequent model, and vegetation-index-like formulas are implicitly mined by the multi-feature vegetation index encoder inside the model, improving the accuracy of the final estimate.
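The irreversibility argument in the paragraph above can be checked directly (values taken from the text):

```python
a, b = 1, 2
k = (a + b) / 2           # vegetation-index-style aggregation of the features
k_swapped = (2 + 1) / 2   # swapping a and b yields the same index value
print(k, k_swapped)       # 1.5 1.5: the original (a, b) cannot be recovered from k
```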
Prior-art attention methods only consider the crop's multiple dates as a whole, i.e., the multi-date data receives a single attention score from a global perspective. For example, given data for dates A, B, and C, such an attention mechanism only weighs the overall importance of A, B, and C for the final yield estimate. The present application observes that there may be correlations between individual dates, for example a relatively tight relationship between A and C but little or no significant correlation between B and C. We therefore propose the multi-date attention interaction module MDA, which considers the relevance between each date and every other date and performs data interaction between them, improving the accuracy of the final estimate.
Drawings
FIG. 1 is a diagram comparing the present application with the prior art;
FIG. 2 is a flow chart of the present application;
FIG. 3 is a flow chart of a yield estimation model;
fig. 4 is a block diagram of a yield estimation model.
Detailed Description
It should be noted that the various embodiments of the present disclosure may be combined with each other provided they do not conflict.
Embodiment one: referring to fig. 1, the crop yield estimation method for multiple features of multispectral image reflectance according to this embodiment comprises the following steps:
Step one: acquiring crop multispectral image data and actual crop yield data, wherein the multispectral image data comprises images of each multispectral band over multiple growth periods or a single growth period of the crop;
Step two: obtaining the reflectance of each band image and extracting features from it, wherein the features comprise at least two of: minimum, mean, variance, standard deviation, maximum, and median;
Step three: training a yield estimation model using the features extracted in step two and the actual crop yield data;
Step four: completing crop yield estimation using the trained yield estimation model. The overall flow of the present application is shown in fig. 2.
By using multiple features of the crop multispectral image reflectance, the application avoids both the difficulty of selecting a vegetation index formula and the inaccurate estimates caused by an ill-chosen formula, simplifies the overall process, and improves the accuracy of the yield estimate; by using multiple features, the original information contained in the multispectral image is better exploited, solving the problem of insufficient utilization of multispectral image information; the resulting yield estimates are more accurate.
A comparison of the method of the present application with prior-art methods is shown in fig. 1. By utilizing multiple features of the reflectance of each band of the multispectral image, the difficulty of selecting a vegetation index formula and the inaccuracy caused by an ill-chosen formula are avoided, while the information in the multispectral image data is used more fully and efficiently, improving the accuracy of yield estimation and offering a new direction for subsequent crop yield estimation methods.
Crop multispectral images can be obtained in various ways: (1) photographing with a handheld device; (2) mounting a multispectral camera on an unmanned aerial vehicle; (3) mounting a multispectral camera on other equipment.
The extracted feature values include, but are not limited to, the maximum, minimum, mean, median, variance, and standard deviation.
Given the multi-feature reflectance input from the crop multispectral image, the model outputs the final yield estimate.
Embodiment two: this embodiment further elaborates embodiment one; the difference is that the specific sub-steps of step three are as follows:
Step 3.1: training the yield estimation model using the features extracted in step two and the actual crop yield data;
Step 3.2: formulating a loss function and calculating the loss between the estimated yield value output by the yield estimation model and the actual yield value;
Step 3.3: calculating the accumulated loss based on the loss from step 3.2 and updating the parameters of the yield estimation model via a gradient optimization algorithm to obtain the final yield estimation model.
A yield estimation model is constructed whose input is the obtained multi-date multispectral-reflectance multi-feature vector set; the flow of the model is shown in fig. 3 and its structure in fig. 4. The multi-feature vegetation index encoder performs encoding feature extraction and implicit vegetation-index-formula mining on the multi-date multi-feature vector set and fuses features along the time dimension. The attention interaction module performs date-to-date attention-weighted interaction on the mined implicit vegetation index features and outputs enhanced features. The yield regressor further processes the weighted features and predicts the yield.
The constructed model is trained with the labelled actual yield values to obtain the final network model. Specifically: the yield estimation model is trained with the labelled actual yield values; a loss function is formulated and the loss is calculated; the accumulated loss is computed and the model parameters are updated via a gradient optimization algorithm. The loss function includes, but is not limited to, common regression losses such as L1 loss and L2 loss.
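The train/loss/accumulate/update cycle of steps 3.1 to 3.3 can be sketched with a linear stand-in for the yield model and an L2 loss; the synthetic data, learning rate, and epoch count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(32, 30))          # 32 plots x 30 reflectance features
true_w = rng.normal(size=30)
y_true = X @ true_w + 0.01 * rng.normal(size=32)  # labelled actual yields

w, b, lr = np.zeros(30), 0.0, 0.005
for epoch in range(200):
    total_loss = 0.0                    # accumulated loss (step 3.3)
    for xi, yi in zip(X, y_true):
        y_hat = w @ xi + b              # estimated yield (step 3.1)
        loss = (y_hat - yi) ** 2        # L2 loss (step 3.2)
        total_loss += loss
        grad = 2.0 * (y_hat - yi)       # d(loss)/d(y_hat)
        w -= lr * grad * xi             # gradient-descent parameter update
        b -= lr * grad
print(round(total_loss / len(X), 6))    # mean loss of the final epoch
```

In the patented method the linear stand-in would be replaced by the LSTM + MDA network, with a gradient optimizer such as SGD or Adam acting on all network parameters.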
Embodiment three: this embodiment further elaborates embodiment two; the difference is that the yield estimation model comprises an LSTM network, a multi-date attention weighting module MDA, a mean pooling layer, and a fully connected layer;
the LSTM network extracts features from the multi-feature reflectance values of the crop on multiple dates and performs implicit mining of multi-vegetation-index features, yielding multi-vegetation-index features for the crop on multiple dates;
the multi-date attention weighting module MDA performs interaction among the multi-vegetation-index features of the multiple dates and applies weighted attention to them;
the mean pooling layer further aggregates the attention-weighted multi-date features;
the fully connected layer estimates yield from the features output by the mean pooling layer.
Embodiment four: this embodiment further elaborates embodiment three; the difference is that the multi-date attention weighting module specifically performs the following steps:
for an input crop multi-date eigenvalue vector x, the module first maps x into three different high-dimensional spaces to learn feature representations, obtaining three crop multi-date eigenvalue representations Q, K, V; the three spaces are the Query space, the Key space, and the Value space;
the representation Q after Query-space mapping and the representation K after Key-space mapping undergo a matrix-multiplication similarity calculation, and an attention weight matrix A is obtained via the softmax function;
finally, the attention weight matrix A is matrix-multiplied with the Value-space representation V to obtain a weighted representation z, and z is combined with the original input vector x through a shortcut connection to obtain the final weighted vector y.
Embodiment five: this embodiment further elaborates embodiment four; the difference is that the specific mapping steps are:
the mapping process is Q = Query(x), K = Key(x), V = Value(x), where each of the Query, Key, and Value modules internally consists of a fully connected layer FC followed by a ReLU activation function.
Embodiment six: this embodiment further elaborates embodiment five; the difference is that the attention weight matrix A obtained via the softmax function is expressed as:
A = softmax(Q · K^T)
where K^T denotes the transpose of the matrix K.
Embodiment seven: this embodiment further elaborates embodiment six; the difference is that the final weighted vector y is expressed as:
y = x + A · V.
Embodiment eight: this embodiment further elaborates embodiment one; the difference is that the bands comprise a red band, a blue band, a green band, a red-edge band, and a near-infrared band.
Embodiment nine: this embodiment further elaborates embodiment one; the difference is that the multispectral image data is obtained by mounting a multispectral camera on an unmanned aerial vehicle.
Embodiment ten: this embodiment further elaborates embodiment one; the difference is that the feature extraction of the reflectance of each band image is performed with QGIS or ArcGIS.
Multispectral image features are typically extracted using software that provides feature-extraction plugins, such as QGIS or ArcGIS.
It should be noted that the detailed description merely explains and illustrates the technical solution of the present invention and should not limit the scope of protection of the claims. All changes that come within the meaning and range of equivalency of the claims and the specification are to be embraced within their scope.

Claims (9)

1. A crop yield estimation method for multiple features of multispectral image reflectance, comprising the following steps:
Step one: acquiring crop multispectral image data and actual crop yield data, wherein the multispectral image data comprises images of each multispectral band over multiple growth periods or a single growth period of the crop;
Step two: obtaining the reflectance of each band image and extracting features from it, wherein the features comprise at least two of: minimum, mean, variance, standard deviation, maximum, and median;
Step three: training a yield estimation model using the features extracted in step two and the actual crop yield data;
Step four: completing crop yield estimation using the trained yield estimation model;
wherein the yield estimation model comprises an LSTM network, a multi-date attention weighting module MDA, a mean pooling layer, and a fully connected layer;
the LSTM network extracts features from the multi-feature reflectance values of the crop on multiple dates and performs implicit mining of multi-vegetation-index features, yielding multi-vegetation-index features for the crop on multiple dates;
the multi-date attention weighting module MDA performs interaction among the multi-vegetation-index features of the multiple dates and applies weighted attention to them;
the mean pooling layer further aggregates the attention-weighted multi-date features;
the fully connected layer estimates yield from the features output by the mean pooling layer.
2. The crop yield estimation method for multiple features of multispectral image reflectance according to claim 1, wherein the specific sub-steps of step three are as follows:
Step 3.1: training the yield estimation model using the features extracted in step two and the actual crop yield data;
Step 3.2: calculating the loss between the estimated yield value output by the yield estimation model and the actual yield value;
Step 3.3: calculating the accumulated loss based on the loss from step 3.2, and updating the parameters of the yield estimation model via a gradient optimization algorithm to obtain the final yield estimation model.
3. The crop yield estimation method for multiple features of multispectral image reflectance according to claim 1, wherein the multi-date attention weighting module specifically performs the following steps:
for an input crop multi-date eigenvalue vector x, the module first maps x into three different high-dimensional spaces to learn feature representations, obtaining three crop multi-date eigenvalue representations Q, K, V; the three spaces are the Query space, the Key space, and the Value space;
the representation Q after Query-space mapping and the representation K after Key-space mapping undergo a matrix-multiplication similarity calculation, and an attention weight matrix A is obtained via the softmax function;
finally, the attention weight matrix A is matrix-multiplied with the Value-space representation V to obtain a weighted representation z, and z is combined with the original input vector x through a shortcut connection to obtain the final weighted vector y.
4. The crop yield estimation method for multiple features of multispectral image reflectance according to claim 3, wherein the specific mapping steps are:
the mapping process is Q = Query(x), K = Key(x), V = Value(x), where each of the Query, Key, and Value modules internally consists of a fully connected layer FC followed by a ReLU activation function.
5. The method according to claim 4, wherein the attention weight matrix A obtained via the softmax function is expressed as:
A = softmax(Q · K^T)
where K^T denotes the transpose of the matrix K.
6. The crop yield estimation method for multiple features of multispectral image reflectance according to claim 5, wherein the final weighted vector y is expressed as:
y = x + A · V.
7. The method according to claim 1, wherein the bands include a red band, a blue band, a green band, a red-edge band, and a near-infrared band.
8. The crop yield estimation method for multiple features of multispectral image reflectance according to claim 1, wherein the multispectral image data is obtained by mounting a multispectral camera on an unmanned aerial vehicle.
9. The crop yield estimation method for multiple features of multispectral image reflectance according to claim 1, wherein the feature extraction of the reflectance of each band image is performed with QGIS or ArcGIS.
CN202310508864.3A 2023-05-08 2023-05-08 Crop yield estimation method aiming at multi-spectral image reflectivity multi-feature Active CN116563706B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310508864.3A CN116563706B (en) 2023-05-08 2023-05-08 Crop yield estimation method aiming at multi-spectral image reflectivity multi-feature


Publications (2)

Publication Number Publication Date
CN116563706A CN116563706A (en) 2023-08-08
CN116563706B true CN116563706B (en) 2024-05-17

Family

ID=87485549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310508864.3A Active CN116563706B (en) 2023-05-08 2023-05-08 Crop yield estimation method aiming at multi-spectral image reflectivity multi-feature

Country Status (1)

Country Link
CN (1) CN116563706B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017169511A (en) * 2016-03-24 2017-09-28 株式会社日立ソリューションズ東日本 Apparatus for estimating normal stock ratio of agricultural crop, apparatus for predicting yield of agricultural crop, and method for estimating normal stock ratio of agricultural crop
CN110414738A (en) * 2019-08-01 2019-11-05 吉林高分遥感应用研究院有限公司 A kind of crop yield prediction technique and system
CN111798327A (en) * 2020-06-24 2020-10-20 安徽大学 Construction method and application of wheat yield calculation model based on hyperspectral image
CN112345458A (en) * 2020-10-22 2021-02-09 南京农业大学 Wheat yield estimation method based on multispectral image of unmanned aerial vehicle
CN114692991A (en) * 2022-04-18 2022-07-01 浙江大学 Wolfberry yield prediction method and system based on deep learning
CN114782843A (en) * 2022-04-22 2022-07-22 浙江大学 Crop yield prediction method and system based on unmanned aerial vehicle multispectral image fusion
CN114863273A (en) * 2022-04-21 2022-08-05 广东工业大学 Crop early season multisource remote sensing yield data processing method and prediction method
CN114882359A (en) * 2022-05-07 2022-08-09 中国科学院空天信息创新研究院 Soybean planting area extraction method and system based on vegetation index time series spectrum characteristics
CN115222100A (en) * 2022-06-23 2022-10-21 郑州大学 Crop yield prediction method based on three-dimensional cyclic convolution neural network and multi-temporal remote sensing image
CN115578637A (en) * 2022-10-17 2023-01-06 中国科学院空天信息创新研究院 Winter wheat yield estimation analysis method and system based on long-term and short-term memory network
CN115730523A (en) * 2022-11-28 2023-03-03 华中农业大学 Near-real-time prediction method for regional scale crop yield based on deep learning
CN115841615A (en) * 2022-11-24 2023-03-24 浙江领见数智科技有限公司 Tobacco yield prediction method and device based on multispectral data of unmanned aerial vehicle
CN115860269A (en) * 2023-02-20 2023-03-28 南京信息工程大学 Crop yield prediction method based on triple attention mechanism

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9953241B2 (en) * 2014-12-16 2018-04-24 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for satellite image processing to estimate crop yield
CN109459392B (en) * 2018-11-06 2019-06-14 南京农业大学 A kind of rice the upperground part biomass estimating and measuring method based on unmanned plane multispectral image
CN111815014B (en) * 2020-05-18 2023-10-10 浙江大学 Crop yield prediction method and system based on unmanned aerial vehicle low-altitude remote sensing information
CN113554232A (en) * 2021-07-26 2021-10-26 吉林大学 Crop yield prediction method and system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Improving Wheat Yield Prediction Accuracy Using LSTM-RF Framework Based on UAV Thermal Infrared and Multispectral Imagery; Yulin Shen et al.; Agriculture; 2022-06-20; Vol. 12 (No. 6); pp. 1-13 *
Multispectral Crop Yield Prediction Using 3D-Convolutional Neural Networks and Attention Convolutional LSTM Approaches; Seyed Mahdi Mirhoseini Nejad et al.; IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing; 2022-11-21; Vol. 16; pp. 254-266 *
Remote sensing yield estimation of sesame based on LSTM; Chen Yansheng et al.; Journal of Chifeng University; 2021-09-25; Vol. 37 (No. 09); pp. 18-22 *
Comparison of winter wheat yield estimation based on UAV digital images and hyperspectral data; Tao Huilin et al.; Transactions of the Chinese Society of Agricultural Engineering; 2019-12-08; Vol. 35 (No. 23); pp. 111-118 *

Also Published As

Publication number Publication date
CN116563706A (en) 2023-08-08

Similar Documents

Publication Publication Date Title
CN113449680B (en) Knowledge distillation-based multimode small target detection method
Farmonov et al. Crop type classification by DESIS hyperspectral imagery and machine learning algorithms
CN110751019B (en) High-resolution image crop automatic extraction method and device based on deep learning
Sharma et al. Wheat crop yield prediction using deep LSTM model
CN112949414B (en) Intelligent surface water body drawing method for wide-vision-field high-resolution six-satellite image
CN110889394A (en) Rice lodging recognition method based on deep learning UNet network
CN112067129B (en) Hyperspectral processing method and waveband selection method
CN113610905B (en) Deep learning remote sensing image registration method based on sub-image matching and application
CN101667292B (en) SAR image segmentation system and segmentation method based on immune clone and projection pursuit
CN115331104A (en) Crop planting information extraction method based on convolutional neural network
CN116189021B (en) Multi-branch intercrossing attention-enhanced unmanned aerial vehicle multispectral target detection method
CN116343058A (en) Global collaborative fusion-based multispectral and panchromatic satellite image earth surface classification method
Naseer et al. Onion Crop Monitoring with Multispectral Imagery Using Deep Neural Network
Bhadra et al. End-to-end 3D CNN for plot-scale soybean yield prediction using multitemporal UAV-based RGB images
CN116563706B (en) Crop yield estimation method aiming at multi-spectral image reflectivity multi-feature
CN117746244A (en) Litchi male and female flower counting method based on point-to-point network and Bayesian loss
CN112668421A (en) Attention mechanism-based rapid classification method for hyperspectral crops of unmanned aerial vehicle
Treboux et al. Towards retraining of machine learning algorithms: an efficiency analysis applied to smart agriculture
CN116307105A (en) Winter wheat yield prediction method, device and equipment based on multi-mode canopy image
CN114925947B (en) Phenological adaptive crop physiological index deep learning estimation method and system
CN115659836A (en) Unmanned system vision self-positioning method based on end-to-end feature optimization model
CN114863273A (en) Crop early season multisource remote sensing yield data processing method and prediction method
Martini et al. Enhancing navigation benchmarking and perception data generation for row-based crops in simulation
Fan et al. An improved Deeplab based model for extracting cultivated land information from high definition remote sensing images
CN114091774A (en) Crop yield estimation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant