AU2021102735A4 - Cotton aphid stress monitoring method and system based on high-throughput plant phenotyping platform and deep learning - Google Patents


Info

Publication number
AU2021102735A4
AU2021102735A4
Authority
AU
Australia
Prior art keywords
category
images
cotton
order derivative
cotton plant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2021102735A
Inventor
Pan GAO
Jiao Lin
Xin Lv
Wei Xu
Tianying YAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shihezi University
Original Assignee
Shihezi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shihezi University filed Critical Shihezi University
Priority to AU2021102735A priority Critical patent/AU2021102735A4/en
Application granted granted Critical
Publication of AU2021102735A4 publication Critical patent/AU2021102735A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N 21/31 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N 21/314 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry, with comparison of measurements at specific and non-specific wavelengths
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G01N 33/483 Physical analysis of biological material
    • G01N 33/4833 Physical analysis of biological material of solid biological material, e.g. tissue samples, cell cultures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 2333/00 Assays involving biological materials from specific organisms or of a specific nature
    • G01N 2333/415 Assays involving biological materials from specific organisms or of a specific nature from plants
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 2333/00 Assays involving biological materials from specific organisms or of a specific nature
    • G01N 2333/435 Assays involving biological materials from specific organisms or of a specific nature from animals; from humans
    • G01N 2333/43504 Assays involving biological materials from specific organisms or of a specific nature from animals; from humans from invertebrates
    • G01N 2333/43552 Assays involving biological materials from specific organisms or of a specific nature from animals; from humans from invertebrates from insects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/0098 Plants or trees
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/15 Correlation function computation including computation of convolution operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/10036 Multispectral image; Hyperspectral image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Biomedical Technology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Medicinal Chemistry (AREA)
  • Food Science & Technology (AREA)
  • Molecular Biology (AREA)
  • Hematology (AREA)
  • Optics & Photonics (AREA)
  • Urology & Nephrology (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to the technical field of pest stress monitoring, and discloses a cotton aphid stress monitoring method and system based on a high-throughput plant phenotyping platform and deep learning. Hyperspectral images of cotton leaves are obtained by using a hyperspectral imaging system; with single cotton leaves as regions of interest, hyperspectral images of the single leaves are extracted, and average spectra and first-order derivative spectra are then calculated. The spectral information and the deep learning technology are fully utilized to discover the importance of each band by using a visualization technology, and important bands are selected for monitoring and early warning. A three-dimensional convolutional neural network is used to learn the hyperspectral images of single leaves, and hyperspectral images in the range of visible and near-infrared bands are selected, to generate a significance map by using the visualization technology. Cotton leaf damage areas stressed by cotton aphids can thus be found.
17710447_1 (GHMatters) P116302.AU

Description

Sheet 5/7
FIG. 5, FIG. 6 and FIG. 7 (FIG. 7: importance of each band versus wavelength, nm).
COTTON APHID STRESS MONITORING METHOD AND SYSTEM BASED ON HIGH-THROUGHPUT PLANT PHENOTYPING PLATFORM AND DEEP LEARNING

TECHNICAL FIELD

The present disclosure relates to the technical field of pest condition monitoring, and in particular, to a cotton aphid stress monitoring method and system based on a high-throughput plant phenotyping platform and deep learning.

BACKGROUND

High-throughput plant phenotyping platforms and deep learning are already used for pest stress monitoring. However, existing technologies focus only on estimating cotton aphid stress levels, not on the leaf damage areas caused by cotton aphid stress. Moreover, the existing technology for cotton aphid stress monitoring still adopts a conventional band selection method, which requires many experiments to ultimately select an appropriate band range. This is labor-intensive, and it remains difficult to select an appropriate band range.

In summary, a new technology for cotton aphid stress monitoring based on a high-throughput plant phenotyping platform and deep learning is urgently needed in this field, to make full use of the high-throughput plant phenotyping platform and deep learning technologies, discover the importance of each band by using a visualization technology, select important bands for monitoring and early warning, quickly monitor whether cotton plants are stressed by cotton aphids, and improve the timeliness of cotton aphid monitoring and early warning.

SUMMARY

The present disclosure aims to provide a cotton aphid stress monitoring method and system based on a high-throughput plant phenotyping platform and deep learning, to quickly monitor whether cotton plants are stressed by cotton aphids.
To achieve the above objective, the present disclosure provides the following solutions:

A cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning is provided, including:

obtaining first-category cotton plant images and second-category cotton plant images under natural growth conditions, where the first-category cotton plant images include images of cotton plants that are increasingly stressed by cotton aphids, and the second-category cotton plant images include images of healthy cotton plants;

calibrating the first-category cotton plant images and the second-category cotton plant images, to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images;

extracting region-of-interest images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain first-category region-of-interest images and second-category region-of-interest images;

calculating an average spectrum of each band for the first-category region-of-interest images and the second-category region-of-interest images, to obtain first-category average spectra and second-category average spectra;

calculating a first-order derivative spectrum of each band for the first-category average spectra and the second-category average spectra, to obtain first-category first-order derivative spectra and second-category first-order derivative spectra;

training and optimizing a first convolutional neural network by using the first-category first-order derivative spectra and the second-category first-order derivative spectra, to obtain a trained and optimized first convolutional neural network;

performing visualization analysis on the trained and optimized first convolutional neural network by using a visualization analysis technology, and selecting sensitive bands;

training and optimizing the trained and optimized first convolutional neural network by using first-order derivative spectra of the sensitive bands, to obtain a trained and optimized second convolutional neural network, where the second convolutional neural network and the first convolutional neural network are both one-dimensional convolutional neural networks;

obtaining first-order derivative spectra of to-be-monitored cotton plant images; and

inputting the first-order derivative spectra of the to-be-monitored cotton plant images into the trained and optimized second convolutional neural network, and classifying the to-be-monitored cotton plant images, to obtain categories to which the to-be-monitored cotton plant images belong, where the categories include a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids.
The present disclosure further provides the following solutions:

A cotton aphid monitoring system based on a high-throughput plant phenotyping platform and deep learning, including:

a first obtaining module, configured to obtain first-category cotton plant images and second-category cotton plant images under natural growth conditions, where the first-category cotton plant images include images of cotton plants that are increasingly stressed by cotton aphids, and the second-category cotton plant images include images of healthy cotton plants;

a calibration module, configured to calibrate the first-category cotton plant images and the second-category cotton plant images, to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images;

a region-of-interest extraction module, configured to extract region-of-interest images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain first-category region-of-interest images and second-category region-of-interest images;

an average spectra calculation module, configured to calculate an average spectrum of each band for the first-category region-of-interest images and the second-category region-of-interest images, to obtain first-category average spectra and second-category average spectra;

a first-order derivative spectra calculation module, configured to calculate a first-order derivative spectrum of each band for the first-category average spectra and the second-category average spectra, to obtain first-category first-order derivative spectra and second-category first-order derivative spectra;

a first training and optimization module, configured to train and optimize a first convolutional neural network by using the first-category first-order derivative spectra and the second-category first-order derivative spectra, to obtain a trained and optimized first convolutional neural network;

a visualization analysis module, configured to perform visualization analysis on the trained and optimized first convolutional neural network by using a visualization analysis technology, and select sensitive bands;

a second training and optimization module, configured to train and optimize the trained and optimized first convolutional neural network by using first-order derivative spectra of the sensitive bands, to obtain a trained and optimized second convolutional neural network, where the second convolutional neural network and the first convolutional neural network are both one-dimensional convolutional neural networks;

a second obtaining module, configured to obtain first-order derivative spectra of to-be-monitored cotton plant images; and

a category prediction module, configured to input the first-order derivative spectra of the to-be-monitored cotton plant images into the trained and optimized second convolutional neural network, and classify the to-be-monitored cotton plant images, to obtain categories to which the to-be-monitored cotton plant images belong, where the categories include a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids.

According to embodiments of the present disclosure, the present disclosure has the following technical effects:

According to the cotton aphid stress monitoring method and system based on a high-throughput plant phenotyping platform and deep learning disclosed in the present disclosure, hyperspectral images of cotton leaves are obtained by using the high-throughput plant phenotyping platform; with single cotton leaves as regions of interest, hyperspectral images of the single leaves are extracted, and average spectra and first-order derivative spectra are then calculated. The spectral information and deep learning technology are fully utilized to discover the importance of each band by using a visualization technology and select important bands for monitoring and early warning.
The band range selection method based on deep learning can discover sensitive bands. A cotton aphid stress monitoring model can be established after the sensitive bands are selected, which can quickly monitor whether cotton plants are stressed by cotton aphids and improve the timeliness of cotton aphid monitoring and early warning. The selection of suitable bands can also reduce the investment costs of spectral imaging equipment.

In addition, a three-dimensional convolutional neural network is used to learn the hyperspectral images of single leaves, and hyperspectral images in the range of visible and near-infrared bands are selected, to generate a significance map by using the visualization technology. Cotton leaf damage areas stressed by cotton aphids can be found, to serve as an aid in determining cotton aphid infestation and as a basis for classifying cotton aphid damage levels.

BRIEF DESCRIPTION OF THE DRAWINGS

To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the accompanying drawings needed in the embodiments are introduced below briefly. Apparently, the accompanying drawings in the following description show merely some embodiments of the present disclosure, and other drawings can be derived from these accompanying drawings by those of ordinary skill in the art without creative efforts.

FIG. 1 is a flowchart of an embodiment of a cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning according to the present disclosure;

FIG. 2 is a schematic structural diagram of a one-dimensional convolutional neural network established according to the present disclosure;

FIG. 3 is a schematic structural diagram of a three-dimensional convolutional neural network established according to the present disclosure;

FIG. 4 is a technical flowchart of the overall technical solution of the present disclosure;

FIG. 5 is an image of a cotton leaf not stressed by cotton aphids according to the present disclosure;

FIG. 6 is an image of a cotton leaf stressed by cotton aphids according to the present disclosure;

FIG. 7 is a graph showing the importance of each band according to the present disclosure;

FIG. 8 is a graph showing marked cotton leaf damage areas according to the present disclosure;

FIG. 9 is another graph showing marked cotton leaf damage areas according to the present disclosure; and

FIG. 10 is a structural diagram of an embodiment of a cotton aphid monitoring system based on a high-throughput plant phenotyping platform and deep learning according to the present disclosure.

DETAILED DESCRIPTION

The technical solutions of the embodiments of the present disclosure are clearly and completely described below with reference to the accompanying drawings. Apparently, the described embodiments are merely some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative efforts shall fall within the protection scope of the present disclosure.

An objective of the present disclosure is to provide a cotton aphid stress monitoring method and system based on a high-throughput plant phenotyping platform and deep learning, to quickly monitor whether cotton plants are stressed by cotton aphids. To make the objectives, features, and advantages of the present disclosure more obvious and comprehensible, the present disclosure is further described in detail below with reference to the accompanying drawings and specific implementations.

FIG. 1 is a flowchart of an embodiment of a cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning according to the present disclosure. Referring to FIG.
1, the cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning includes the following steps:

Step 101: obtain first-category cotton plant images and second-category cotton plant images under natural growth conditions, where the first-category cotton plant images include images of cotton plants that are increasingly stressed by cotton aphids, and the second-category cotton plant images include images of healthy cotton plants.

Step 101 specifically includes the following operations:

Hyperspectral images (376-1044 nm) of different cotton aphid damage levels were acquired using a SOC 710VP hyperspectral imager (Surface Optics Corporation, San Diego, CA, USA). Specifically, the images were captured in a fixed period of time, to reduce interference from biochemical factors such as the open state of cotton leaf stomata. Cotton plants increasingly stressed by cotton aphids and healthy cotton plants under natural growth conditions in a certain stage (such as the bud stage) were photographed by using the hyperspectral imaging system, to obtain images of cotton plants that are increasingly stressed by cotton aphids and images of healthy cotton plants.

Step 102: calibrate the first-category cotton plant images and the second-category cotton plant images, to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images.

Step 102 specifically includes the following operation:

The first-category cotton plant images and the second-category cotton plant images are calibrated by using the calibration formula

I_c = (I_r - I_d) / (I_w - I_d),

to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images. In the formula, I_c represents the calibrated cotton aphid images of each category, I_r represents the original cotton aphid images of each category, I_d represents the dark reference cotton aphid images of each category, and I_w represents the white reference cotton aphid images of each category. The calibrated first-category cotton plant images and the calibrated second-category cotton plant images are all hyperspectral images.

Step 103: extract region-of-interest images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain first-category region-of-interest images and second-category region-of-interest images.

Step 103 specifically includes the following operations:

Images corresponding to a wavelength range of 461-994 nm are captured from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain effective images. This operation specifically includes: removing front and rear noise wavelength images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, where the effective wavelength range of the new hyperspectral images is 461-994 nm.

Images corresponding to wavelengths of 461 nm, 548 nm, and 698 nm are selected from the effective images to serve as to-be-synthesized images. Among the effective wavelengths of the hyperspectral images taken by the SOC 710VP hyperspectral imager, the wavelengths closest to the RGB primaries determined by the International Commission on Illumination are 436 nm, 546 nm and 700 nm. Therefore, images corresponding to the wavelengths of 461 nm, 548 nm and 698 nm in the effective images are selected as the to-be-synthesized images.
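The Step 102 calibration formula can be sketched in NumPy. This is a minimal illustration, not the authors' code; the cube shape (rows, cols, bands) and the zero-denominator guard are assumptions:

```python
import numpy as np

def calibrate(raw, dark, white):
    """Reflectance calibration I_c = (I_r - I_d) / (I_w - I_d).

    raw, dark, white: hyperspectral cubes of identical shape,
    e.g. (rows, cols, bands). Returns the calibrated cube I_c.
    """
    raw = np.asarray(raw, dtype=np.float64)
    dark = np.asarray(dark, dtype=np.float64)
    white = np.asarray(white, dtype=np.float64)
    denom = white - dark
    # Guard against zero denominators (e.g. dead pixels); an assumption,
    # the disclosure does not specify how such pixels are handled.
    denom[denom == 0] = np.finfo(np.float64).eps
    return (raw - dark) / denom
```

For example, a raw value halfway between the dark and white references calibrates to a reflectance of 0.5.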
The to-be-synthesized images are synthesized into RGB images, to obtain first-category RGB images and second-category RGB images, where the first-category RGB images are RGB images of the first-category cotton plant images, and the second-category RGB images are RGB images of the second-category cotton plant images.

Mask regions in the first-category RGB images and the second-category RGB images are calculated by using a 2G-R-B segmentation algorithm, single-leaf regions are sequentially extracted from the effective images according to the mask regions, and the extracted single-leaf regions are used as region-of-interest images, to obtain the first-category region-of-interest images and the second-category region-of-interest images.

After the operation of capturing images corresponding to a wavelength range of 461-994 nm from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain effective images, the method further includes the following operations:

A three-dimensional convolutional neural network is trained and optimized by using the effective images, to obtain a trained and optimized three-dimensional convolutional neural network. A significance map is generated based on the trained and optimized three-dimensional convolutional neural network by using the visualization analysis technology, and cotton leaf damage areas are marked, so that the cotton leaf damage areas are visualized.

Step 104: calculate an average spectrum of each band for the first-category region-of-interest images and the second-category region-of-interest images, to obtain first-category average spectra and second-category average spectra.

Step 104 specifically includes the following operation:
The average spectrum of the k-th band of each category is calculated according to the formula

B_k = mean(A_(i,j,k) for all (i, j) ∈ O).

In the formula, B_k represents the average spectrum of the k-th band of each category; O represents the single-leaf regions in the region-of-interest images of each category; i and j represent coordinate values of the region-of-interest images of each category, where i represents horizontal coordinate values and j represents vertical coordinate values; k represents the k-th band; and A_(i,j,k) represents the value at coordinates (i, j) of the region-of-interest images of each category in the k-th band. The average spectrum is an average spectrum of different cotton aphid damage levels.

Step 105: calculate a first-order derivative spectrum of each band for the first-category average spectra and the second-category average spectra, to obtain first-category first-order derivative spectra and second-category first-order derivative spectra.

Step 105 specifically includes the following operation:

The first-order derivative spectra of each category are calculated according to the formula

B'_k = (B_(k+1) - B_(k-1)) / (X_(k+1) - X_(k-1)).

In the formula, X_(k-1) represents the wavelength corresponding to the (k-1)-th band; X_(k+1) represents the wavelength corresponding to the (k+1)-th band; B'_k represents the first-order derivative spectrum of the k-th band of each category; B_(k+1) represents the average spectrum of the (k+1)-th band of each category; and B_(k-1) represents the average spectrum of the (k-1)-th band of each category. The first-order derivative spectra can effectively reduce the impact of the background, and may be calculated from the average spectra.

Step 106: train and optimize a first convolutional neural network by using the first-category first-order derivative spectra and the second-category first-order derivative spectra, to obtain a trained and optimized first convolutional neural network.

Step 106 specifically includes the following operations:

The first-category first-order derivative spectra and the second-category first-order derivative spectra are separately divided according to a ratio of 3:1:1, to obtain a training set, a validation set, and a test set, where the training set, the validation set, and the test set all include both first-category and second-category first-order derivative spectra. Specifically, region-of-interest images of cotton leaves not stressed by cotton aphids and cotton leaves stressed by cotton aphids that are captured in each photographing period are divided into a training set, a validation set, and a test set according to a ratio of 3:1:1, covering two categories: cotton leaves not stressed by cotton aphids and cotton leaves stressed by cotton aphids.

The first convolutional neural network is trained and optimized by using the training set and the validation set, to obtain the trained and optimized first convolutional neural network.
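The spectral feature extraction of Steps 103 to 105 (2G-R-B leaf masking, per-band averaging over the leaf region O, and the central-difference first-order derivative) can be sketched as follows. This is an illustrative NumPy reading of the formulas above, not the authors' implementation; the threshold of 0 for the 2G-R-B index is an assumption:

```python
import numpy as np

def leaf_mask(rgb, threshold=0.0):
    """2G-R-B (excess green) segmentation; rgb is (rows, cols, 3)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (2.0 * g - r - b) > threshold

def band_means(cube, mask):
    """B_k = mean(A_(i,j,k)) over leaf pixels (i, j) in O, for every band k.
    cube: (rows, cols, bands); mask: boolean (rows, cols)."""
    return cube[mask].mean(axis=0)

def first_derivative(spectrum, wavelengths):
    """B'_k = (B_(k+1) - B_(k-1)) / (X_(k+1) - X_(k-1)) (central differences)."""
    s = np.asarray(spectrum, dtype=np.float64)
    x = np.asarray(wavelengths, dtype=np.float64)
    return (s[2:] - s[:-2]) / (x[2:] - x[:-2])
```

Note that the central-difference derivative drops the first and last bands, so a spectrum of n bands yields n - 2 derivative values.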
The first convolutional neural network is a one-dimensional convolutional neural network whose model relies on the MXNet deep learning library, and the activation function of the activation layer is set to 'relu'. The one-dimensional convolutional neural network consists of four modules. The first module consists of a one-dimensional convolutional layer (Conv1D), a linear rectification function (ReLU), a batch normalization layer (Batch Normalization), and a one-dimensional maximum pooling layer (MaxPool1D). The second module consists of a flattening layer (Flatten). The third module consists of a fully-connected layer (Dense) and a random deactivation layer (Dropout). The fourth module consists of a fully-connected layer and an excitation function layer (Softmax).

The learning strategy is 'adam', the initial learning rate is 0.1, and the learning rate is gradually adjusted to 0.01 during the training process. The loss function is Softmax Cross-Entropy Loss, and the structure of the constructed network is schematically shown in FIG. 2. In FIG. 2, Data is the input data, that is, the first-order derivative spectra. After the calculation in each network layer, eigenvalues are outputted, and the last eigenvalue is Score, which is a prediction score value. The size of Score is equal to the number of categories of the first-order derivative spectra. After a first-order derivative spectrum P_0 of category c in a given prediction set is correctly classified by the one-dimensional convolutional neural network, a prediction score value Score is obtained, and category c corresponds to the value S_c in Score.

Step 107: perform visualization analysis on the trained and optimized first convolutional neural network by using a visualization analysis technology, and select sensitive bands.
Step 107 specifically includes the following operations: The test set is inputted into the trained and optimizedfirst convolutional neural network for classification, to obtain prediction categories, where the prediction categories include: a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids. It is determined whether the prediction category is consistent with an actual category corresponding to the first-order derivative spectrum. If yes, the first-order derivative spectrum is recorded, and a probability value of the category to which the first-order derivative spectrum belongs is obtained. The probability value is a probability value that the category to which the first-order derivative spectrum belongs is the category of not being stressed by cotton aphids or a probability value that the category to which thefirst-order derivative spectrum belongs is the category of being stressed by cotton aphids. Otherwise, the first-order derivative spectrum is not recorded. Gradient inversion calculation is performed on each recorded probability value according to a
formula wn = abs(∂Sc/∂OP |OP0), to obtain a gradient value of each band in each recorded first-order derivative spectrum. In the formula, wn represents a gradient value of each band in the recorded n-th first-order derivative spectrum, and wn is the absolute value of the derivative of the score Sc (that is, the probability value) with respect to the first-order derivative spectrum OP0, namely, the gradient; the size of wn is consistent with the number of bands, and the values of wn correspond to the importance of the respective bands in a one-to-one manner; n indexes the recorded first-order derivative spectra; Sc represents the probability value; OP0 represents the first-order derivative spectrum, and OP represents each element in OP0, that is, each band of the first-order derivative spectrum. An L1 distance is calculated separately for the gradient value of each band (namely, the gradient value of each sample) in each recorded first-order derivative spectrum, and all the L1-normalized gradients are summed and then a mean value is calculated, to obtain the importance of each band. A specific formula is w* = mean(wn / ||wn||1), n ∈ (1, 2, …, S). In the formula, S is the number of the recorded first-order derivative spectra (that is, the number of correctly classified samples); ||·||1 represents the L1 distance, w* represents the importance of each band, and wn represents the gradient of the n-th correctly classified sample (that is, the gradient value of each band in the recorded n-th first-order derivative spectrum). A line graph is drawn according to the importance of each band, and a peak in the line graph is selected as the sensitive band. That is, a line graph is drawn according to w*, and an extreme value in w*, namely, a peak in the line graph, is selected as the sensitive band.
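As a concrete illustration of the two formulas above, the following NumPy sketch computes the band importance w* from a matrix of per-sample gradients and then picks the interior peaks of the resulting line graph as sensitive bands. The gradient values and array shapes here are hypothetical; in practice the gradients would come from back-propagating the score Sc through the trained one-dimensional convolutional neural network.

```python
import numpy as np

def band_importance(gradients):
    """Average L1-normalized absolute gradients over correctly
    classified samples to score each band's importance.

    gradients: array of shape (S, n_bands), one row per correctly
    classified sample, holding dSc/dOP values.
    """
    w = np.abs(gradients)                      # wn = abs(dSc/dOP)
    w_norm = w / w.sum(axis=1, keepdims=True)  # divide each wn by its L1 norm
    return w_norm.mean(axis=0)                 # w*: importance per band

def pick_sensitive_bands(importance):
    """Select interior local maxima, i.e. the peaks of the line graph."""
    left = importance[1:-1] > importance[:-2]
    right = importance[1:-1] > importance[2:]
    return np.where(left & right)[0] + 1

# Hypothetical gradients for S=2 samples over 5 bands
grads = np.array([[0.1, 0.5, 0.2, 0.8, 0.1],
                  [0.2, 0.6, 0.1, 0.7, 0.2]])
w_star = band_importance(grads)
print(pick_sensitive_bands(w_star))  # peaks at bands 1 and 3
```

A real spectrum would have hundreds of bands; a peak-finding routine with a prominence threshold (e.g. `scipy.signal.find_peaks`) may be preferable to the bare local-maximum test used here.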
In the foregoing band range selection method based on a high-throughput plant phenotyping platform and deep learning, a training set of first-order derivative spectra is selected, a one-dimensional convolutional neural network is used to learn data patterns, model parameters are optimized by using a validation set, a model of the one-dimensional convolutional neural network that achieves the highest prediction accuracy with respect to the validation set is used as an optimal model, a performance effect of the optimal model is evaluated by using a test set, and the one-dimensional convolutional neural network is visualized using a visualization technology, to select sensitive bands. The specific process of visualization and selection of sensitive bands is as follows: after a first-order derivative spectrum OP0 of category c (that is, a category of not being stressed or stressed by cotton aphids) in a given prediction set is correctly classified by the one-dimensional convolutional neural network, a prediction score value Sc (i.e., the probability value) is obtained. Gradient inversion calculation is performed on samples in the test set that are correctly classified by the one-dimensional convolutional neural network, to obtain a gradient value corresponding to each band, and the gradient value reflects the importance of the band. Step 108: train and optimize the trained and optimized first convolutional neural network by using first-order derivative spectra of the sensitive bands, to obtain a trained and optimized second convolutional neural network, where the second convolutional neural network and the first convolutional neural network are both one-dimensional convolutional neural networks.
Step 108 specifically includes the following operations: The selected sensitive bands are extracted from the training set, validation set, and test set of the average spectra, to form a new training set, validation set, and test set; the same one-dimensional convolutional neural network is used to learn the new training set, the new validation set is used to optimize model parameters, and a model of the one-dimensional convolutional neural network that achieves the highest prediction accuracy with respect to the new validation set is used as a new optimal model; the performance of the sensitive band selection method based on deep learning is evaluated using the prediction accuracy of the new model with respect to the new test set; finally, the model is used as a cotton aphid stress monitoring model to quickly monitor whether cotton plants are stressed by cotton aphids. Step 109: obtain first-order derivative spectra of to-be-monitored cotton plant images. Step 109 specifically includes the following operations: To-be-monitored cotton plant images are obtained. The to-be-monitored cotton plant images are calibrated, to obtain calibrated to-be-monitored cotton plant images. Region-of-interest images are extracted from the calibrated to-be-monitored cotton plant images, to obtain region-of-interest images of the to-be-monitored cotton plant images. An average spectrum of each band is calculated for the region-of-interest images of the to-be-monitored cotton plant images, to obtain average spectra of the to-be-monitored cotton plant images. A first-order derivative spectrum of each band is calculated for the average spectra, to obtain the first-order derivative spectra of the to-be-monitored cotton plant images.
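The preprocessing chain of Step 109 (calibration, masked average spectrum, first-order derivative) can be sketched in NumPy as follows. The calibration formula Ic = (Ir - Id)/(Iw - Id), the per-band mean over the single-leaf region, and the central-difference derivative follow the formulas given elsewhere in this disclosure; the image size, band count, mask, and wavelengths are made-up example values.

```python
import numpy as np

def calibrate(raw, dark, white):
    """Reflectance calibration: Ic = (Ir - Id) / (Iw - Id)."""
    return (raw - dark) / (white - dark)

def average_spectrum(cube, mask):
    """Mean of each band over the single-leaf mask O:
    Bk = mean(A[i, j, k] for all (i, j) in O)."""
    return cube[mask].mean(axis=0)             # shape: (n_bands,)

def first_derivative(avg, wavelengths):
    """Central difference: B'k = (B[k+1] - B[k-1]) / (x[k+1] - x[k-1])."""
    return (avg[2:] - avg[:-2]) / (wavelengths[2:] - wavelengths[:-2])

# Tiny synthetic example: 4 x 4 image, 5 bands
cube = np.random.rand(4, 4, 5)
dark = np.zeros_like(cube)
white = np.ones_like(cube)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                          # pretend single-leaf region
wl = np.array([461.0, 471.0, 481.0, 491.0, 501.0])

avg = average_spectrum(calibrate(cube, dark, white), mask)
deriv = first_derivative(avg, wl)
print(avg.shape, deriv.shape)                  # (5,) (3,)
```

Note that the central difference drops the first and last bands, so the derivative spectrum is two bands shorter than the average spectrum.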
Step 110: input the first-order derivative spectra of the to-be-monitored cotton plant images into the trained and optimized second convolutional neural network, and classify the to-be-monitored cotton plant images, to obtain categories to which the to-be-monitored cotton plant images belong, where the categories include a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids. Further, in step 103, the effective images are obtained by capturing images corresponding to a wavelength range of 461-994 nm in the calibrated first-category cotton plant images and the calibrated second-category cotton plant images. The subsequent steps of "training and optimizing the three-dimensional convolutional neural network by using the effective images, to obtain a trained and optimized three-dimensional convolutional neural network" and "generating a significance map based on the trained and optimized three-dimensional convolutional neural network by using the visualization analysis technology, and marking cotton leaf damage areas, so that the cotton leaf damage areas are visualized" constitute the technology for visualizing cotton leaf damage areas based on a high-throughput plant phenotyping platform and deep learning, so as to find cotton leaf damage areas stressed by cotton aphids.
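Capturing the effective images amounts to keeping only the bands whose wavelengths fall within 461-994 nm. A minimal NumPy sketch follows; the cube layout (height x width x bands) and the wavelength list are assumptions for illustration, not something the disclosure specifies.

```python
import numpy as np

def effective_images(cube, wavelengths, lo=461.0, hi=994.0):
    """Keep only the bands whose wavelengths lie in [lo, hi] nm."""
    keep = (wavelengths >= lo) & (wavelengths <= hi)
    return cube[:, :, keep], wavelengths[keep]

# Hypothetical 100 x 100 scene with 6 bands
cube = np.random.rand(100, 100, 6)
wl = np.array([430.0, 461.0, 548.0, 698.0, 994.0, 1010.0])
eff, eff_wl = effective_images(cube, wl)
print(eff.shape)        # (100, 100, 4)
print(eff_wl)           # [461. 548. 698. 994.]
```

The same boolean-mask selection also yields the 461 nm, 548 nm, and 698 nm slices used later to synthesize RGB images.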
In the technology for visualizing cotton leaf damage areas based on a high-throughput plant phenotyping platform and deep learning, a three-dimensional convolutional neural network is used to learn a defined training set of hyperspectral images, model parameters are optimized by using a validation set, and a model of deep learning that achieves the highest prediction accuracy with respect to the validation set is saved as an optimal model; the performance of the optimal model is evaluated by using a test set, and it is determined whether cotton leaves are infested by cotton aphids. The model with the optimal performance is selected, and a significance map (i.e., saliency map) is generated using the visualization technology, so as to mark cotton leaf damage areas. The three-dimensional convolutional neural network model relies on the MXNet deep learning library, and an activation function of an activation layer is set to 'relu'. A three-dimensional convolutional neural network consists of four modules. The first module consists of a three-dimensional convolutional layer (Conv3D), a linear rectification function (ReLU), a batch normalization layer (Batch Normalization) and a three-dimensional maximum pooling layer (MaxPool3D). The second module consists of a flattening layer (Flatten). The third module consists of a fully-connected layer (Dense) and a random deactivation layer (Dropout). The fourth module consists of a fully-connected layer and an excitation function layer (Softmax). A learning strategy is 'adam', an initial learning rate is 0.1, and the learning rate is gradually adjusted to 0.01 during the training process. A loss function is Softmax CrossEntropy Loss, and the structure of the constructed network is schematically shown in FIG. 3. In FIG. 3, Data is input data, that is, hyperspectral images containing only single leaves (461-994 nm).
After the calculation in each network layer, eigenvalues are outputted, and the last eigenvalue is Score, which is a prediction score value. "+" indicates that the eigenvalues are summed point by point. The size of Score is equal to the number of categories of the hyperspectral images containing only single leaves (461-994 nm). After a hyperspectral image HS0 containing only a single leaf (461-994 nm) of category c in a given prediction set is correctly classified by the three-dimensional convolutional neural network, a prediction score value Score is obtained, and the category c corresponds to the value Sc in Score. The process of the visualization technology of the three-dimensional convolutional neural network specifically includes the following operations: 1. After a hyperspectral image HS0 containing only a single leaf (461-994 nm) of category c in a given prediction set is correctly classified by the three-dimensional convolutional neural network, a prediction score value Sc is obtained, and gradient inversion calculation is performed on the prediction score value. A formula for gradient inversion calculation is as follows: W = abs(∂Sc/∂HS |HS0). In the formula, W represents the absolute value of the derivative of the score Sc with respect to the hyperspectral image (461-994 nm) HS0 containing only a single leaf, namely, the gradient. The size of W is consistent with the hyperspectral image (461-994 nm) HS0 containing only a single leaf, and W contains three parameters K, I, and J. K represents the number of bands in the hyperspectral image (461-994 nm) containing only a single leaf, I represents the image length, J represents the image width, and HS represents each element in HS0, that is, any pixel point of any band in the hyperspectral image. 2. Calculate a significance map by using the visualization technology, where a calculation formula is as follows: map(i ∈ (1,…,I), j ∈ (1,…,J)) = max(W(i, j, k ∈ (1,…,K)))
In the formula, the width of the significance map is J and the length thereof is I. A heat map is drawn according to the significance map. A schematic diagram of an overall technical solution of this embodiment is shown in FIG. 4. In this embodiment, by using the cotton aphid monitoring technology based on a high-throughput plant phenotyping platform and deep learning, cotton plants stressed by cotton aphids can be quickly monitored. Through the combination of the deep learning technology, image processing technology and spectral imaging technology, the cotton aphid monitoring technology and the band range selection method based on a high-throughput plant phenotyping platform and deep learning are proposed. With the band range selection method based on deep learning, sensitive bands can be discovered, and a cotton aphid stress monitoring model can be established after the sensitive bands are selected. This improves the timeliness of cotton aphid monitoring and early warning, and the selection of suitable bands can reduce the investment costs of spectral imaging equipment in the high-throughput plant phenotyping platform. In addition, the three-dimensional convolutional neural network is used to learn hyperspectral images of single leaves; a technology for visualizing cotton leaf damage areas based on a high-throughput plant phenotyping platform and deep learning is used, and cotton leaf damage areas are marked by using the visualization technology, so that cotton leaf damage areas stressed by cotton aphids can be detected. This can be used in combination with a cotton aphid stress monitoring model, where hyperspectral images of single leaves are extracted to calculate average spectra and first-order derivative spectra, and then cotton plants stressed by cotton aphids are monitored using the first-order derivative spectra. 
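The reduction from the gradient cube W (with band, length, and width axes K, I, J) to the two-dimensional significance map can be sketched as follows. Taking the maximum absolute gradient over the band axis follows common saliency-map practice; the exact aggregation and the cube dimensions here are assumptions for illustration.

```python
import numpy as np

def significance_map(W):
    """Collapse a K x I x J gradient cube into an I x J map by taking,
    at each pixel (i, j), the maximum absolute gradient over bands k."""
    return np.abs(W).max(axis=0)

# Hypothetical gradient cube: K=10 bands, 6 x 8 leaf image
W = np.random.randn(10, 6, 8)
smap = significance_map(W)
print(smap.shape)       # (6, 8)
# Rendering smap as a heat map marks the cotton leaf damage areas.
```

High values in the map indicate pixels whose reflectance most strongly influences the network's prediction score, which is why they coincide with damage areas.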
Meanwhile, the hyperspectral images of single leaves and the technology for visualizing cotton leaf damage areas are used to discover cotton leaf damage areas stressed by cotton aphids, to serve as an aid in determining cotton aphid stress and as a basis for classifying cotton aphid damage levels. As a substitute for manual judgment of cotton aphid stress, the cotton aphid stress monitoring technology and the technology for visualization of cotton leaf damage areas based on a high-throughput plant phenotyping platform and deep learning provide a technical guarantee for rapid monitoring and early warning of cotton aphids. Moreover, this technology breaks with the traditional method of passive cotton aphid control, and is not only economical and scientific, but also achieves the purpose of precise control and greatly reduces the waste of human resources, pesticides, and the like. Using the foregoing "cotton aphid monitoring and technology for visualization of cotton leaf damage areas based on a high-throughput plant phenotyping platform and deep learning", the technology was implemented in the cotton seedling stage. A 20-day pilot trial was conducted from December 5 to December 25, 2019, in the greenhouse in North District 2 of Shihezi University, where half of the cotton plants were not under pest control, and the other half of the cotton plants were under pest control and in a healthy state during cotton production. On this basis, the effectiveness of the cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning according to this embodiment is verified. The process specifically includes: Acquisition of hyperspectral images of different cotton aphid infestation levels: cotton plants increasingly stressed by cotton aphids and healthy cotton plants were photographed every 5 days using a hyperspectral imaging system, and regions of interest were extracted to obtain single-cotton-leaf samples.
To reduce the interference of biochemical factors such as the open state of cotton leaf stomata, the photographing time was fixed at around 14:00 (Beijing time, China). FIG. 5 and FIG. 6 show some of the acquisition results, where FIG. 5 shows an image of a cotton leaf not stressed by cotton aphids, and FIG. 6 shows an image of a cotton leaf stressed by cotton aphids. Both FIG. 5 and FIG. 6 are hyperspectral images in the band with a wavelength of 740 nm. Acquisition of average spectra and first-order derivative spectra of different cotton aphid damage levels: with single leaves as regions of interest, average spectra were calculated, and first-order derivative spectra were further calculated. Cotton aphid monitoring technology and band range selection method based on a high-throughput plant phenotyping platform and deep learning: first-order derivative spectra in the range of 461-994 nm were selected, and a one-dimensional convolutional neural network was used to learn data patterns, with prediction accuracy up to 0.98. The one-dimensional convolutional neural network was visualized by using the visualization technology, and sensitive bands were selected to discover the importance of each band in the monitoring process. Important bands were selected to establish a cotton aphid stress monitoring model. The importance result of each band is shown in FIG. 7. The bands at the peaks were selected as the sensitive bands, and the sensitive bands were used to establish the cotton aphid stress monitoring model, with the prediction accuracy up to 0.96. Cotton aphid monitoring and technology for visualization of cotton leaf damage areas based on a high-throughput plant phenotyping platform and deep learning: hyperspectral images of single leaves stressed by cotton aphids in the range of 461-994 nm were selected, and cotton leaf damage areas were marked using a three-dimensional convolutional neural network. The results are shown in FIG. 8 and FIG. 9. In FIG.
8, part (a) shows an image of a cotton leaf stressed by cotton aphids and part (b) shows an image in which the damage areas are marked. In FIG. 9, part (a) shows an image of another cotton leaf stressed by cotton aphids, and part (b) shows an image in which the damage areas are marked. It was found that the cotton leaf damage areas were concentrated around the leaf veins. Benefit evaluation: Through the pilot test, this embodiment has achieved efficient and accurate monitoring and early warning for cotton aphid stress, and can provide corresponding control measures by establishing a cotton aphid stress monitoring and early warning model, to provide technical guidance for farmers to spray at the right time with the right amount. Moreover, this embodiment creatively uses the most advanced information technology to mark cotton leaf damage areas, thus achieving novelty, creativity and practicality incomparable with other existing technologies, thereby bringing obvious economic, ecological and social benefits. FIG. 10 is a structural diagram of an embodiment of a cotton aphid monitoring system based on a high-throughput plant phenotyping platform and deep learning according to the present disclosure. Referring to FIG. 
10, the cotton aphid monitoring system based on a high-throughput plant phenotyping platform and deep learning includes: a first obtaining module 1001, configured to obtain first-category cotton plant images and second-category cotton plant images under natural growth conditions, where the first-category cotton plant images include images of cotton plants that are increasingly stressed by cotton aphids, and the second-category cotton plant images include images of healthy cotton plants; a calibration module 1002, configured to calibrate the first-category cotton plant images and the second-category cotton plant images, to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images; a region-of-interest extraction module 1003, configured to extract region-of-interest images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain first-category region-of-interest images and second-category region-of-interest images; an average spectra calculation module 1004, configured to calculate an average spectrum of each band for the first-category region-of-interest images and the second-category region-of-interest images, to obtain first-category average spectra and second-category average spectra; a first-order derivative spectra calculation module 1005, configured to calculate a first-order derivative spectrum of each band for the first-category average spectra and the second-category average spectra, to obtain first-category first-order derivative spectra and second-category first-order derivative spectra; a first training and optimization module 1006, configured to train and optimize a first convolutional neural network by using the first-category first-order derivative spectra and the second-category first-order derivative spectra, to obtain a trained and optimized first convolutional neural network; a visualization analysis module 1007, configured to perform
visualization analysis on the trained and optimized first convolutional neural network by using a visualization analysis technology, and select sensitive bands; a second training and optimization module 1008, configured to train and optimize the trained and optimized first convolutional neural network by using first-order derivative spectra of the sensitive bands, to obtain a trained and optimized second convolutional neural network, where the second convolutional neural network and the first convolutional neural network are both one-dimensional convolutional neural networks; a second obtaining module 1009, configured to obtain first-order derivative spectra of to-be-monitored cotton plant images; and a category prediction module 1010, configured to input the first-order derivative spectra of the to-be-monitored cotton plant images into the trained and optimized second convolutional neural network, and classify the to-be-monitored cotton plant images, to obtain categories to which the to-be-monitored cotton plant images belong, where the categories include a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids. The present disclosure provides a cotton aphid monitoring technology and a band range selection method based on a high-throughput plant phenotyping platform and deep learning, to obtain hyperspectral images of cotton leaves by using a hyperspectral imaging system. By using a single cotton leaf as a region of interest, a hyperspectral image of the single leaf is extracted, and an average spectrum and a first-order derivative spectrum are finally calculated. Spectral information and the deep learning technology are fully utilized, the importance of each band is discovered by using a visualization technology, and important bands are selected for monitoring and early warning. The cotton aphid stress monitoring and early warning achieve good performance.
The present disclosure also provides a technology for visualizing cotton leaf damage areas based on a high-throughput plant phenotyping platform and deep learning, where a three-dimensional convolutional neural network is used to learn hyperspectral images of single leaves, and hyperspectral images within a range of visible and near-infrared bands are selected. A significance map is generated by using the visualization technology, so that cotton leaf damage areas stressed by cotton aphids can be found. According to the cotton aphid stress monitoring method and system based on a high-throughput plant phenotyping platform and deep learning disclosed in the present disclosure, cotton aphid monitoring is performed based on the high-throughput plant phenotyping platform and deep learning, and cotton leaf damage areas stressed by cotton aphids are visualized. The deep learning technology is used to directly learn the hyperspectral images of single cotton leaves as sample objects; spectral information and image information of the hyperspectral images are combined to determine a pest stress status without the error of "generalization". At the same time, by using the spectral fusion characteristic of the hyperspectral images, spectra are used as the main factor for cotton aphid stress monitoring, so that the image resolution can be reduced properly. In addition, by using the image characteristics of the hyperspectral images, cotton leaf damage areas caused by cotton aphid stress can be visualized. Based on the existing technology of cotton aphid monitoring based on spectral imaging, the present disclosure learns the hyperspectral images of single cotton leaves as sample objects and uses the image characteristics of the hyperspectral images to mark the cotton leaf damage areas caused by cotton aphid stress at fine granularity.
The accuracy of distinguishing whether cotton is stressed by cotton aphids is up to 96%, and it is found that the cotton leaf damage areas caused by cotton aphid stress are mainly concentrated around the leaf veins of the cotton leaves. Through the band selection method based on deep learning, the classification effect of deep learning is close to or even higher than that of a traditional classifier (a linear classifier or a nonlinear classifier), and it is unnecessary to select a classifier. Meanwhile, using the gradient back-propagation process of deep learning, the weight score of each band can be found, so that bands can be selected directly, without the trial and error required by conventional band selection methods. Each embodiment of the present specification is described in a progressive manner, each embodiment focuses on the difference from other embodiments, and for the same and similar parts between the embodiments, reference may be made to each other. For the system disclosed in the embodiments, since the system corresponds to the method disclosed in the embodiments, the description is relatively simple, and reference can be made to the method description. In this specification, several specific embodiments are used for illustration of the principles and implementations of the present disclosure. The description of the foregoing embodiments is used to help illustrate the method of the present disclosure and the core ideas thereof. In addition, those of ordinary skill in the art can make various modifications in terms of specific implementations and scope of application in accordance with the ideas of the present disclosure. In conclusion, the content of the present specification shall not be construed as a limitation to the present disclosure.
In the claims which follow and in the preceding description of the invention, except where the context requires otherwise due to express language or necessary implication, the word "comprise" or variations such as "comprises" or "comprising" is used in an inclusive sense, i.e. to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention. It is to be understood that, if any prior art publication is referred to herein, such reference does not constitute an admission that the publication forms a part of the common general knowledge in the art, in Australia or any other country.

Claims (5)

What is claimed is: 1. A cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning, comprising: obtaining first-category cotton plant images and second-category cotton plant images under natural growth conditions, wherein the first-category cotton plant images comprise images of cotton plants that are increasingly stressed by cotton aphids, and the second-category cotton plant images comprise images of healthy cotton plants; calibrating the first-category cotton plant images and the second-category cotton plant images, to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images; extracting region-of-interest images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain first-category region-of-interest images and second-category region-of-interest images; calculating an average spectrum of each band for the first-category region-of-interest images and the second-category region-of-interest images, to obtain first-category average spectra and second-category average spectra; calculating a first-order derivative spectrum of each band for the first-category average spectra and the second-category average spectra, to obtain first-category first-order derivative spectra and second-category first-order derivative spectra; training and optimizing a first convolutional neural network by using the first-category first-order derivative spectra and the second-category first-order derivative spectra, to obtain a trained and optimized first convolutional neural network; performing visualization analysis on the trained and optimized first convolutional neural network by using a visualization analysis technology, and selecting sensitive bands; training and optimizing the trained and optimized first convolutional neural network by using first-order derivative spectra of the sensitive bands, to obtain a trained and
optimized second convolutional neural network, wherein the second convolutional neural network and the first convolutional neural network are both one-dimensional convolutional neural networks; obtaining first-order derivative spectra of to-be-monitored cotton plant images; and inputting the first-order derivative spectra of the to-be-monitored cotton plant images into the trained and optimized second convolutional neural network, and classifying the to-be-monitored cotton plant images, to obtain categories to which the to-be-monitored cotton plant images belong, wherein the categories comprise a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids.
2. The cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning according to claim 1, wherein the calibrating the first-category cotton plant images and the second-category cotton plant images, to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images specifically comprises: calibrating the first-category cotton plant images and the second-category cotton plant images by using a formula Ic = (Ir - Id) / (Iw - Id), to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images, wherein Ic represents calibrated cotton aphid images of each category, Ir represents original cotton aphid images of each category, Id represents dark reference cotton aphid images of each category, and Iw represents white reference cotton aphid images of each category.
3. The cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning according to claim 1, wherein the extracting region-of-interest images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain first-category region-of-interest images and second-category region-of-interest images specifically comprises: capturing images corresponding to a wavelength range of 461-994 nm from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain effective images; selecting images corresponding to wavelengths of 461 nm, 548 nm, and 698 nm from the effective images to serve as to-be-synthesized images; synthesizing the to-be-synthesized images into RGB images, to obtain first-category RGB images and second-category RGB images, wherein the first-category RGB images are RGB images of the first-category cotton plant images, and the second-category RGB images are RGB images of the second-category cotton plant images; and calculating mask regions in the first-category RGB images and the second-category RGB images by using a 2G-R-B segmentation algorithm, sequentially extracting single-leaf regions from the effective images according to the mask regions, and using the extracted single-leaf regions as region-of-interest images, to obtain the first-category region-of-interest images and the second-category region-of-interest images; wherein after the capturing images corresponding to a wavelength range of 461-994 nm from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain effective images, the method further comprises: training and optimizing a three-dimensional convolutional neural network by using the effective images, to obtain a trained and optimized three-dimensional convolutional neural network; and generating a significance map based on the trained and
optimized three-dimensional convolutional neural network by using the visualization analysis technology, and marking cotton leaf damage areas; wherein the calculating an average spectrum of each band for the first-category region-of-interest images and the second-category region-of-interest images, to obtain first-category average spectra and second-category average spectra specifically comprises: calculating an average spectrum of the k-th band of each category according to a formula Bk = mean(Ai,j,k for all (i, j) ∈ O), wherein Bk represents the average spectrum of the k-th band of each category; O represents the single-leaf regions in the region-of-interest images of each category; i and j represent coordinate values of the region-of-interest images of each category, i represents horizontal coordinate values of the region-of-interest images of each category, and j represents vertical coordinate values of the region-of-interest images of each category; k represents the k-th band; and Ai,j,k represents values of coordinates (i, j) of the region-of-interest images of each category in the k-th band.
  4. The cotton aphid monitoring method based on a high-throughput plant phenotyping platform and deep learning according to claim 1, wherein the calculating a first-order derivative spectrum of each band for the first-category average spectra and the second-category average spectra, to obtain first-category first-order derivative spectra and second-category first-order derivative spectra specifically comprises: calculating first-order derivative spectra of each category according to a formula B'_k = (B_{k+1} − B_{k−1}) / (x_{k+1} − x_{k−1}), wherein x_{k−1} represents a wavelength corresponding to the (k−1)-th band; x_{k+1} represents a wavelength corresponding to the (k+1)-th band; B'_k represents a first-order derivative spectrum of the k-th band of each category; B_{k+1} represents an average spectrum of the (k+1)-th band of each category; and B_{k−1} represents an average spectrum of the (k−1)-th band of each category; wherein the training and optimizing a first convolutional neural network by using the first-category first-order derivative spectra and the second-category first-order derivative spectra, to obtain a trained and optimized first convolutional neural network specifically comprises: separately dividing the first-category first-order derivative spectra and the second-category first-order derivative spectra according to a ratio of 3:1:1, to obtain a training set, a validation set, and a test set, wherein the training set, the validation set, and the test set all comprise the first-category first-order derivative spectra and the second-category first-order derivative spectra; and training and optimizing the first convolutional neural network by using the training set and the validation set, to obtain the trained and optimized first convolutional neural network; wherein the performing visualization analysis on the trained and optimized first convolutional neural network by using a visualization analysis technology, and selecting sensitive bands specifically comprises: inputting the test
set into the trained and optimized first convolutional neural network for classification, to obtain prediction categories, wherein the prediction categories comprise a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids; determining whether the prediction category is consistent with an actual category corresponding to the first-order derivative spectrum; if yes, recording the first-order derivative spectrum, and obtaining a probability value of the category to which the first-order derivative spectrum belongs; otherwise, giving up recording the first-order derivative spectrum; performing gradient inversion calculation on each recorded probability value according to a formula w_n = abs(∂S_c / ∂P_n), to obtain a gradient value of each band in each recorded first-order derivative spectrum, wherein w_n represents the gradient values of the bands in the recorded n-th first-order derivative spectrum, and the dimension of w_n is the same as the number of bands in each recorded first-order derivative spectrum; S_c represents the probability value; P_0 represents the first-order derivative spectrum; and P_n represents each band in P_0; separately calculating an L1 distance for the gradient value of each band in each recorded first-order derivative spectrum, and summing all the L1 distances and then calculating a mean value, to obtain the importance of each band, wherein a specific formula is w* = mean(Σ_{n∈(1,2,…,S)} ‖w_n‖_1), wherein S represents the number of the recorded first-order derivative spectra; ‖·‖_1 represents the L1 distance; and w* represents the importance of each band; and drawing a line graph according to the importance of each band, and selecting a peak in the line graph as the sensitive band; wherein the obtaining first-order derivative spectra of to-be-monitored cotton plant images specifically comprises: obtaining the to-be-monitored cotton plant images; calibrating the to-be-monitored cotton plant images, to obtain calibrated to-be-monitored cotton plant images; extracting region-of-interest images from the calibrated to-be-monitored cotton plant images, to obtain region-of-interest images of the to-be-monitored cotton plant images; calculating an average spectrum of each band for the region-of-interest images of the to-be-monitored cotton plant images, to obtain average spectra of the to-be-monitored cotton plant images; and calculating a first-order derivative spectrum of each band for the average spectra, to obtain the first-order derivative spectra of the to-be-monitored cotton plant images.
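The central-difference derivative and the band-importance aggregation recited in this claim can be sketched numerically. This is an illustrative NumPy reading of the formulas, under the assumption that the importance of band k is the mean over the S recorded spectra of the absolute gradient at that band; function names are invented for the example.

```python
import numpy as np

def first_derivative(avg_spectrum, wavelengths):
    """B'_k = (B[k+1] - B[k-1]) / (x[k+1] - x[k-1]), computed for interior bands."""
    b = np.asarray(avg_spectrum, dtype=float)
    x = np.asarray(wavelengths, dtype=float)
    return (b[2:] - b[:-2]) / (x[2:] - x[:-2])

def band_importance(gradients):
    """Per-band importance w*: mean over recorded spectra of the absolute gradients.

    gradients: S x K array, row n holding the gradient values w_n of spectrum n.
    """
    return np.abs(np.asarray(gradients, dtype=float)).mean(axis=0)

# A linear spectrum over evenly spaced wavelengths has a constant derivative.
wl = np.array([400.0, 410.0, 420.0, 430.0])
spec = np.array([0.1, 0.2, 0.3, 0.4])
print(first_derivative(spec, wl))          # -> [0.01 0.01]
print(band_importance([[1.0, -2.0],
                       [3.0, 4.0]]))       # -> [2. 3.]
```

Peaks of the resulting importance vector would then be read off the line graph to pick the sensitive bands.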
  5. A cotton aphid monitoring system based on a high-throughput plant phenotyping platform and deep learning, comprising: a first obtaining module, configured to obtain first-category cotton plant images and second-category cotton plant images under natural growth conditions, wherein the first-category cotton plant images comprise images of cotton plants that are increasingly stressed by cotton aphids, and the second-category cotton plant images comprise images of healthy cotton plants; a calibration module, configured to calibrate the first-category cotton plant images and the second-category cotton plant images, to obtain calibrated first-category cotton plant images and calibrated second-category cotton plant images; a region-of-interest extraction module, configured to extract region-of-interest images from the calibrated first-category cotton plant images and the calibrated second-category cotton plant images, to obtain first-category region-of-interest images and second-category region-of-interest images; an average spectra calculation module, configured to calculate an average spectrum of each band for the first-category region-of-interest images and the second-category region-of-interest images, to obtain first-category average spectra and second-category average spectra; a first-order derivative spectra calculation module, configured to calculate a first-order derivative spectrum of each band for the first-category average spectra and the second-category average spectra, to obtain first-category first-order derivative spectra and second-category first-order derivative spectra; a first training and optimization module, configured to train and optimize a first convolutional neural network by using the first-category first-order derivative spectra and the second-category first-order derivative spectra, to obtain a trained and optimized first convolutional neural network; a visualization analysis module, configured to perform visualization analysis on the trained and optimized first convolutional neural network by using a visualization analysis technology, and select sensitive bands; a second training and optimization module, configured to train and optimize the trained and optimized first convolutional neural network by using first-order derivative spectra of the sensitive bands, to obtain a trained and optimized second convolutional neural network, wherein the second convolutional neural network and the first convolutional neural network are both one-dimensional convolutional neural networks; a second obtaining module, configured to obtain first-order derivative spectra of to-be-monitored cotton plant images; and a category prediction module, configured to input the first-order derivative spectra of the to-be-monitored cotton plant images into the trained and optimized second convolutional neural network, and classify the to-be-monitored cotton plant images, to obtain categories to which the to-be-monitored cotton plant images belong, wherein the categories comprise a category of not being stressed by cotton aphids and a category of being stressed by cotton aphids.
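The one-dimensional convolutional classifier named in this claim can be illustrated with a hand-rolled forward pass. This is a didactic NumPy sketch of how a 1-D CNN turns a derivative spectrum into a stress probability; the layer shapes, weights, and sigmoid head are assumptions for illustration and do not describe the patented network.

```python
import numpy as np

def conv1d(x, kernel):
    """Valid-mode 1-D cross-correlation of a spectrum with a small kernel."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

def tiny_1d_cnn(spectrum, kernel, weight, bias):
    """Conv -> ReLU -> global average pool -> linear logit -> sigmoid probability."""
    feat = np.maximum(conv1d(spectrum, kernel), 0.0)   # ReLU feature map
    pooled = feat.mean()                               # global average pooling
    logit = pooled * weight + bias                     # single-logit classifier head
    return 1.0 / (1.0 + np.exp(-logit))                # P(stressed by cotton aphids)

# Example: a short derivative spectrum, an edge-detector-like kernel, fixed head.
spec = np.array([0.0, 0.2, 0.5, 0.3, 0.1])
prob = tiny_1d_cnn(spec, kernel=np.array([1.0, 0.0, -1.0]), weight=2.0, bias=-0.1)
print(prob)   # a value in (0, 1); threshold at 0.5 to assign the category
```

A real pipeline would learn the kernel and head weights from the 3:1:1 train/validation/test split, then retrain on the sensitive bands only.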
    [Drawing sheets 1/7 to 7/7, filed 21 May 2021 (AU 2021102735): FIG. 1 through FIG. 10; FIG. 7 is a line graph plotted against wavelength (nm).]
AU2021102735A 2021-05-21 2021-05-21 Cotton aphid stress monitoring method and system based on high-throughput plant phenotyping platform and deep learning Ceased AU2021102735A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2021102735A AU2021102735A4 (en) 2021-05-21 2021-05-21 Cotton aphid stress monitoring method and system based on high-throughput plant phenotyping platform and deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2021102735A AU2021102735A4 (en) 2021-05-21 2021-05-21 Cotton aphid stress monitoring method and system based on high-throughput plant phenotyping platform and deep learning

Publications (1)

Publication Number Publication Date
AU2021102735A4 true AU2021102735A4 (en) 2021-07-15

Family

ID=76785324

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021102735A Ceased AU2021102735A4 (en) 2021-05-21 2021-05-21 Cotton aphid stress monitoring method and system based on high-throughput plant phenotyping platform and deep learning

Country Status (1)

Country Link
AU (1) AU2021102735A4 (en)


Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry