CN117541423A - Aphis gossypii harm monitoring method and system based on fusion map features - Google Patents

Info

Publication number
CN117541423A
Authority
CN
China
Prior art keywords
cotton
features
feature
acquiring
cotton aphid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311815522.2A
Other languages
Chinese (zh)
Inventor
安静杰
郭江龙
李耀发
高占林
党志红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Plant Protection Institute, Hebei Academy of Agricultural and Forestry Sciences
Original Assignee
Plant Protection Institute, Hebei Academy of Agricultural and Forestry Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Plant Protection Institute, Hebei Academy of Agricultural and Forestry Sciences filed Critical Plant Protection Institute, Hebei Academy of Agricultural and Forestry Sciences
Priority to CN202311815522.2A priority Critical patent/CN117541423A/en
Publication of CN117541423A publication Critical patent/CN117541423A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Forestry; Mining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367Ontology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V10/765Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Mining & Mineral Resources (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Catching Or Destruction (AREA)

Abstract

The invention discloses a cotton aphid hazard monitoring method and system based on fusion map features. The method comprises: acquiring hyperspectral data and image data of a target cotton field area, constructing a feature extraction network, and obtaining the hyperspectral features and image features corresponding to cotton plants; performing image saliency analysis with the image features to identify and segment the cotton aphid pest areas and obtain a cotton aphid pest area characterization, performing feature selection on the hyperspectral features and image features based on this characterization, and fusing the selected features to obtain the fusion map features; constructing a cotton aphid hazard monitoring model, outputting the cotton aphid hazard severity of the target cotton field area, and acquiring the corresponding space-time distribution features; and extracting the influencing factors of the target cotton field area and determining the prevention and control direction of the target cotton field according to the influencing factors. The method moves beyond traditional manual field surveys, improves the timeliness and accuracy of cotton aphid monitoring and early warning, reveals the mechanisms that shape the regional distribution of cotton aphids, and provides a reference for cotton aphid prevention and control.

Description

Aphis gossypii harm monitoring method and system based on fusion map features
Technical Field
The invention relates to the technical field of pest monitoring, in particular to a cotton aphid hazard monitoring method and system based on fusion map features.
Background
The cotton aphid uses its piercing-sucking mouthparts to suck sap from cotton leaves and tender tissues, depriving the cotton plant of nutrition and causing seedling wilting, leaf curling and deformation, and leaf yellowing or withering; it also impairs leaf respiration and photosynthesis, reducing cotton yield and quality. Prevention and control of cotton aphids are therefore of great significance to cotton production. In research on monitoring and controlling cotton aphid infestations, cotton aphid population counts are usually carried out manually, and some studies additionally require judging whether the aphids are dead or alive. Cotton aphids are extremely small, and test areas with high aphid densities are difficult to count directly; the workload is large and the process is time-consuming and labor-intensive, so an intelligent monitoring method is urgently needed to replace manual counting.
At present, cotton aphid monitoring in cotton fields still relies mainly on the traditional approach: grassroots survey and forecasting personnel inspect the field visually and by hand to observe whether cotton aphids are present and how severe the damage is, predict the likelihood of outbreaks from experience, and organize plant protection and control work accordingly. This approach is time-consuming and labor-intensive, the lag and error in the collected information reduce the accuracy of aphid forecasting, and the resulting overuse of pesticides causes environmental pollution. By accurately and promptly collecting cotton aphid population information during the cotton growth stage and developing methods for rapid monitoring and identification, warnings and forecasts can be issued before outbreaks occur, giving cotton growers guidance on prevention and control and helping to reduce the yield losses caused by cotton aphids.
Disclosure of Invention
In order to solve the technical problems, the invention provides a cotton aphid hazard monitoring method and system based on fusion map features.
The first aspect of the invention provides a cotton aphid hazard monitoring method based on fusion map features, which comprises the following steps:
acquiring hyperspectral data and image data of a target cotton field area, preprocessing the hyperspectral data and the image data, constructing a feature extraction network, and acquiring hyperspectral features and image features corresponding to cotton plants;
carrying out image significance analysis and identification and segmentation on the cotton aphid pest areas by utilizing the image features, obtaining cotton aphid pest area characterization, carrying out feature selection on hyperspectral features and image features based on the cotton aphid pest area characterization, and obtaining fused map features after fusion;
constructing a cotton aphid hazard monitoring model by using a deep learning method according to the fusion map features, outputting the cotton aphid hazard severity of a target cotton field area, and acquiring corresponding space-time distribution features;
and extracting the influence factors of the target cotton field area from the space-time distribution features, and determining the prevention and control direction of the target cotton field according to the influence factors.
In the scheme, a feature extraction network is constructed to obtain hyperspectral features and image features corresponding to cotton plants, and the method specifically comprises the following steps:
acquiring hyperspectral image data of the cotton canopy in the target cotton field at the current timestamp, performing stitching, geometric correction and radiometric calibration on the hyperspectral image data, extracting the spectral-domain data from the preprocessed hyperspectral image data, and taking the hyperspectral reflectance as the hyperspectral data;
extracting the spatial-domain data in the preprocessed hyperspectral image data as the image data, preprocessing the hyperspectral data and the image data, retrieving the response features of spectra and images in pest identification examples through big data, and constructing a response feature set;
acquiring the correlations between different feature indexes and pest severity from the response feature set, screening the response feature subsets that meet a preset standard using the correlations, and training a one-dimensional convolutional neural network in each of two branches on the screened feature subsets to construct the feature extraction network;
and extracting characteristic data of different branches by utilizing the characteristic extraction network to obtain hyperspectral characteristics and image characteristics of cotton plants in the target cotton fields.
In the scheme, the image features are used to perform image saliency analysis and to identify and segment the cotton aphid pest areas, which specifically comprises the following steps:
obtaining the image features corresponding to the image data, extracting a texture feature subset and a color feature subset from the image features, calculating the information entropy of different bands in the image data from the texture features, and filtering out the bands that do not meet a preset threshold;
performing super-pixel segmentation on the image data to obtain the corresponding super-pixel blocks and clustering them: randomly selecting initial clustering centers, assigning each super-pixel block to its nearest clustering center, calculating new clustering centers, and iteratively updating the cluster to which each super-pixel block belongs;
acquiring the final clustering result, obtaining the initial partition of the cotton plant area and the background area from the partition results of the different clusters, generating the corresponding super-pixel regions, and acquiring the color features of the cotton plant super-pixel region;
acquiring the color-space distance metric of the super-pixel blocks in the cotton plant super-pixel region from the color features, and performing image saliency analysis by combining the color-space distance metric with the Euclidean spatial distance between different super-pixel blocks to obtain the potential cotton aphid pest target regions;
optimizing an asymptotic feature pyramid network with dilated convolution, acquiring and fusing feature maps of different scales of the potential target regions, and identifying and segmenting the cotton aphid pest areas in the potential target regions using the fused feature maps.
In this scheme, feature selection is performed on the hyperspectral features and image features based on the cotton aphid pest area characterization, specifically:
acquiring the multi-scale feature maps of the cotton aphid pest area as the cotton aphid pest area characterization, and, within the hyperspectral features and image features, calculating the conditional mutual information and the joint mutual information of a selected feature, a candidate feature and the cotton aphid pest area characterization to obtain the independence between features;
calculating the mutual information between the selected feature and the cotton aphid pest area characterization to obtain the relevance of the feature to the pest damage characterization, selecting a preset number of features according to the relevance and the independence, and copying the selected features and substituting the copies into the original feature set;
constructing a cotton aphid damage classifier, acquiring the accuracy-drop value of the classifier during each substitution, screening out a maximum accuracy-drop value, and selecting the substituted features whose accuracy drop exceeds this maximum value as key features;
and, after iterative selection, outputting the key feature subsets of the hyperspectral features and the image features, and concatenating and fusing the key hyperspectral features and the key image features in the key feature subsets to obtain the fusion map features.
In the scheme, a cotton aphid hazard monitoring model is constructed by utilizing a deep learning method according to the fusion map features, and the method specifically comprises the following steps:
Acquiring a characteristic index based on the fusion map characteristic, acquiring sample data from historical hyperspectral image data through the characteristic index, and constructing a training set and a verification set according to the sample data;
constructing the cotton aphid hazard monitoring model based on an auto-encoder network, training the model on the training set with the fusion map features as the model input, and sampling from the fusion map features with the cotton aphid hazard monitoring model to obtain the feature distribution;
performing feature reconstruction with the auto-encoder according to the feature distribution to obtain the reconstruction data, obtaining the feature distribution corresponding to the reconstruction data with the decoder, and feeding it into a subsequent fully connected layer to obtain the residual between the feature distribution of the reconstruction data and the feature distribution of normal cotton plants;
and comparing the residual error with a preset threshold interval, and acquiring the hazard class of the cotton aphids according to the threshold interval in which the residual error falls.
In this scheme, the influence factors of the target cotton field area are extracted from the space-time distribution features, and the prevention and control direction of the target cotton field is determined according to the influence factors, specifically:
dividing grids of cotton plant areas in the image data, acquiring cotton aphid hazard grades in different grids by using the cotton aphid hazard monitoring model, and setting label information of the grids according to the cotton aphid hazard grades;
acquiring the hazard change sequence of cotton aphid pests from the historical monitoring data of the target cotton field area, deriving hazard time-series features from the hazard change sequence, deriving hazard spatial features from the distribution of the label information, and combining the hazard time-series features with the hazard spatial features to obtain the space-time distribution features;
acquiring a cotton aphid thermodynamic diagram of a target cotton field area according to the space-time distribution characteristics, acquiring environmental characteristics corresponding to a high-hazard area, carrying out principal component analysis on the environmental characteristics in combination with meteorological characteristics, and carrying out principal component projection according to the principal component characteristics;
acquiring deviation features of a feature scatter diagram of a high-hazard region and a feature scatter diagram of a low-hazard region, extracting influence factors of a target cotton field region according to the deviation features, and acquiring a potential high-hazard region by combining the influence factors with the cotton aphid thermodynamic diagram;
and searching and obtaining similar prevention and control examples according to the influence factors and insect pest characterization, and obtaining prevention and control directions by utilizing the similar prevention and control examples to carry out statistical analysis.
The second aspect of the invention also provides a cotton aphid hazard monitoring system based on fusion map features, which comprises: the cotton aphid hazard monitoring system comprises a memory and a processor, wherein the memory comprises a cotton aphid hazard monitoring method program based on fusion map features, and the cotton aphid hazard monitoring method program based on the fusion map features realizes the following steps when being executed by the processor:
Acquiring hyperspectral data and image data of a target cotton field area, preprocessing the hyperspectral data and the image data, constructing a feature extraction network, and acquiring hyperspectral features and image features corresponding to cotton plants;
carrying out image significance analysis and identification and segmentation on the cotton aphid pest areas by utilizing the image features, obtaining cotton aphid pest area characterization, carrying out feature selection on hyperspectral features and image features based on the cotton aphid pest area characterization, and obtaining fused map features after fusion;
constructing a cotton aphid hazard monitoring model by using a deep learning method according to the fusion map features, outputting the cotton aphid hazard severity of a target cotton field area, and acquiring corresponding space-time distribution features;
and extracting the influence factors of the target cotton field area from the space-time distribution features, and determining the prevention and control direction of the target cotton field according to the influence factors.
The invention discloses a cotton aphid hazard monitoring method and system based on fusion map features. The method comprises: acquiring hyperspectral data and image data of a target cotton field area, constructing a feature extraction network, and obtaining the hyperspectral features and image features corresponding to cotton plants; performing image saliency analysis with the image features to identify and segment the cotton aphid pest areas and obtain a cotton aphid pest area characterization, performing feature selection on the hyperspectral features and image features based on this characterization, and fusing the selected features to obtain the fusion map features; constructing a cotton aphid hazard monitoring model, outputting the cotton aphid hazard severity of the target cotton field area, and acquiring the corresponding space-time distribution features; and extracting the influencing factors of the target cotton field area and determining the prevention and control direction of the target cotton field according to the influencing factors. The method moves beyond traditional manual field surveys, improves the timeliness and accuracy of cotton aphid monitoring and early warning, reveals the mechanisms that shape the regional distribution of cotton aphids, and provides a reference for cotton aphid prevention and control.
Drawings
FIG. 1 shows a flow chart of a method for monitoring Aphis gossypii harm based on fusion map features of the present invention;
FIG. 2 shows a flow chart of the present invention for segmenting cotton aphid pest areas using image features;
FIG. 3 shows a flow chart of the invention for constructing a cotton aphid hazard monitoring model;
fig. 4 shows a block diagram of a cotton aphid hazard monitoring system based on fusion map features of the present invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will be more clearly understood, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description. It should be noted that, in the case of no conflict, the embodiments of the present application and the features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those described herein, and therefore the scope of the present invention is not limited to the specific embodiments disclosed below.
Fig. 1 shows a flow chart of a cotton aphid hazard monitoring method based on fusion map features.
As shown in FIG. 1, the first aspect of the invention provides a method for monitoring Aphis gossypii harm based on fusion map features, comprising:
S102, acquiring hyperspectral data and image data of a target cotton field area, preprocessing the hyperspectral data and the image data, constructing a feature extraction network, and acquiring hyperspectral features and image features corresponding to cotton plants;
S104, performing image saliency analysis with the image features to identify and segment the cotton aphid pest areas, obtaining the cotton aphid pest area characterization, performing feature selection on the hyperspectral features and the image features based on the cotton aphid pest area characterization, and obtaining the fusion map features after fusion;
S106, constructing a cotton aphid hazard monitoring model by using a deep learning method according to the fusion map features, outputting the cotton aphid hazard severity of the target cotton field area, and acquiring the corresponding space-time distribution features;
S108, extracting the influence factors of the target cotton field area from the space-time distribution features, and determining the prevention and control direction of the target cotton field according to the influence factors.
It should be noted that imaging spectroscopy monitors the growth conditions, nutrient status and pest and disease stress of field crops mainly from image information of the crop canopy. Hyperspectral image data of the cotton canopy in the target cotton field at the current timestamp are acquired by unmanned aerial vehicle (UAV) remote sensing along waypoints planned on the flight path; the hyperspectral image data are stitched, geometrically corrected and radiometrically calibrated, and the stitched images are cropped to remove the distorted boundary portions of the UAV imagery. The spectral-domain data are extracted from the preprocessed hyperspectral image data to obtain the hyperspectral reflectance as the hyperspectral data, and the spatial-domain data in the preprocessed hyperspectral image data are extracted as the image data. The hyperspectral data and the image data are preprocessed, the image data are filtered, the sensitive bands of the spectral data are obtained, the response features of spectra and images in pest identification examples are retrieved through big data, and a response feature set is constructed. The correlations between different feature indexes and pest severity are obtained from the response feature set, the response feature subsets that meet a preset standard are screened using the correlations, and a one-dimensional convolutional neural network is trained in each of two branches on the screened feature subsets to construct the feature extraction network. Feature data of the different branches are then extracted with the feature extraction network to obtain the hyperspectral features and image features of the cotton plants in the target cotton field.
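To make the two-branch structure concrete, the following is a minimal PyTorch sketch, not the patented implementation: each branch is a small one-dimensional convolutional network applied to one screened response-feature subset (spectral bands in one branch, image-derived indices in the other), and the branch outputs serve as the hyperspectral features and image features. The class names, layer sizes, embedding dimension and input lengths (120 bands, 48 image indices) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Branch1DCNN(nn.Module):
    """One 1-D CNN branch mapping a response-feature vector to an embedding."""
    def __init__(self, emb_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # -> (batch, 32, 1)
        )
        self.fc = nn.Linear(32, emb_dim)

    def forward(self, x):                     # x: (batch, n_features)
        h = self.net(x.unsqueeze(1)).squeeze(-1)
        return self.fc(h)

class TwoBranchExtractor(nn.Module):
    """Separate branches for the spectral-domain and spatial-domain (image) feature subsets."""
    def __init__(self, emb_dim: int = 64):
        super().__init__()
        self.spectral_branch = Branch1DCNN(emb_dim)
        self.image_branch = Branch1DCNN(emb_dim)

    def forward(self, spectra, image_feats):
        return self.spectral_branch(spectra), self.image_branch(image_feats)

# Toy usage with assumed sizes: 120 screened reflectance bands, 48 image indices.
extractor = TwoBranchExtractor()
spec = torch.randn(8, 120)
img = torch.randn(8, 48)
hyper_feat, image_feat = extractor(spec, img)   # each (8, 64)
```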
FIG. 2 shows a flow chart of the present invention for segmenting cotton aphid pest areas using image features.
According to the embodiment of the invention, the image features are used to perform image saliency analysis and to identify and segment the cotton aphid pest areas, which specifically comprises the following steps:
S202, obtaining the image features corresponding to the image data, extracting a texture feature subset and a color feature subset from the image features, calculating the information entropy of different bands in the image data from the texture features, and filtering out the bands that do not meet a preset threshold;
S204, performing super-pixel segmentation on the image data to obtain the corresponding super-pixel blocks and clustering them: randomly selecting initial clustering centers, assigning each super-pixel block to its nearest clustering center, calculating new clustering centers, and iteratively updating the cluster to which each super-pixel block belongs;
S206, acquiring the final clustering result, obtaining the initial partition of the cotton plant area and the background area from the partition results of the different clusters, generating the corresponding super-pixel regions, and acquiring the color features of the cotton plant super-pixel region;
S208, acquiring the color-space distance metric of the super-pixel blocks in the cotton plant super-pixel region from the color features, and performing image saliency analysis by combining the color-space distance metric with the Euclidean spatial distance between different super-pixel blocks to obtain the potential cotton aphid pest target regions;
S210, optimizing an asymptotic feature pyramid network with dilated convolution, acquiring and fusing feature maps of different scales of the potential target regions, and identifying and segmenting the cotton aphid pest areas in the potential target regions using the fused feature maps.
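The band-screening, super-pixel and saliency steps above can be illustrated with a minimal Python sketch, offered under stated assumptions rather than as the patented code: scikit-image SLIC provides the super-pixel blocks, a histogram-based Shannon entropy screens bands, k-means separates plant from background blocks, and saliency combines a color-space distance with a spatial Euclidean distance. The "greenest block" heuristic for identifying the plant cluster, the segment count and the bin count are assumptions.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.cluster import KMeans

def band_entropy(band: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy of one band image; higher entropy ~ richer texture, less noise."""
    hist, _ = np.histogram(band, bins=bins, density=True)
    p = hist[hist > 0]
    p = p / p.sum()
    return float(-(p * np.log2(p)).sum())

def saliency_map(rgb: np.ndarray, n_segments: int = 400):
    """Return SLIC labels and a per-super-pixel saliency score for an RGB image."""
    labels = slic(rgb, n_segments=n_segments, compactness=10, start_label=0)
    n = labels.max() + 1
    # mean color and spatial centroid of every super-pixel block
    mean_col = np.array([rgb[labels == i].mean(axis=0) for i in range(n)])
    ys, xs = np.indices(labels.shape)
    cent = np.array([[ys[labels == i].mean(), xs[labels == i].mean()] for i in range(n)])
    # 2-class clustering of mean colors: cotton-plant vs. background region
    cls = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(mean_col)
    plant_cls = cls[mean_col[:, 1].argmax()]   # assume the greenest block belongs to the plant
    # saliency: color distance to the other plant blocks, damped by spatial distance
    sal = np.zeros(n)
    for i in np.where(cls == plant_cls)[0]:
        dc = np.linalg.norm(mean_col[cls == plant_cls] - mean_col[i], axis=1)
        ds = np.linalg.norm(cent[cls == plant_cls] - cent[i], axis=1)
        sal[i] = (dc / (1.0 + ds)).sum()
    return labels, sal   # high-saliency plant blocks = potential aphid pest regions
```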
The information entropy is calculated from the texture features of the image data; the larger the information entropy, the higher the quality of the band image and the less noise it contains. Higher-quality band images are selected by information entropy, super-pixel segmentation is performed with the SLIC algorithm to obtain super-pixel blocks with clear boundaries, and a clustering algorithm adaptively separates the cotton plant area from the background area. Dilated convolution is introduced into the asymptotic feature pyramid network so that the receptive field is enlarged while the feature-map size stays unchanged, which aids cotton aphid recognition; the dilated convolutions and the pyramid structure together widen the receptive field and fuse feature information of different scales, improving the detection of small, weak cotton aphid targets.
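For the dilated-convolution optimization mentioned here, a minimal illustrative sketch (not the patent's network definition) is given below: a 3x3 convolution with dilation enlarges the receptive field while the feature-map size stays unchanged, and a coarser pyramid level is upsampled and added to a finer one before the dilated block. The channel count, dilation rate and feature-map sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedBlock(nn.Module):
    def __init__(self, ch: int = 64, dilation: int = 2):
        super().__init__()
        # padding = dilation keeps the spatial size unchanged for a 3x3 kernel
        self.conv = nn.Conv2d(ch, ch, kernel_size=3, padding=dilation, dilation=dilation)
        self.bn = nn.BatchNorm2d(ch)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

def fuse_scales(p_fine: torch.Tensor, p_coarse: torch.Tensor) -> torch.Tensor:
    """Fuse a coarse pyramid map into the finer one (same channel count assumed)."""
    up = F.interpolate(p_coarse, size=p_fine.shape[-2:], mode="nearest")
    return p_fine + up

block = DilatedBlock()
p3 = torch.randn(1, 64, 80, 80)      # finer pyramid level (assumed sizes)
p4 = torch.randn(1, 64, 40, 40)      # coarser pyramid level
fused = block(fuse_scales(p3, p4))   # still 80x80, larger receptive field
```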
It should be noted that the multi-scale feature maps of the cotton aphid pest area are taken as the cotton aphid pest area characterization z. In the feature subset S corresponding to the hyperspectral features and image features, a selected feature x_i is obtained and the remaining features are taken as candidate features. For the selected feature x_i, a candidate feature x_j and the characterization z, the conditional mutual information I(x_i; x_j | z) and the joint mutual information I(x_i, x_j; z) are calculated to obtain the independence between features, and the mutual information I(x_i; z) between the selected feature and the cotton aphid pest area characterization is calculated to obtain the relevance of the feature to the pest damage characterization. A preset number of features are selected by combining the relevance and the independence, and the selected features are copied and substituted into the original feature set. An SVM classifier is trained as the cotton aphid damage classifier for cotton aphid damage identification; the accuracy-drop value of the classifier during each substitution is acquired, a maximum accuracy-drop value is screened out, and the substituted features whose accuracy drop exceeds this maximum value are selected as key features. After iterative selection, the key feature subsets of the hyperspectral features and the image features are output, and the key hyperspectral features and key image features in the key feature subsets are concatenated and fused to obtain the fusion map features.
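The mutual-information quantities and the accuracy-drop test can be sketched as follows; this is a hedged illustration under assumptions (8-bin quantile discretization, shuffle-based substitution), not the patent's exact procedure.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def _disc(v, bins=8):
    """Quantile-discretize a 1-D continuous variable into integer codes."""
    edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(v, edges)

def _mi_disc(da, db):
    """I(da; db) for integer-coded variables, from the joint contingency table."""
    _, ia = np.unique(da, return_inverse=True)
    _, ib = np.unique(db, return_inverse=True)
    joint = np.zeros((ia.max() + 1, ib.max() + 1))
    np.add.at(joint, (ia, ib), 1)
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def relevance(x_i, z):                 # I(x_i; z)
    return _mi_disc(_disc(x_i), _disc(z))

def joint_mi(x_i, x_j, z):             # I(x_i, x_j; z)
    return _mi_disc(_disc(x_i) * 10 + _disc(x_j), _disc(z))

def cond_mi(x_i, x_j, z):              # I(x_i; x_j | z) = I(x_i; x_j, z) - I(x_i; z)
    return _mi_disc(_disc(x_i), _disc(x_j) * 10 + _disc(z)) - relevance(x_i, z)

def accuracy_drop(clf, X_val, y_val, col, seed=0):
    """Accuracy loss when feature `col` is replaced by a shuffled copy of itself."""
    rng = np.random.default_rng(seed)
    base = accuracy_score(y_val, clf.predict(X_val))
    Xp = X_val.copy()
    Xp[:, col] = rng.permutation(Xp[:, col])
    return base - accuracy_score(y_val, clf.predict(Xp))

# clf = SVC(kernel="rbf").fit(X_train, y_train)   # cotton aphid damage classifier (assumed data)
```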
FIG. 3 shows a flow chart of the invention for constructing a cotton aphid hazard monitoring model.
According to the embodiment of the invention, a cotton aphid hazard monitoring model is constructed by utilizing a deep learning method according to the fusion map features, and the method specifically comprises the following steps:
S302, acquiring a characteristic index based on the fusion map characteristic, acquiring sample data from historical hyperspectral image data through the characteristic index, and constructing a training set and a verification set according to the sample data;
S304, constructing the cotton aphid hazard monitoring model based on an auto-encoder network, training the model on the training set with the fusion map features as the model input, and sampling from the fusion map features with the cotton aphid hazard monitoring model to obtain the feature distribution;
S306, performing feature reconstruction with the auto-encoder according to the feature distribution to obtain the reconstruction data, obtaining the feature distribution corresponding to the reconstruction data with the decoder, and feeding it into a subsequent fully connected layer to obtain the residual between the feature distribution of the reconstruction data and the feature distribution of normal cotton plants;
S308, comparing the residual with the preset threshold intervals, and obtaining the cotton aphid hazard grade according to the threshold interval into which the residual falls.
It should be noted that the mean and variance of the fusion map features are extracted to obtain the feature distribution corresponding to the fusion map features; the feature distribution learned from the fusion map features of normal cotton plants serves as the healthy-plant reference, and the reconstruction data are obtained from the feature distribution of the input data; the input data are then analyzed according to the relation between the residual and the thresholds to determine the cotton aphid hazard level. By encoding and decoding the fusion map features, the inherent correlations among the different map features are effectively modeled, and the cotton aphid hazard characteristics of the target cotton field area are effectively mined.
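As a hedged illustration of this residual-based grading, not the patent's model architecture, the sketch below uses a small fully connected auto-encoder over the fusion map feature vector; in practice such a model would be trained with a reconstruction loss on healthy-plant samples, and the residual against the healthy reference is mapped to a hazard grade by preset threshold intervals. The feature dimension, thresholds and four-grade scale are assumptions.

```python
import torch
import torch.nn as nn

class FusionAE(nn.Module):
    """Small fully connected auto-encoder over a fusion map feature vector."""
    def __init__(self, dim: int = 128, latent: int = 16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, latent))
        self.dec = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), z

def hazard_grade(residual: float, thresholds=(0.05, 0.15, 0.30)) -> int:
    """Map a reconstruction residual to a grade: 0 = healthy ... 3 = severe (assumed scale)."""
    return int(sum(residual > t for t in thresholds))

model = FusionAE()
fused = torch.randn(1, 128)          # fusion map feature of one grid cell (assumed dimension)
healthy_ref = torch.zeros(1, 128)    # stand-in for the healthy-plant reference distribution
recon, _ = model(fused)
residual = torch.mean((recon - healthy_ref) ** 2).item()
print(hazard_grade(residual))
```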
The cotton plant areas in the image data are divided into grids, the cotton aphid hazard grades in the different grids are obtained with the cotton aphid hazard monitoring model, and the label information of each grid is set according to its cotton aphid hazard grade. The hazard change sequence of cotton aphid pests is obtained from the historical monitoring data of the target cotton field area, hazard time-series features are derived from the hazard change sequence, hazard spatial features are derived from the distribution of the label information, and the hazard time-series features are combined with the hazard spatial features to obtain the space-time distribution features. A cotton aphid thermodynamic diagram (heat map) of the target cotton field area is obtained from the space-time distribution features, the environmental features corresponding to the high-hazard areas are acquired, principal component analysis is performed on the environmental features together with the meteorological features, and the data are projected onto the principal components. The deviation between the feature scatter plot of the high-hazard areas and that of the low-hazard areas is obtained, the influence factors of the target cotton field area are extracted from this deviation, and the potential high-hazard areas are obtained by combining the influence factors with the cotton aphid thermodynamic diagram. Similar prevention and control examples are retrieved according to the influence factors and the pest characterization, and the prevention and control direction is obtained by statistically analyzing these similar examples.
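The principal-component step can be sketched as follows, under illustrative assumptions: standardized environment-plus-weather variables per grid cell, two principal components, hazard grade 2 or above treated as high hazard, and hypothetical variable names.

```python
import numpy as np
from sklearn.decomposition import PCA

def influence_factors(env, grades, names, high_grade=2):
    """env: (n_cells, n_vars) environmental + meteorological features per grid cell;
    grades: (n_cells,) hazard grades from the monitoring model;
    names: list of the n_vars variable names."""
    std = (env - env.mean(axis=0)) / (env.std(axis=0) + 1e-9)
    pca = PCA(n_components=2)
    proj = pca.fit_transform(std)
    high, low = proj[grades >= high_grade], proj[grades < high_grade]
    offset = high.mean(axis=0) - low.mean(axis=0)   # deviation of the two scatter clouds
    loadings = pca.components_.T @ offset           # map the offset back to the variables
    order = np.argsort(-np.abs(loadings))
    return [(names[i], float(loadings[i])) for i in order]

# Hypothetical variables; the largest-magnitude entries are candidate influence factors.
names = ["temperature", "humidity", "planting_density", "soil_nitrogen"]
env = np.random.rand(200, 4)
grades = np.random.randint(0, 4, size=200)
print(influence_factors(env, grades, names))
```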
According to an implementation of the invention, the method further comprises: obtaining the damage symptoms of the target cotton field area and determining the cotton aphid characterization; deriving the generation period of the cotton aphids from this characterization; predicting the cotton aphid outbreak period from the generation period in combination with climate factors and the space-time distribution features, and extracting the timestamp of the outbreak period; extracting the neighborhood cotton fields of the target cotton field, obtaining their predicted cotton aphid hazard levels at that timestamp, and determining the cotton aphid hazard-level deviation between the target cotton field and each neighborhood cotton field at that timestamp; setting the migration weights of the different neighborhood cotton fields according to the hazard-level deviations, predicting the migration path of the cotton aphids in the target cotton field area according to the migration weights, deploying physical prevention measures in advance based on the migration path, and introducing natural enemies of the cotton aphid into the neighborhood cotton fields on the migration path.
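The migration-weight idea can be sketched as a toy calculation under assumptions: each neighborhood field's weight is taken as its normalized positive hazard-grade deviation from the target field at the outbreak timestamp, and the field names are hypothetical.

```python
def migration_weights(target_grade, neighbor_grades):
    """neighbor_grades: {field_name: predicted hazard grade at the outbreak timestamp}.
    Weight = normalized positive deviation from the target field's grade (assumption)."""
    dev = {k: max(g - target_grade, 0.0) for k, g in neighbor_grades.items()}
    total = sum(dev.values()) or 1.0
    return {k: v / total for k, v in dev.items()}

# Hypothetical fields: higher-weight neighbors lie on the more likely migration path
# and receive physical barriers / natural-enemy releases first.
print(migration_weights(1.0, {"north_field": 3.0, "east_field": 2.0, "south_field": 0.5}))
```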
Fig. 4 shows a block diagram of a cotton aphid hazard monitoring system based on fusion map features of the present invention.
The second aspect of the invention also provides a cotton aphid hazard monitoring system 4 based on fusion map features, the system comprising: the memory 41 and the processor 42, wherein the memory comprises a cotton aphid hazard monitoring method program based on fusion map features, and the cotton aphid hazard monitoring method program based on fusion map features realizes the following steps when being executed by the processor:
Acquiring hyperspectral data and image data of a target cotton field area, preprocessing the hyperspectral data and the image data, constructing a feature extraction network, and acquiring hyperspectral features and image features corresponding to cotton plants;
carrying out image significance analysis and identification and segmentation on the cotton aphid pest areas by utilizing the image features, obtaining cotton aphid pest area characterization, carrying out feature selection on hyperspectral features and image features based on the cotton aphid pest area characterization, and obtaining fused map features after fusion;
constructing a cotton aphid hazard monitoring model by using a deep learning method according to the fusion map features, outputting the cotton aphid hazard severity of a target cotton field area, and acquiring corresponding space-time distribution features;
and extracting the influence factors of the target cotton field area from the space-time distribution features, and determining the prevention and control direction of the target cotton field according to the influence factors.
According to the embodiment of the invention, the image features are utilized to analyze and identify the image saliency and divide the cotton aphid pest areas, and the method specifically comprises the following steps:
obtaining image features corresponding to image data, extracting texture feature subsets and color feature subsets from the image features, calculating information entropy of different wave bands in the image data through the texture features, and filtering wave bands which do not accord with a preset threshold;
performing super-pixel segmentation on the image data to obtain the corresponding super-pixel blocks and clustering them: randomly selecting initial clustering centers, assigning each super-pixel block to its nearest clustering center, calculating new clustering centers, and iteratively updating the cluster to which each super-pixel block belongs;
acquiring a final clustering result, acquiring initial partitioning results of a cotton plant area and a background area according to partitioning results of different clusters, generating different super-pixel areas, and acquiring color features corresponding to the cotton plant super-pixel areas;
acquiring distance measurement of super pixel blocks in a color space in the cotton plant super pixel region according to the color characteristics, and carrying out image significance analysis by combining the distance measurement in the color space with Euclidean distance measurement among different super pixel blocks to acquire a potential cotton aphid pest target region;
optimizing an asymptotic feature pyramid network with dilated convolution, acquiring and fusing feature maps of different scales of the potential target regions, and identifying and segmenting the cotton aphid pest areas in the potential target regions using the fused feature maps.
The information entropy is calculated from the texture features of the image data; the larger the information entropy, the higher the quality of the band image and the less noise it contains. Higher-quality band images are selected by information entropy, super-pixel segmentation is performed with the SLIC algorithm to obtain super-pixel blocks with clear boundaries, and a clustering algorithm adaptively separates the cotton plant area from the background area. Dilated convolution is introduced into the asymptotic feature pyramid network so that the receptive field is enlarged while the feature-map size stays unchanged, which aids cotton aphid recognition; the dilated convolutions and the pyramid structure together widen the receptive field and fuse feature information of different scales, improving the detection of small, weak cotton aphid targets.
It should be noted that the multi-scale feature maps of the cotton aphid pest area are taken as the cotton aphid pest area characterization z. In the feature subset S corresponding to the hyperspectral features and image features, a selected feature x_i is obtained and the remaining features are taken as candidate features. For the selected feature x_i, a candidate feature x_j and the characterization z, the conditional mutual information I(x_i; x_j | z) and the joint mutual information I(x_i, x_j; z) are calculated to obtain the independence between features, and the mutual information I(x_i; z) between the selected feature and the cotton aphid pest area characterization is calculated to obtain the relevance of the feature to the pest damage characterization. A preset number of features are selected by combining the relevance and the independence, and the selected features are copied and substituted into the original feature set. An SVM classifier is trained as the cotton aphid damage classifier for cotton aphid damage identification; the accuracy-drop value of the classifier during each substitution is acquired, a maximum accuracy-drop value is screened out, and the substituted features whose accuracy drop exceeds this maximum value are selected as key features. After iterative selection, the key feature subsets of the hyperspectral features and the image features are output, and the key hyperspectral features and key image features in the key feature subsets are concatenated and fused to obtain the fusion map features.
According to the embodiment of the invention, a cotton aphid hazard monitoring model is constructed by utilizing a deep learning method according to the fusion map features, and the method specifically comprises the following steps:
acquiring a characteristic index based on the fusion map characteristic, acquiring sample data from historical hyperspectral image data through the characteristic index, and constructing a training set and a verification set according to the sample data;
constructing the cotton aphid hazard monitoring model based on an auto-encoder network, training the model on the training set with the fusion map features as the model input, and sampling from the fusion map features with the cotton aphid hazard monitoring model to obtain the feature distribution;
performing feature reconstruction with the auto-encoder according to the feature distribution to obtain the reconstruction data, obtaining the feature distribution corresponding to the reconstruction data with the decoder, and feeding it into a subsequent fully connected layer to obtain the residual between the feature distribution of the reconstruction data and the feature distribution of normal cotton plants;
and comparing the residual error with a preset threshold interval, and acquiring the hazard class of the cotton aphids according to the threshold interval in which the residual error falls.
It should be noted that the mean and variance of the fusion map features are extracted to obtain the feature distribution corresponding to the fusion map features; the feature distribution learned from the fusion map features of normal cotton plants serves as the healthy-plant reference, and the reconstruction data are obtained from the feature distribution of the input data; the input data are then analyzed according to the relation between the residual and the thresholds to determine the cotton aphid hazard level. By encoding and decoding the fusion map features, the inherent correlations among the different map features are effectively modeled, and the cotton aphid hazard characteristics of the target cotton field area are effectively mined.
The third aspect of the present invention also provides a computer readable storage medium, wherein the computer readable storage medium includes a cotton aphid hazard monitoring method program based on a fusion map feature, and when the cotton aphid hazard monitoring method program based on the fusion map feature is executed by a processor, the steps of the cotton aphid hazard monitoring method based on the fusion map feature are implemented.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are only illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in practice, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be implemented through certain interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; can be located in one place or distributed to a plurality of network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing unit, or each unit may be separately used as one unit, or two or more units may be integrated in one unit; the integrated units may be implemented in hardware or in hardware plus software functional units.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware related to program instructions, and the foregoing program may be stored in a computer readable storage medium, where the program, when executed, performs steps including the above method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk or an optical disk, or the like, which can store program codes.
Alternatively, the above-described integrated units of the present invention may be stored in a computer-readable storage medium if implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, the technical solutions of the embodiments of the present invention may be embodied in essence or a part contributing to the prior art in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, ROM, RAM, magnetic or optical disk, or other medium capable of storing program code.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A cotton aphid hazard monitoring method based on fusion map features is characterized by comprising the following steps:
acquiring hyperspectral data and image data of a target cotton field area, preprocessing the hyperspectral data and the image data, constructing a feature extraction network, and acquiring hyperspectral features and image features corresponding to cotton plants;
carrying out image significance analysis and identification and segmentation on the cotton aphid pest areas by utilizing the image features, obtaining cotton aphid pest area characterization, carrying out feature selection on hyperspectral features and image features based on the cotton aphid pest area characterization, and obtaining fused map features after fusion;
constructing a cotton aphid hazard monitoring model by using a deep learning method according to the fusion map features, outputting the cotton aphid hazard severity of a target cotton field area, and acquiring corresponding space-time distribution features;
and extracting the influence factors of the target cotton field area from the space-time distribution features, and determining the prevention and control direction of the target cotton field according to the influence factors.
2. The method for monitoring the damage of cotton aphids based on fusion map features as set forth in claim 1, wherein the feature extraction network is constructed to obtain hyperspectral features and image features corresponding to cotton plants, specifically:
acquiring hyperspectral image data corresponding to a cotton crown layer in a cotton field of a current timestamp target, performing splicing, geometric correction and radiation calibration on the hyperspectral image data, extracting spectral domain data through the preprocessed hyperspectral image data, and acquiring hyperspectral reflectivity as hyperspectral data;
extracting spatial domain data in the preprocessed hyperspectral image data as image data, preprocessing the hyperspectral data and the image data, retrieving and acquiring response characteristics of spectrums and images in insect pest identification examples according to big data, and constructing a response characteristic set;
acquiring correlations of different characteristic indexes and insect pest severity from the response characteristic set, screening response characteristic subsets meeting preset standards by utilizing the correlations, and respectively training one-dimensional convolutional neural networks in two branches through the characteristic subset training to construct a characteristic extraction network;
And extracting characteristic data of different branches by utilizing the characteristic extraction network to obtain hyperspectral characteristics and image characteristics of cotton plants in the target cotton fields.
3. The cotton aphid hazard monitoring method based on the fusion map features as claimed in claim 1, wherein the image features are used for carrying out image saliency analysis and identifying and segmenting cotton aphid hazard areas, specifically:
obtaining image features corresponding to image data, extracting texture feature subsets and color feature subsets from the image features, calculating information entropy of different wave bands in the image data through the texture features, and filtering wave bands which do not accord with a preset threshold;
performing super-pixel segmentation on the image data to obtain the corresponding super-pixel blocks and clustering them: randomly selecting initial clustering centers, assigning each super-pixel block to its nearest clustering center, calculating new clustering centers, and iteratively updating the cluster to which each super-pixel block belongs;
acquiring a final clustering result, acquiring initial partitioning results of a cotton plant area and a background area according to partitioning results of different clusters, generating different super-pixel areas, and acquiring color features corresponding to the cotton plant super-pixel areas;
Acquiring distance measurement of super pixel blocks in a color space in the cotton plant super pixel region according to the color characteristics, and carrying out image significance analysis by combining the distance measurement in the color space with Euclidean distance measurement among different super pixel blocks to acquire a potential cotton aphid pest target region;
optimizing an asymptotic feature pyramid network with dilated convolution, acquiring and fusing feature maps of different scales of the potential target regions, and identifying and segmenting the cotton aphid pest areas in the potential target regions using the fused feature maps.
4. The method for monitoring the damage of cotton aphids based on the fusion map features as claimed in claim 1, wherein the feature selection is performed on hyperspectral features and image features based on the characterization of the cotton aphid regions, specifically:
acquiring the multi-scale feature maps of the cotton aphid pest area as the cotton aphid pest area characterization, and, within the hyperspectral features and image features, calculating the conditional mutual information and the joint mutual information of a selected feature, a candidate feature and the cotton aphid pest area characterization to obtain the independence between features;
calculating the mutual information between the selected feature and the cotton aphid pest area characterization to obtain the relevance of the feature to the pest damage characterization, selecting a preset number of features according to the relevance and the independence, and copying the selected features and substituting the copies into the original feature set;
constructing a cotton aphid damage classifier, acquiring the accuracy-drop value of the classifier during each substitution, screening out a maximum accuracy-drop value, and selecting the substituted features whose accuracy drop exceeds this maximum value as key features;
and, after iterative selection, outputting the key feature subsets of the hyperspectral features and the image features, and concatenating and fusing the key hyperspectral features and the key image features in the key feature subsets to obtain the fusion map features.
5. The cotton aphid hazard monitoring method based on the fusion map features as claimed in claim 1, wherein the cotton aphid hazard monitoring model is constructed by utilizing a deep learning method according to the fusion map features, and specifically comprises the following steps:
acquiring feature indexes based on the fusion map features, acquiring sample data from historical hyperspectral image data through the feature indexes, and constructing a training set and a validation set from the sample data;
constructing the cotton aphid hazard monitoring model from an autoencoder network, performing model training with the training set, taking the fusion map features as the model input, and sampling from the fusion map features with the cotton aphid hazard monitoring model to obtain a feature distribution;
performing feature reconstruction with the autoencoder according to the feature distribution to obtain reconstructed data, obtaining the feature distribution corresponding to the reconstructed data with the decoder, and feeding it into a subsequent fully connected layer to obtain the residual between the feature distribution of the reconstructed data and the feature distribution of a normal cotton plant;
and comparing the residual with preset threshold intervals, and acquiring the cotton aphid hazard grade according to the threshold interval into which the residual falls (a reconstruction-residual grading sketch follows this claim).
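A compact way to picture claim 5 is an autoencoder trained to reconstruct fused features of healthy plants, with the reconstruction residual mapped through a small head and bucketed into hazard grades by threshold intervals. The PyTorch sketch below is built on that assumption; the layer sizes, threshold values, and all names are illustrative placeholders, not the patent's architecture.

```python
import torch
import torch.nn as nn

class FusionFeatureAE(nn.Module):
    """Hypothetical autoencoder over fused spectral-image feature vectors."""
    def __init__(self, dim_in=128, dim_latent=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_in, 64), nn.ReLU(),
                                     nn.Linear(64, dim_latent))
        self.decoder = nn.Sequential(nn.Linear(dim_latent, 64), nn.ReLU(),
                                     nn.Linear(64, dim_in))
        # Follow-up fully connected layer mapping the reconstruction residual
        # to a scalar score (assumption, mirroring the claim wording).
        self.head = nn.Linear(dim_in, 1)

    def forward(self, x):
        recon = self.decoder(self.encoder(x))
        residual = x - recon
        score = self.head(residual.abs())
        return recon, score

def hazard_grade(score, thresholds=(0.2, 0.5, 0.8)):
    """Map a residual score to a grade by the interval it falls into (illustrative)."""
    for grade, t in enumerate(thresholds):
        if score < t:
            return grade
    return len(thresholds)

# Minimal usage: train on healthy-plant features, grade new samples by residual.
model = FusionFeatureAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
healthy = torch.randn(256, 128)           # placeholder for normal-plant features
for _ in range(100):
    recon, _ = model(healthy)
    loss = nn.functional.mse_loss(recon, healthy)
    opt.zero_grad()
    loss.backward()
    opt.step()

sample = torch.randn(1, 128)              # placeholder for a fused feature vector
_, s = model(sample)
print(hazard_grade(torch.sigmoid(s).item()))
```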
6. The cotton aphid hazard monitoring method based on fusion map features of claim 1, wherein the influence factors of the target cotton field area are extracted from the spatio-temporal distribution, and the prevention and control direction of the target cotton field is determined according to the influence factors, specifically:
dividing the cotton plant areas in the image data into grids, acquiring the cotton aphid hazard grade of each grid with the cotton aphid hazard monitoring model, and setting label information for the grids according to the cotton aphid hazard grades;
acquiring a hazard change sequence of the cotton aphid pest from the historical monitoring data of the target cotton field area, deriving hazard temporal features from the hazard change sequence, deriving hazard spatial features from the distribution of the label information, and combining the hazard temporal features with the hazard spatial features to obtain the spatio-temporal distribution features;
acquiring a cotton aphid heat map of the target cotton field area according to the spatio-temporal distribution features, acquiring the environmental features corresponding to the high-hazard areas, performing principal component analysis on the environmental features together with the meteorological features, and performing principal component projection according to the principal component features;
acquiring the deviation features between the feature scatter plot of the high-hazard areas and that of the low-hazard areas, extracting the influence factors of the target cotton field area according to the deviation features, and acquiring potential high-hazard areas by combining the influence factors with the cotton aphid heat map;
and retrieving similar prevention and control cases according to the influence factors and the pest characterization, and obtaining the prevention and control direction through statistical analysis of the similar prevention and control cases (a spatio-temporal factor-analysis sketch follows this claim).
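The grid labelling and principal-component comparison of claim 6 can be sketched as follows: per-grid hazard grades summarized over time act as a simple heat map, environmental plus meteorological variables of high- and low-hazard grids are projected with PCA, and the shift between the two scatter clouds is back-projected to rank candidate influence factors. The deviation measure and every variable name are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA

def influence_factors(grades, env, feature_names, high_thresh=2):
    """Hypothetical sketch.
    grades: (n_grids, n_dates) hazard grade of each grid over time
    env:    (n_grids, n_vars) environmental + meteorological variables per grid
    """
    # Spatio-temporal summary: the mean grade per grid acts as a simple heat map.
    heat = grades.mean(axis=1)
    high = heat >= high_thresh
    low = ~high

    # Principal-component projection of the standardized environment variables.
    z = (env - env.mean(axis=0)) / (env.std(axis=0) + 1e-9)
    pca = PCA(n_components=2)
    proj = pca.fit_transform(z)

    # Deviation between the high- and low-hazard scatter clouds in PC space.
    shift = proj[high].mean(axis=0) - proj[low].mean(axis=0)

    # Back-project the shift onto the original variables and rank them.
    contribution = np.abs(pca.components_.T @ shift)
    order = np.argsort(contribution)[::-1]
    return [(feature_names[i], float(contribution[i])) for i in order[:5]], heat
```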
7. A cotton aphid hazard monitoring system based on fusion map features, characterized in that the system comprises a memory and a processor, wherein the memory stores a cotton aphid hazard monitoring method program based on fusion map features, and the program, when executed by the processor, implements the following steps:
acquiring hyperspectral data and image data of a target cotton field area, preprocessing the hyperspectral data and the image data, constructing a feature extraction network, and acquiring hyperspectral features and image features corresponding to cotton plants;
performing image saliency analysis and identification and segmentation of the cotton aphid pest areas using the image features to obtain the cotton aphid pest area characterization, performing feature selection on the hyperspectral features and the image features based on the cotton aphid pest area characterization, and obtaining the fusion map features after fusion;
constructing a cotton aphid hazard monitoring model by a deep learning method according to the fusion map features, outputting the cotton aphid hazard severity of the target cotton field area, and acquiring the corresponding spatio-temporal distribution features;
and extracting the influence factors of the target cotton field area from the spatio-temporal distribution, and determining the prevention and control direction of the target cotton field according to the influence factors (a pipeline skeleton follows this claim).
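Read as software, claim 7 describes a stored program that a processor steps through in four stages. The skeleton below shows one plausible way to orchestrate those stages; every class, method, and parameter name is a hypothetical placeholder, and the stage implementations are assumed to be routines like the ones sketched after the earlier claims.

```python
from dataclasses import dataclass

@dataclass
class MonitoringReport:
    severity: str                 # cotton aphid hazard severity of the field
    spatiotemporal: object        # spatio-temporal distribution features
    factors: list                 # extracted influence factors
    direction: str                # suggested prevention and control direction

class CottonAphidMonitor:
    """Hypothetical orchestration of the four steps recited in claim 7."""
    def __init__(self, extractor, segmenter, selector, model, analyzer):
        self.extractor = extractor    # step 1: preprocessing + feature extraction
        self.segmenter = segmenter    # step 2: saliency analysis + segmentation
        self.selector = selector      # step 2: feature selection + fusion
        self.model = model            # step 3: deep-learning hazard model
        self.analyzer = analyzer      # step 4: spatio-temporal / factor analysis

    def run(self, hyperspectral, images) -> MonitoringReport:
        spec_feats, img_feats = self.extractor(hyperspectral, images)
        regions = self.segmenter(img_feats)
        fused = self.selector(spec_feats, img_feats, regions)
        severity, st_feats = self.model(fused)
        factors, direction = self.analyzer(st_feats)
        return MonitoringReport(severity, st_feats, factors, direction)
```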
8. The cotton aphid hazard monitoring system based on fusion map features according to claim 7, wherein the image features are used for image saliency analysis and for identification and segmentation of the cotton aphid pest areas, specifically:
obtaining image features corresponding to the image data, extracting a texture feature subset and a color feature subset from the image features, calculating the information entropy of different bands in the image data from the texture features, and filtering out the bands that do not meet a preset threshold;
performing super-pixel segmentation on the image data to obtain corresponding super-pixel blocks, and clustering the super-pixel blocks: randomly selecting initial cluster centers, assigning each super-pixel block to the nearest cluster center, computing new cluster centers, and iteratively updating the cluster to which each super-pixel block belongs;
acquiring the final clustering result, obtaining an initial partition of the cotton plant area and the background area from the partitions of the different clusters, generating the corresponding super-pixel regions, and acquiring the color features of the cotton plant super-pixel regions;
acquiring, from the color features, the color-space distance between super-pixel blocks within the cotton plant super-pixel regions, and performing image saliency analysis by combining the color-space distance with the spatial Euclidean distance between different super-pixel blocks, so as to obtain a potential cotton aphid pest target region;
optimizing the asymptotic feature pyramid network with dilated (atrous) convolution, acquiring feature maps of the potential target region at different scales and fusing them, and identifying and segmenting the cotton aphid pest area of the potential target region by using the fused feature maps (a dilated-convolution fusion sketch follows this claim).
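The pyramid-fusion step of claims 3 and 8 relies on dilated (atrous) convolution to widen the receptive field before multi-scale feature maps of the potential target region are merged. The module below is a minimal PyTorch illustration of that idea only; it is not the asymptotic feature pyramid network itself, and the branch count, dilation rates, and channel sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedFusion(nn.Module):
    """Hypothetical sketch: dilated-convolution branches fused across scales."""
    def __init__(self, in_ch=256, out_ch=256):
        super().__init__()
        # Parallel 3x3 convolutions with growing dilation (wider receptive field).
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, 3, padding=d, dilation=d) for d in (1, 2, 4))
        self.merge = nn.Conv2d(out_ch * 3, out_ch, 1)

    def forward(self, feats):
        # feats: list of feature maps at different scales, finest first.
        target = feats[0].shape[-2:]
        resized = [F.interpolate(f, size=target, mode="bilinear",
                                 align_corners=False) for f in feats]
        fused = torch.stack(resized).sum(dim=0)          # simple cross-scale sum
        multi = torch.cat([b(fused) for b in self.branches], dim=1)
        return self.merge(multi)                         # map used for segmentation

# Usage on three assumed pyramid levels of a potential target region:
levels = [torch.randn(1, 256, s, s) for s in (64, 32, 16)]
out = DilatedFusion()(levels)      # (1, 256, 64, 64) fused feature map
```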
9. The cotton aphid hazard monitoring system based on fusion map features of claim 7, wherein feature selection is performed on the hyperspectral features and the image features based on the cotton aphid pest area characterization, specifically:
acquiring the feature maps of the cotton aphid pest area at different scales as the cotton aphid pest area characterization, and calculating, within the hyperspectral features and the image features, the conditional mutual information and the joint mutual information among the selected features, the candidate features and the cotton aphid pest area characterization to obtain the independence of the features;
calculating the mutual information between the selected features and the cotton aphid pest area characterization to obtain the correlation of the features with the pest damage, selecting a preset number of features according to the correlation and the independence, and copying the selected features and substituting them, one preset feature at a time, into the original feature set;
constructing a cotton aphid damage classifier, acquiring the accuracy-drop value of the classifier during each replacement, screening the maximum accuracy-drop value, and selecting, as key features, the replaced features whose accuracy-drop value is larger than the screened maximum accuracy-drop value;
and after iterative selection, outputting the key feature subsets of the hyperspectral features and the image features, and concatenating and fusing the key hyperspectral features and the key image features in the key feature subsets to obtain the fusion map features.
10. The cotton aphid hazard monitoring system based on the fusion map features as claimed in claim 7, wherein the cotton aphid hazard monitoring model is constructed by utilizing a deep learning method according to the fusion map features, and specifically comprises:
acquiring feature indexes based on the fusion map features, acquiring sample data from historical hyperspectral image data through the feature indexes, and constructing a training set and a validation set from the sample data;
constructing the cotton aphid hazard monitoring model from an autoencoder network, performing model training with the training set, taking the fusion map features as the model input, and sampling from the fusion map features with the cotton aphid hazard monitoring model to obtain a feature distribution;
performing feature reconstruction with the autoencoder according to the feature distribution to obtain reconstructed data, obtaining the feature distribution corresponding to the reconstructed data with the decoder, and feeding it into a subsequent fully connected layer to obtain the residual between the feature distribution of the reconstructed data and the feature distribution of a normal cotton plant;
and comparing the residual with preset threshold intervals, and acquiring the cotton aphid hazard grade according to the threshold interval into which the residual falls.
CN202311815522.2A 2023-12-27 2023-12-27 Aphis gossypii harm monitoring method and system based on fusion map features Pending CN117541423A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202311815522.2A | 2023-12-27 | 2023-12-27 | Aphis gossypii harm monitoring method and system based on fusion map features

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202311815522.2A | 2023-12-27 | 2023-12-27 | Aphis gossypii harm monitoring method and system based on fusion map features

Publications (1)

Publication Number | Publication Date
CN117541423A (en) | 2024-02-09

Family

ID=89794136

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202311815522.2A | Aphis gossypii harm monitoring method and system based on fusion map features | 2023-12-27 | 2023-12-27

Country Status (1)

Country | Link
CN | CN117541423A (en)

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination