CN116626038A - Unmanned aerial vehicle remote sensing oak leaf feeding insect pest monitoring method


Info

Publication number
CN116626038A
CN116626038A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
oak
area
forest
Prior art date
Legal status
Pending
Application number
CN202210729453.2A
Other languages
Chinese (zh)
Inventor
孙金华
杨柳
杨喜田
王婷
郭二辉
赵辉
林向彬
Current Assignee
Henan Agricultural University
Original Assignee
Henan Agricultural University
Priority date
Filing date
Publication date
Application filed by Henan Agricultural University
Priority to CN202210729453.2A
Publication of CN116626038A
Legal status: Pending


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N 3/006 Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/188 Vegetation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 2021/8466 Investigation of vegetal material, e.g. leaves, plants, fruits
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Remote Sensing (AREA)
  • Analytical Chemistry (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • Biochemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Signal Processing (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Catching Or Destruction (AREA)

Abstract

The invention discloses a method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle (UAV) remote sensing, which comprises the following steps: S1, acquiring visible-light and multispectral UAV images of a target area, classifying land use in the visible-light images with a deep-learning convolutional neural network, dividing the whole area into woodland, cultivated land, construction land, water area and unused land, and obtaining the boundary of the woodland distribution. The invention relates to the technical field of forest pest remote sensing monitoring. In the method, the UAV's wide aerial viewing angle and large observation range greatly reduce the ground survey workload and enable monitoring at the forest-farm or county scale; the processed and analysed UAV imagery shows the overall pest severity of the oaks intuitively, so the damage situation can be grasped macroscopically over large areas; and because the defoliation rate is obtained from a three-dimensional perspective, errors of manual measurement are reduced and the result is more accurate and objective.

Description

Unmanned aerial vehicle remote sensing oak leaf feeding insect pest monitoring method
Technical Field
The invention relates to the technical field of forest pest remote sensing monitoring, and in particular to a method for monitoring leaf-feeding insect pests of oak by unmanned aerial vehicle remote sensing.
Background
Leaf-feeding insect pests are one of the major threats to forest resources in China. They occur in large numbers and cause severe damage: infested trees lose part or all of their foliage, and heavy defoliation can kill the trees, threatening the ecological safety of the forest.
At present, oak leaf-feeding pests are monitored mainly by traditional ground surveys: residents or forest rangers walk the forest area, watch for changes in the oak crowns, and report anomalies to the local forest protection station so that control measures can be taken. Because oak forests are distributed across rugged mountainous terrain, going on site to confirm whether a pest outbreak has occurred and judging the damage level by eye is time-consuming and laborious, highly subjective, and slow to respond. The main shortcomings of the traditional survey and monitoring method for oak leaf-feeding pests are as follows:
(1) Because of the limited viewing angle, manual survey can hardly estimate the defoliation caused by oak pests accurately and quantitatively, and the boundary of the infested area is difficult to delineate precisely;
(2) When the leaf damage rate or the defoliation rate of infested trees is calculated by the traditional method, standard sample trees or branches must be collected, which introduces human error, is inefficient, and makes large-area monitoring of oak leaf-feeding pests difficult to carry out;
(3) Oak pest outbreaks follow no regular cycle, so traditional manual surveys can hardly provide year-on-year monitoring; monitoring efficiency is low and accuracy is poor.
Therefore, a new technique for monitoring oak leaf-feeding pests is urgently needed to improve the timeliness and effectiveness of monitoring.
Disclosure of Invention
(I) Technical problems to be solved
Aiming at the shortcomings of the prior art, the invention provides a method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing, which solves the problems of the traditional ground-survey approach, in which personnel must walk the forest, confirm pest occurrence on site and judge the damage level by perception: it is time-consuming and laborious, highly subjective, and slow to respond.
(II) Technical scheme
In order to achieve the above purpose, the invention adopts the following technical scheme: a method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing, comprising the following steps:
s1, acquiring visible light and multispectral images of an unmanned aerial vehicle in a target area, classifying land utilization of the visible light images by a convolutional neural network method in deep learning, classifying the whole area according to woodland, cultivated land, construction land, water area and unused land, and acquiring a woodland distribution boundary range;
s2, combining unmanned aerial vehicle images, a ground spectrum library and forest resource class II investigation data, identifying oak species such as cork oak, quercus acutissima, mongolian oak and goose ear oak, and calculating insect population density and defoliation rate on a single wood scale according to investigation methods specified in forest pest occurrence and disaster forming standards;
s3, carrying out band calculation on the obtained multispectral image, constructing a vegetation index NDVI, an enhanced vegetation EVI, an RVI ratio vegetation index, improving a red edge normalized vegetation index mNSVI, a leaf area index LAI and the like, constructing a digital elevation model DEM, constructing spectrum information and a terrain information index in Matlab by using the vegetation index and the digital elevation model, constructing a model according to actually measured defoliation values by using a particle swarm optimization least square support vector machine method, judging defoliation rate by using field investigation and the vegetation index, determining an oak insect damage area, and judging the disaster-affected level of oaks;
and S4, periodically acquiring UAV point-cloud data of the pest area, segmenting the three-dimensional models of the infested trees in different periods, and calculating the single-tree defoliation rate from the differences between the point clouds of the different periods.
Preferably, the average number of insects per plant A in step S2 is calculated as:
A = TN / T
wherein:
A - average number of insects per plant, in head/plant;
TN - total number of insects, in heads;
T - total number of plants.
The single-tree defoliation rate is calculated as:
LLR = (n_i / t_i) × 100%
wherein:
LLR - defoliation rate, in %;
n_i - number of leaves lost from the crown of the individual tree;
t_i - total number of leaves on the crown of the individual tree.
Preferably, in step S3, NDVI, mNDVI750, LAI, DEM, RVI and EVI are combined and a model is built from the measured defoliation data with the particle swarm optimization least-squares support vector machine method, with learning factors c1 = 1.7 and c2 = 5; the effect is best when the number of iterations of the kernel function is 100. The damage level of the oak leaf-feeding pests is then judged from the constructed model according to the following criteria: when the defoliation rate is below 20%, the stand is judged lightly damaged; when the defoliation rate is between 20% and 60%, moderately damaged; and when the defoliation rate is above 60%, severely damaged.
Preferably, in step S4, three-dimensional point clouds of the pest area are acquired periodically by UAV oblique photogrammetry. In July, before the outbreak, when leaf growth has peaked, the pre-disaster (standard) tree point cloud is acquired by UAV oblique photogrammetry where the canopy closure is below 0.3 and with a handheld three-dimensional laser scanner where the canopy closure is above 0.3. Tree point clouds are then acquired in the light, moderate and severe damage periods; point-cloud segmentation and clustering are performed with the PointCNN convolutional neural network; the leaf inclination angle, lower-branch index and tree fractal dimension parameters are constructed with the TIN (triangulated irregular network) method; the point clouds of the light, moderate and severe periods are each differenced against the pre-disaster standard point cloud; the differenced points are separated out; and the three-dimensional leaf loss of the trees is calculated from the leaf inclination angle, lower-branch index and fractal dimension.
Preferably, in step S4, the UAV point-cloud data are processed with Pix4D software, and the point-cloud data from the handheld laser scanner are imported into LiDAR360 software for preprocessing: point-cloud denoising, ground-point separation, DEM generation, normalization, seed-point generation, seed-point import and editing, point-cloud segmentation based on the edited seed points, and the like. The point clouds are then input into PointCNN, expressed as F_1 = {(p_{1,i}, f_{1,i}) : i = 1, 2, ..., N_1}, i.e. a set of points p_{1,i} ∈ R^3 together with the feature vector f_{1,i} ∈ R^{C_1} attached to each point, where C_1 is the initial feature channel depth. The core operation of PointCNN is the X-convolution, which can be abbreviated as F_p = X-Conv(K, p, P, F) = Conv(K, MLP(P - p) × [MLP_δ(P - p), F]). The segmentation task requires a high-resolution point-wise output, which is achieved by building PointCNN in a Conv/DeConv (encoder-decoder) architecture.
(III) Beneficial effects
The invention provides a method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing, with the following beneficial effects:
(1) The UAV's aerial viewing angle is wide, its observation range is large and image acquisition is easy, so the method greatly reduces the ground survey workload and enables monitoring at the forest-farm or county scale.
(2) The processed and analysed UAV imagery shows the overall pest severity of the oaks intuitively, so the damage situation can be grasped macroscopically over large areas.
(3) The defoliation rate is estimated from a three-dimensional model of the infested trees, which greatly reduces the errors of manual measurement; obtained from a three-dimensional perspective, the defoliation rate is more accurate and objective.
(4) The damage level of the pest areas is graded and sensitive spectral bands are extracted for the heavily damaged areas, so that pest occurrence in these areas can be monitored continuously and control efficiency is improved.
Detailed Description
The technical solutions in the embodiments of the present invention will now be described clearly and completely. It is apparent that the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art on the basis of the embodiments of the invention without inventive effort fall within the scope of protection of the invention.
An embodiment of the invention provides the following technical scheme: a method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing, comprising the following steps:
s1, acquiring visible light and multispectral images of an unmanned aerial vehicle in a target area, classifying the visible light images by using a convolutional neural network method in deep learning, classifying the whole area according to woodland, cultivated land, construction land, water area and unused land, acquiring a woodland distribution boundary range, processing the acquired unmanned aerial vehicle images by using a motion recovery structure (sfm algorithm) with visible light (380 nm-760 nm), red light (650 nm plus or minus 16 nm), green light (560 nm plus or minus 16 nm), blue light (450 nm plus or minus 16 nm), red edges (730 nm plus or minus 16 nm) and an unmanned aerial vehicle with a near infrared (840 nm plus or minus 26 nm) wave band lens, performing aerial imaging on the area with suspected leaf pests in the area, setting heading overlapping rate at least 80% and side overlapping rate at least 75%, and acquiring remote sensing image data of the target area by using intelligent map software or Pix4D software to acquire digital front images, digital surface images, DSM, red image points of the target area, blue image of the target area, and the target area;
s2, combining unmanned aerial vehicle images, ground spectrum library and forest resource class II investigation data, identifying oak tree species such as cork oak, quercus acutissima, mongolian oak and goose ear oak, calculating insect port density and defoliation rate on a single wood scale according to investigation methods specified in forest pest occurrence and disaster forming standards, classifying land utilization of the obtained unmanned aerial vehicle orthographic images, identifying forest land areas, carrying out object-oriented object segmentation on the orthographic images by adopting eCognition software, carrying out optimal parameter segmentation by setting proper scale parameters, shape parameters and compactness parameters according to characteristics of a research area, and carrying out extraction on forest land information by utilizing NDVI, spectrum, texture, forest size and spatial relation and combining a support vector machine and a random method;
s3, carrying out band calculation on the acquired multispectral image, constructing a vegetation index NDVI, an enhanced vegetation EVI, an RVI ratio vegetation index, an improved red edge normalized vegetation index mNPVI 750, a leaf area index LAI and the like, and a digital elevation model DEM, wherein the calculation method of each index is as follows:
NDVI=(NIR-R)/(NIR+R),EVI=2.5(NIR-R)/(NIR+6R-7.5B+1),RVI=NIR/R,mNDVI 750 =(NIR 750 -R)/(NIR 750 +R),
Spectral and topographic indices are constructed in Matlab from the vegetation indices and the digital elevation model; a model is built from the measured defoliation values with the particle swarm optimization least-squares support vector machine method; the defoliation rate is determined from the field survey and the vegetation indices; the oak pest area is delineated; and the damage level of the oaks is graded;
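The index formulas in step S3 transcribe directly into code; a numpy sketch, assuming the band arrays hold surface reflectance in [0, 1] and following the patent's band notation:

import numpy as np

def vegetation_indices(nir, r, b, nir750):
    """Per-pixel vegetation indices from reflectance arrays, as defined above."""
    ndvi = (nir - r) / (nir + r)
    evi = 2.5 * (nir - r) / (nir + 6 * r - 7.5 * b + 1)
    rvi = nir / r
    mndvi750 = (nir750 - r) / (nir750 + r)
    return ndvi, evi, rvi, mndvi750

# Example with single-pixel reflectance values:
nir, r, b, nir750 = (np.array([0.45]), np.array([0.08]),
                     np.array([0.04]), np.array([0.40]))
print(vegetation_indices(nir, r, b, nir750))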
and S4, periodically acquiring UAV point-cloud data of the pest area, segmenting the three-dimensional models of the infested trees in different periods, and calculating the single-tree defoliation rate from the differences between the point clouds of the different periods.
In the embodiment of the invention, the average number of insects per plant A in step S2 is calculated as:
A = TN / T
wherein:
A - average number of insects per plant, in head/plant;
TN - total number of insects, in heads;
T - total number of plants.
The single-tree defoliation rate is calculated as:
LLR = (n_i / t_i) × 100%
wherein:
LLR - defoliation rate, in %;
n_i - number of leaves lost from the crown of the individual tree;
t_i - total number of leaves on the crown of the individual tree.
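A worked example of the two survey formulas above (the counts are invented for illustration):

# Average insects per plant: A = TN / T
total_insects = 1500        # TN, heads counted in the sample plot
total_plants = 60           # T, trees in the sample plot
A = total_insects / total_plants
print(f"A = {A:.1f} head/plant")          # 25.0 head/plant

# Single-tree defoliation rate: LLR = (n_i / t_i) x 100%
lost_leaves = 320           # n_i, leaves lost from one crown
total_leaves = 800          # t_i, total leaves on that crown
LLR = lost_leaves / total_leaves * 100
print(f"LLR = {LLR:.0f}%")                # 40%, i.e. moderate damage per step S3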
In the embodiment of the invention, in step S3, NDVI, mNDVI750, LAI, DEM, RVI and EVI are combined and a model is built from the measured defoliation data with the particle swarm optimization least-squares support vector machine method, with learning factors c1 = 1.7 and c2 = 5; the effect is best when the number of iterations of the kernel function is 100. The damage level of the oak leaf-feeding pests is then judged from the constructed model according to the following criteria: when the defoliation rate is below 20%, the stand is judged lightly damaged; when the defoliation rate is between 20% and 60%, moderately damaged; and when the defoliation rate is above 60%, severely damaged.
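A minimal sketch of the particle-swarm-optimised least-squares SVM regression described above. Only the learning factors c1 = 1.7, c2 = 5 and the 100 iterations come from the patent; the inertia weight, swarm size, search ranges and the toy data are assumptions:

import numpy as np

rng = np.random.default_rng(0)

def lssvm_fit_predict(X, y, gamma, sigma):
    """Fit an RBF-kernel LS-SVM (solve the standard linear system) and predict on X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma**2))
    n = len(y)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return K @ alpha + b

# Toy stand-in for (vegetation indices + DEM) -> measured defoliation rate.
X = rng.random((40, 6)); y = rng.random(40)

def rmse(params):
    gamma, sigma = params
    return np.sqrt(np.mean((lssvm_fit_predict(X, y, gamma, sigma) - y) ** 2))

# Plain PSO with the patent's learning factors.
c1, c2, w, n_particles, n_iter = 1.7, 5.0, 0.6, 20, 100
lo, hi = np.array([0.1, 0.1]), np.array([100.0, 5.0])   # (gamma, sigma) ranges
pos = rng.uniform(lo, hi, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([rmse(p) for p in pos])
gbest = pbest[pbest_f.argmin()]
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([rmse(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()]
print("best (gamma, sigma):", gbest, "fit RMSE:", pbest_f.min())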
In the embodiment of the invention, in step S4, three-dimensional point clouds of the pest area are acquired periodically by UAV oblique photogrammetry. In July, before the outbreak, when leaf growth has peaked, the pre-disaster (standard) tree point cloud is acquired by UAV oblique photogrammetry where the canopy closure is below 0.3; to keep the flight parameters consistent across the different pest areas, the UAV parameters are: flight altitude 80 m, forward overlap 80%, side overlap 70%, flight speed 9.2 m/s, using an industrial surveying and mapping UAV. Where the canopy closure is above 0.3, the standard tree point cloud is acquired with a handheld three-dimensional laser scanner: an S-shaped route is set through the oak area and tree point clouds are collected every 3 m along the route. Tree point clouds are then acquired in the light, moderate and severe damage periods, segmented and clustered with the PointCNN convolutional neural network, and the segmented and clustered point clouds are imported into ArcGIS 10.2, where the leaf inclination angle, lower-branch index and tree fractal dimension parameters are constructed with the TIN (triangulated irregular network) method. The point clouds of the light, moderate and severe periods are each differenced against the pre-disaster standard point cloud, the differenced points are separated out, and the three-dimensional leaf loss of the trees is calculated from the leaf inclination angle, lower-branch index and fractal dimension.
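The period-to-period differencing can be sketched with a nearest-neighbour test: points of the pre-disaster standard cloud with no counterpart in the current-period cloud are treated as lost foliage. SciPy is assumed; the coordinates are in arbitrary units, and the 0.01 match radius and toy clouds are invented:

import numpy as np
from scipy.spatial import cKDTree

def lost_points(cloud_before, cloud_after, radius=0.01):
    """Return points of the earlier cloud with no neighbour within radius in the later cloud."""
    dist, _ = cKDTree(cloud_after).query(cloud_before, k=1)
    return cloud_before[dist > radius]

rng = np.random.default_rng(1)
before = rng.random((5000, 3))               # pre-disaster (standard) crown cloud
after = before[rng.random(5000) > 0.3]       # current period: ~30% of points gone
lost = lost_points(before, after)
print(f"estimated point loss: {len(lost) / len(before):.0%}")   # roughly 30%

In the patent's workflow the separated points would then be converted to a three-dimensional leaf loss using the leaf inclination angle, lower-branch index and fractal dimension parameters.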
In the embodiment of the invention, in step S4, the UAV point-cloud data are processed with Pix4D software, and the point-cloud data from the handheld laser scanner are imported into LiDAR360 software for preprocessing: point-cloud denoising, ground-point separation, DEM generation, normalization, seed-point generation, seed-point import and editing, point-cloud segmentation based on the edited seed points, and the like. The point clouds are then input into PointCNN, expressed as F_1 = {(p_{1,i}, f_{1,i}) : i = 1, 2, ..., N_1}, i.e. a set of points p_{1,i} ∈ R^3 together with the feature vector f_{1,i} ∈ R^{C_1} attached to each point, where C_1 is the initial feature channel depth. The core operation of PointCNN is the X-convolution, which can be abbreviated as F_p = X-Conv(K, p, P, F) = Conv(K, MLP(P - p) × [MLP_δ(P - p), F]). The segmentation task requires a high-resolution point-wise output, which is achieved by building PointCNN in a Conv/DeConv (encoder-decoder) architecture.
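To make the abbreviated X-convolution formula concrete, here is a minimal single-representative-point X-Conv in PyTorch; the layer widths and the use of plain Linear layers are illustrative assumptions, not the published PointCNN configuration:

import torch
import torch.nn as nn

class XConv(nn.Module):
    def __init__(self, k: int, c_in: int, c_delta: int, c_out: int):
        super().__init__()
        self.k = k
        # MLP_delta lifts the localised coordinates to per-point features.
        self.mlp_delta = nn.Sequential(nn.Linear(3, c_delta), nn.ReLU(),
                                       nn.Linear(c_delta, c_delta), nn.ReLU())
        # MLP producing the K x K transformation matrix X from the K x 3 coords.
        self.mlp_x = nn.Sequential(nn.Linear(3 * k, k * k), nn.ReLU(),
                                   nn.Linear(k * k, k * k))
        # The final "Conv" aggregating the K transformed points.
        self.conv = nn.Linear(k * (c_delta + c_in), c_out)

    def forward(self, p, P, F):
        # p: (B, 3) representative point; P: (B, K, 3) neighbours; F: (B, K, c_in)
        local = P - p.unsqueeze(1)                        # P - p
        f_delta = self.mlp_delta(local)                   # MLP_delta(P - p)
        X = self.mlp_x(local.flatten(1)).view(-1, self.k, self.k)
        f_x = torch.bmm(X, torch.cat([f_delta, F], dim=-1))
        return self.conv(f_x.flatten(1))                  # (B, c_out)

xconv = XConv(k=8, c_in=4, c_delta=16, c_out=32)
out = xconv(torch.randn(2, 3), torch.randn(2, 8, 3), torch.randn(2, 8, 4))
print(out.shape)  # torch.Size([2, 32])

For the segmentation network, such X-Conv layers would be stacked in the Conv/DeConv (encoder-decoder) arrangement mentioned above so that every point receives a label.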
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (5)

1. A method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing, characterized by comprising the following steps:
s1, acquiring visible light and multispectral images of an unmanned aerial vehicle in a target area, classifying land utilization of the visible light images by a convolutional neural network method in deep learning, classifying the whole area according to woodland, cultivated land, construction land, water area and unused land, and acquiring a woodland distribution boundary range;
s2, combining unmanned aerial vehicle images, a ground spectrum library and forest resource class II investigation data, identifying oak species such as cork oak, quercus acutissima, mongolian oak and goose ear oak, and calculating insect population density and defoliation rate on a single wood scale according to investigation methods specified in forest pest occurrence and disaster forming standards;
s3, carrying out band calculation on the obtained multispectral image, constructing a vegetation index NDVI, an enhanced vegetation EVI, an RVI ratio vegetation index, improving a red edge normalized vegetation index mNSVI, a leaf area index LAI and the like, constructing a digital elevation model DEM, constructing spectrum information and a terrain information index in Matlab by using the vegetation index and the digital elevation model, constructing a model according to actually measured defoliation values by using a particle swarm optimization least square support vector machine method, judging defoliation rate by using field investigation and the vegetation index, determining an oak insect damage area, and judging the disaster-affected level of oaks;
and S4, periodically acquiring unmanned aerial vehicle point cloud data of the insect pest area, dividing the insect pest wood three-dimensional model in different periods, and calculating the single wood defoliation rate by utilizing the point cloud difference in different periods.
2. The method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing according to claim 1, characterized in that: in step S2, the average number of insects per plant A is calculated as:
A = TN / T
wherein:
A - average number of insects per plant, in head/plant;
TN - total number of insects, in heads;
T - total number of plants;
and the single-tree defoliation rate is calculated as:
LLR = (n_i / t_i) × 100%
wherein:
LLR - defoliation rate, in %;
n_i - number of leaves lost from the crown of the individual tree;
t_i - total number of leaves on the crown of the individual tree.
3. The method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing according to claim 1, characterized in that: in step S3, NDVI, mNDVI750, LAI, DEM, RVI and EVI are combined and a model is built from the measured defoliation data with the particle swarm optimization least-squares support vector machine method, with learning factors c1 = 1.7 and c2 = 5; the effect is best when the number of iterations of the kernel function is 100; the damage level of the oak leaf-feeding pests is then judged from the constructed model according to the following criteria: when the defoliation rate is below 20%, the stand is judged lightly damaged; when the defoliation rate is between 20% and 60%, moderately damaged; and when the defoliation rate is above 60%, severely damaged.
4. The method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing according to claim 1, characterized in that: in step S4, three-dimensional point clouds of the pest area are acquired periodically by UAV oblique photogrammetry; in July, before the outbreak, when leaf growth has peaked, the pre-disaster (standard) tree point cloud is acquired by UAV oblique photogrammetry where the canopy closure is below 0.3 and with a handheld three-dimensional laser scanner where the canopy closure is above 0.3; tree point clouds are acquired in the light, moderate and severe damage periods and segmented and clustered with the PointCNN convolutional neural network; the leaf inclination angle, lower-branch index and tree fractal dimension parameters are constructed with the TIN (triangulated irregular network) method; the point clouds of the light, moderate and severe periods are each differenced against the pre-disaster standard point cloud; the differenced points are separated out; and the three-dimensional leaf loss of the trees is calculated from the leaf inclination angle, lower-branch index and fractal dimension.
5. The method for monitoring oak leaf-feeding insect pests by unmanned aerial vehicle remote sensing according to claim 1, characterized in that: in step S4, the UAV point-cloud data are processed with Pix4D software; the point-cloud data from the handheld laser scanner are imported into LiDAR360 software for point-cloud denoising, ground-point separation, DEM generation, normalization, seed-point generation, seed-point import and editing, point-cloud segmentation based on the edited seed points, and the like; the point clouds are input into PointCNN, expressed as F_1 = {(p_{1,i}, f_{1,i}) : i = 1, 2, ..., N_1}, i.e. a set of points p_{1,i} ∈ R^3 together with the feature vector f_{1,i} ∈ R^{C_1} attached to each point, where C_1 is the initial feature channel depth; the core operation of PointCNN is the X-convolution, abbreviated as F_p = X-Conv(K, p, P, F) = Conv(K, MLP(P - p) × [MLP_δ(P - p), F]); the segmentation task requires a high-resolution point-wise output, which is achieved by building PointCNN in a Conv/DeConv (encoder-decoder) architecture.
CN202210729453.2A 2022-06-24 2022-06-24 Unmanned aerial vehicle remote sensing oak leaf feeding insect pest monitoring method Pending CN116626038A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210729453.2A CN116626038A (en) 2022-06-24 2022-06-24 Unmanned aerial vehicle remote sensing oak leaf feeding insect pest monitoring method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210729453.2A CN116626038A (en) 2022-06-24 2022-06-24 Unmanned aerial vehicle remote sensing oak leaf feeding insect pest monitoring method

Publications (1)

Publication Number Publication Date
CN116626038A true CN116626038A (en) 2023-08-22

Family

ID=87637057

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210729453.2A Pending CN116626038A (en) 2022-06-24 2022-06-24 Unmanned aerial vehicle remote sensing oak leaf feeding insect pest monitoring method

Country Status (1)

Country Link
CN (1) CN116626038A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination