CN115358440A - Tree coverage prediction method based on angular second moment and deep learning - Google Patents

Tree coverage prediction method based on angular second moment and deep learning

Info

Publication number
CN115358440A
Authority
CN
China
Prior art keywords
tree species
pest
pests
diseases
tree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210788208.9A
Other languages
Chinese (zh)
Other versions
CN115358440B (en)
Inventor
杨光
吴蔚
丰洁
赵智尧
汪琳
欧阳杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Normal University
South China Normal University Qingyuan Institute of Science and Technology Innovation Co Ltd
Original Assignee
South China Normal University
South China Normal University Qingyuan Institute of Science and Technology Innovation Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Normal University, South China Normal University Qingyuan Institute of Science and Technology Innovation Co Ltd filed Critical South China Normal University
Priority to CN202210788208.9A priority Critical patent/CN115358440B/en
Publication of CN115358440A publication Critical patent/CN115358440A/en
Application granted granted Critical
Publication of CN115358440B publication Critical patent/CN115358440B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06Q 10/04 — Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q 50/02 — Agriculture; Fishing; Forestry; Mining
    • G06V 10/54 — Extraction of image or video features relating to texture
    • G06V 10/764 — Image or video recognition using pattern recognition or machine learning, using classification, e.g. of video objects
    • G06V 10/82 — Image or video recognition using pattern recognition or machine learning, using neural networks
    • G06V 20/17 — Terrestrial scenes taken from planes or by drones
    • G06V 20/188 — Terrestrial scenes: vegetation


Abstract

The invention provides a tree coverage prediction method based on the angular second moment and deep learning, comprising the following steps: acquiring forest-stand image data with an unmanned aerial vehicle (UAV); learning tree-species classification from the angular second moment and texture features; predicting the disease grade index of the next pest generation from the time the pests need to enter the next generation and the size of that generation, and identifying its severity level; acquiring moisture, air, illumination and soil data for the forest section; constructing a growth-rate calculation model for each tree species to support survival-rate calculation; calculating each species' survival rate from its next-generation pest severity level and growth rate, and identifying the dominant species; and predicting the mutual coverage of the dominant species and the other species, taking the critical time at which the dominant species takes over as the node that determines when tree-species identification should be repeated. By predicting pest conditions and species growth rates, the invention identifies species survival rates and predicts which species will come to cover the area on a large scale in the future, and over what time period.

Description

Tree coverage prediction method based on angular second moment and deep learning
[ technical field ]
The invention relates to the technical field of information, in particular to a tree coverage prediction method based on angular second moment and deep learning.
[ background of the invention ]
In nature, many tree species grow together in the same geographical vegetation, but their growth rates differ: eucalyptus, poplar and Populus sinica, for example, each have their own growth curve and rate. Slow-growing species planted early on are later overtaken by faster-growing trees, which cover the original vegetation. Remote sensing can identify multiple species in an area, but after some time the original vegetation is hidden by faster-growing trees, so the earlier identification result becomes wrong. When tree species are identified from forest-section images captured by an airborne camera, tree growth therefore causes large differences between early and later identification results. To keep identification accurate, species identification must be performed on schedule; the identification period, species survival rates and species distribution probabilities must be judged; and targeted pest and disease prevention must be applied to each species in the right time window. By identifying species repeatedly before the forest section is covered by a dominant large tree, the species types can be judged and observed more effectively by UAV, prevention and control measures such as UAV pesticide spraying of hidden positions can be carried out, and the situation in which the lowest-layer species in a large forest can no longer be identified or managed because of canopy coverage is avoided.
[ summary of the invention ]
The invention provides a tree coverage prediction method based on angular second moment and deep learning, which mainly comprises the following steps:
acquiring forest-stand image data with an unmanned aerial vehicle; learning tree-species classification from the angular second moment and texture features; determining and storing prevention and control management scheme data based on the pest identification and pest-condition monitoring results for each tree species; predicting the disease grade index of the next pest generation from the time the pests need to enter the next generation and the size of that generation, and identifying its severity level; acquiring moisture, air, illumination and soil data for the forest section; constructing a growth-rate calculation model for each tree species to support survival-rate calculation; calculating each species' survival rate from its next-generation pest severity level and growth rate, and identifying the dominant species; and predicting the mutual coverage of the dominant species and the other species, taking the critical time at which the dominant species takes over as the node that determines when tree-species identification should be repeated.
Further optionally, the acquiring of forest-stand image data with the unmanned aerial vehicle includes:
A route is preset in advance, and the UAV patrols the forest section with full coverage and no blind spots. The UAV shoots an image every 100 meters, from several different heights and angles, to obtain images of the current forest stand. After acquisition, each image is labelled with a picture name, place name, outline and in-picture description, and the images are stitched into an image of the whole forest section. If the captured image is RGB, it is converted to a gray-level image, and the gray levels are coarsely quantized while keeping the texture features of the image clear.
Further optionally, the learning of the tree species classification based on the angular second moment and the texture feature includes:
The angular second moment is defined from the joint probability density of pixels at two positions. It reflects not only the distribution of brightness but also the positional distribution of pixels with the same or similar brightness, and is a second-order statistic of the image's brightness variation; it is the basis on which a set of texture features is defined. Texture features of the forest-section image are built from the angular second moment and used as the input of a support vector machine (SVM), which classifies the extracted texture feature vectors by tree species. An SVM is a generalized linear classifier that performs binary classification of data by supervised learning; its decision boundary is the maximum-margin hyperplane solved for the learning samples.
Further optionally, the determining and storing of prevention and control management scheme data based on the pest identification and pest-condition monitoring results for each tree species includes:
The tree-species categories and the pest library corresponding to each species are acquired and recorded as {tree species category: pest categories}. Pest-condition monitoring mainly comprises: capturing the pests, photographing them, counting them, identifying their types against the species pest library, killing them, and establishing a corresponding prevention and control strategy. The capture device is mainly a colored-plate trapping device consisting of two or more yellow plates installed crosswise, with the crossing axis coaxial with a cylindrical power grid. A camera photographs the pests. Pest types are identified with a deep learning model: VGG16 extracts features from the pest pictures, and Fast R-CNN performs category analysis and counting against the pest library given for each tree species; the final result is recorded as {tree species category, pest category, pest quantity}. Different coping schemes are established for different pest types so as to control both the pests and their larvae. Pests are killed by chemical injection: with a novel high-pressure injector, an agent with strong systemic, penetrating and fumigant action is injected through a hole drilled in the trunk, which controls the pests effectively. Finally, the prevention and control management scheme data for the different types are stored.
Taking the tree species and the pest library corresponding to each species as the training set, the identification and counting of each species' pests with Fast R-CNN specifically comprises the following steps:
First the tree-species categories are identified, the species categories and the pest library corresponding to each species are taken as the training set, and the feature information of the pests corresponding to each species is recorded. VGG16 is used as the feature-extraction network: the original UAV image is the input layer, and a feature map is output after the pest feature information is extracted; Fast R-CNN then performs category analysis and counting. The Fast R-CNN classification consists of three layers of operations: an RPN network layer, an ROI pooling layer and a softmax layer. The feature map is fed to the RPN layer, which outputs all rectangular candidate boxes containing pests, together with candidate-box scores. The ROI pooling layer takes the feature map and the rectangular candidate boxes as input and extracts the feature information inside each box. Finally, the feature information in each candidate box is compared with the pest feature information for each tree species; the softmax layer classifies the pests in all candidate boxes and outputs the score of the category each box belongs to, recorded as {tree species category, pest category, score}. Given a score threshold, a result is considered trustworthy when the score of its candidate box is above the threshold. The number of candidate boxes belonging to the same category among all trustworthy results is counted and taken as the pest count.
The tree-species classification result is obtained and recorded as {tree species, pest category, quantity}.
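The thresholding-and-counting step above can be sketched as follows (a minimal illustration; the function and record names are hypothetical, and 0.7 is an assumed score threshold, not a value from the patent):

```python
# Hypothetical post-processing of detector output: keep candidate boxes whose
# softmax score exceeds a threshold, then count boxes per pest category.
from collections import Counter

def count_pests(detections, score_threshold=0.7):
    """detections: list of (tree_species, pest_category, score) tuples."""
    counts = Counter()
    for species, pest, score in detections:
        if score >= score_threshold:  # keep only trustworthy boxes
            counts[(species, pest)] += 1
    # record format from the text: {tree species, pest category, quantity}
    return [{"tree_species": s, "pest": p, "count": n}
            for (s, p), n in counts.items()]

results = count_pests([
    ("apple", "aphid", 0.91),
    ("apple", "aphid", 0.85),
    ("apple", "moth", 0.40),   # below threshold, discarded
])
```

The returned records mirror the {tree species, pest category, quantity} triples the text describes.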
Further optionally, the predicting of the disease grade index of the next generation from the time the pests need to enter the next generation and the size of that generation, and the identifying of the next generation's severity level, includes:
First the tree-species category and pest category are acquired. The developmental threshold temperature, the effective accumulated temperature and the mean air temperature are input, and the time needed for the pests to enter the next generation is calculated as N = K/(T − C), where N is the number of days to the next generation, K is the effective accumulated temperature, C is the pests' developmental threshold temperature and T is the daily mean temperature. Next the size of the next generation is calculated: the base count of the current generation, the mean egg-laying amount per female, the numbers of female and male pests and the mean per-generation mortality are input into P = P0 × E × (F/(M + F)) × (1 − m), where P is the size of the next generation, P0 is the base count of the current generation, E is the mean egg-laying amount per female, F is the number of females, F/(M + F) is the female percentage, M is the number of males, m is the mean per-generation mortality and (1 − m) is the mean per-generation survival rate. The disease grade index of the next generation is then predicted as T = P/N from the time N needed to enter the next generation and the next-generation size P. A pest grading threshold is set with levels 1 to 5: level 1 is asymptomatic, level 5 is extremely severe, and the levels step by 1 in between. The pest severity level is identified from this grading standard and taken as the pest prediction result, recorded as {tree species category, pest severity level}.
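The two generation formulas and the grading step can be combined into a small sketch (symbols follow the text; the grading thresholds below are assumed for illustration, since the patent only fixes the 1-to-5 scale):

```python
def next_generation_index(K, C, T_daily, P0, E, F, M, m):
    """Generation formulas from the text:
    K: effective accumulated temperature, C: developmental threshold temperature,
    T_daily: daily mean temperature, P0: current-generation base count,
    E: mean eggs per female, F/M: female/male counts, m: per-generation mortality."""
    N = K / (T_daily - C)                  # days until the next generation
    P = P0 * E * (F / (M + F)) * (1 - m)   # size of the next generation
    return N, P, P / N                     # disease grade index T = P/N

def severity_level(index, thresholds=(10, 50, 100, 200)):
    # Hypothetical thresholds mapping the index to grades 1 (asymptomatic) .. 5.
    level = 1
    for t in thresholds:
        if index > t:
            level += 1
    return level

N, P, T_index = next_generation_index(K=150, C=10, T_daily=25,
                                      P0=100, E=20, F=60, M=40, m=0.5)
level = severity_level(T_index)
```

With these illustrative numbers, N = 10 days, P = 600 individuals, so the index is 60, which the assumed thresholds map to grade 3.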
Further optionally, the acquiring data of moisture, air, light and soil of the forest section comprises:
Forest-section moisture, air, illumination and soil data are recorded with various sensors: temperature data via a temperature sensor, humidity data via a humidity sensor, and illumination data via a light sensor. The soil pH value is adjusted with a soil acid-regulating agent to keep the soil's acid-base balance.
Further optionally, the constructing of a growth-rate calculation model for each tree species, used to calculate species survival rates, comprises:
The growth rate of each tree species is calculated from the plants' living-environment data and the pest data. The living-environment data comprise temperature, humidity, illumination, soil pH, pest count and growth rate, where the measured growth rate is (current measured tree height − previous measured tree height) / previous measured tree height. All data are normalized, and a linear model is constructed as the tree growth-rate calculation model: a = (X + Y + Z + K)/4 − J, where X is temperature, Y is humidity, Z is illumination, K is soil pH, J is the number of collected pests and a is the (relative) tree-species growth rate. The species are ranked by growth rate, and each species' growth-rate value is recorded.
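A minimal sketch of the two growth-rate formulas described above (the linear model is reassembled from a garbled formula in the text, so treat a = (X + Y + Z + K)/4 − J as an assumption; all inputs are taken as already normalized):

```python
def growth_rate_model(X, Y, Z, K, J):
    """Linear growth-rate model as reconstructed from the text:
    a = (X + Y + Z + K) / 4 - J, with inputs normalized to [0, 1].
    X: temperature, Y: humidity, Z: illumination, K: soil pH, J: pest count."""
    return (X + Y + Z + K) / 4 - J

def measured_growth_rate(h_now, h_prev):
    # (current measured tree height - previous height) / previous height
    return (h_now - h_prev) / h_prev
```

For example, a tree that grew from 10 m to 11 m between measurements has a measured growth rate of 0.1.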
Further optionally, the calculating of each tree species' survival rate from its next-generation pest severity level and growth rate, and the identifying of the dominant species, comprises:
The dominant tree species is the one with the highest survival rate. First, the pest severity level of every species is estimated with each species' pest prediction model: the higher the severity level, the more pronounced the species' growth disadvantage, and the lower the level, the more pronounced its advantage. The growth rate of each species is then calculated, and the species' next-generation pest severity level and growth rate are input. The survival rate, defined as survival rate = growth rate / pest severity level, is taken as each species' dominance index. The dominance indices of the species are sorted; the higher the survival rate, the more dominant the species, i.e. the species that may come to cover the largest area of its forest zone in the future.
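The survival-rate ranking can be sketched directly from the definition survival rate = growth rate / pest severity level (species names and numbers below are illustrative only):

```python
def rank_dominant_species(species_stats):
    """species_stats: {name: (growth_rate, pest_severity_level)}.
    Survival rate = growth rate / pest severity level (per the text);
    the species with the highest survival rate is the dominant one."""
    ranked = sorted(
        ((growth / severity, name)
         for name, (growth, severity) in species_stats.items()),
        reverse=True,
    )
    return [name for _, name in ranked]

order = rank_dominant_species({
    "eucalyptus": (0.40, 2),   # survival rate 0.20
    "poplar":     (0.30, 1),   # survival rate 0.30 -> dominant
    "pine":       (0.25, 5),   # survival rate 0.05
})
```

Here poplar ranks first despite its lower growth rate, because its mild pest severity gives it the best survival rate.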
The predicting of the mutual coverage of the dominant species and the other species, and the determining of the time for repeating species identification using the critical takeover time of the dominant species as a node, comprises the following steps:
Predicting the degree of mutual coverage between species mainly comprises: retrieving pictures of the dominant species, calculating the gaps between the species, and building a prediction model based on a long short-term memory network (LSTM) with an attention layer. The retrieved pictures are the 10 forest-section shots taken before the current one, and the gaps between species are the uncovered areas between them. The model takes as input vectors the change in covered area between consecutive shots, the shooting times, and each species' growth rate at each shot. The prediction model is structured mainly as an LSTM layer followed by a self-attention layer. The mutual-coverage prediction model is used to find the next critical point of mutual coverage between species; the species coverage area at that critical point is taken as the maximum coverage, and the time of the critical point is set as the time for repeating tree-species identification.
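The patent does not spell out the LSTM model's internals; the sketch below covers only the simpler surrounding steps, under stated assumptions: computing the uncovered gap between two species' crown masks, and locating the critical (takeover) time step in a predicted coverage series:

```python
import numpy as np

def gap_area(mask_a, mask_b):
    """Uncovered area between two species: pixels covered by neither crown mask
    (boolean arrays of equal shape)."""
    return int(np.sum(~(mask_a | mask_b)))

def crossover_time(times, cov_dominant, cov_other):
    """First time step at which the dominant species' predicted coverage
    exceeds the other species' coverage; used as the re-identification node."""
    for t, a, b in zip(times, cov_dominant, cov_other):
        if a > b:
            return t
    return None  # no takeover within the predicted horizon
```

In practice `cov_dominant` would be the coverage series predicted by the LSTM model, not ground truth.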
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
the method can identify the tree species and judge the distribution and growth rate of the tree species according to the classification of the tree species. The survival rate of the tree species is identified by predicting the pest and disease condition and the growth rate of the tree species, and the tree species which can cover the area in a large area in the future and the required time period are predicted. The method is favorable for predicting the future observation period of the unmanned aerial vehicle and better managing forest segments.
[ description of the drawings ]
Fig. 1 is a flowchart of a tree coverage prediction method based on angular second moment and deep learning according to the present invention.
Fig. 2 is a schematic diagram of a tree coverage prediction method based on angular second moment and deep learning according to the present invention.
[ detailed description of the embodiments ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flowchart of a tree coverage prediction method based on angular second moment and deep learning according to the present invention. As shown in fig. 1, the tree coverage prediction method based on the angular second moment and the deep learning in this embodiment may specifically include:
step 101, acquiring forest stand image data based on an unmanned aerial vehicle.
A route is preset in advance, and the UAV patrols the forest section with full coverage and no blind spots. The UAV shoots an image every 100 meters, from several different heights and angles, to obtain images of the current forest stand; after acquisition, each image is labelled with a picture name, place name, outline and in-picture description, and the images are stitched into an image of the entire forest section. If the captured image is RGB, it is converted to a gray-level image, and the gray levels are coarsely quantized while keeping the texture features clear. Suppose, for example, that the distribution of tree species in a certain forest section must be managed. The aerial photographing heights are 15 m, 20 m and 25 m, and the lens is rotated counterclockwise by 60°, 120°, 180°, 240° and 300°. The UAV first photographs the forest section at low altitude, taking photos from each of the three heights, and at each height the aerial images are augmented by the counterclockwise lens rotations. After a large amount of image data of the current forest section has been acquired, the images are labelled and stitched into a complete forest-section image. To allow texture features to be extracted later, all images are converted to gray scale. Because the amount of acquired data is large, the image gray levels should be coarsely quantized to avoid excessive consumption of computing resources and to reduce computation time; a typical coarse quantization reduces 256 gray levels to 16.
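The graying and coarse quantization step (256 levels down to 16) can be sketched as follows; the ITU-R luminance weights are a common choice and an assumption here, since the patent does not name a conversion formula:

```python
import numpy as np

def to_gray_quantized(rgb, levels=16):
    """Convert an RGB image (h, w, 3) to gray, then coarsely quantize
    256 gray levels down to `levels` (the text uses 256 -> 16) to cut
    the cost of the later co-occurrence computation."""
    gray = rgb @ np.array([0.299, 0.587, 0.114])       # standard luminance weights
    return (gray // (256 // levels)).astype(np.uint8)  # values 0 .. levels-1
```

The quantized image is what the co-occurrence matrices in the next step are built from.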
Step 102, learning tree species classification based on the angular second moment and the texture features.
The angular second moment is defined from the joint probability density of pixels at two positions; it reflects not only the distribution of brightness but also the positional distribution of pixels with the same or similar brightness, and is a second-order statistic of the image's brightness variation. It is the basis on which a set of texture features is defined. Texture features of the forest-section image are built from the angular second moment and used as the input of a support vector machine (SVM), which classifies the extracted texture feature vectors by tree species. An SVM is a generalized linear classifier that performs binary classification of data by supervised learning; its decision boundary is the maximum-margin hyperplane solved for the learning samples. For example, suppose the angular second moment of a texture image must be computed quickly. First, co-occurrence matrices are computed in four directions: 0°, 45°, 90° and 135°. Next, each co-occurrence matrix is normalized with a Gaussian function, and the squares of the resulting joint probabilities are summed to obtain the angular second moment. With the texture features of the forest-section image built from the angular second moment as input, an SVM classifies each tree. In classification tasks with texture features as input, SVMs have been shown by many experiments to perform very well, with short computation time and high accuracy.
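A minimal sketch of the angular second moment for one offset, using plain normalization of the co-occurrence matrix rather than the Gaussian normalization mentioned in the text (that detail is underspecified and left out); the input is assumed to be the coarsely quantized integer gray image:

```python
import numpy as np

def asm(gray, dx, dy, levels=16):
    """Angular second moment for one offset (dx, dy): build the co-occurrence
    matrix of the quantized gray image, normalize it to joint probabilities,
    then sum the squared probabilities (ASM = sum_ij p(i, j)^2)."""
    h, w = gray.shape
    glcm = np.zeros((levels, levels))
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            glcm[gray[y, x], gray[y + dy, x + dx]] += 1
    p = glcm / glcm.sum()
    return float(np.sum(p ** 2))

# offsets for the four directions named in the text: 0, 45, 90 and 135 degrees
DIRECTIONS = [(1, 0), (1, -1), (0, 1), (1, 1)]
```

A perfectly uniform image gives the maximum ASM of 1.0; a 2x2 checkerboard, whose horizontal pairs split evenly between (0, 1) and (1, 0), gives 0.5. The four per-direction values form the texture feature vector fed to the SVM.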
Step 103, determining and storing prevention and control management scheme data based on the pest identification and pest-condition monitoring results for each tree species.
The tree-species categories and the pest library corresponding to each species are acquired and recorded as {tree species category: pest categories}. Pest-condition monitoring mainly comprises: capturing the pests, photographing them, counting them, identifying their types against the species pest library, killing them, and establishing a corresponding prevention and control strategy. The capture device is mainly a colored-plate trapping device consisting of two or more yellow plates installed crosswise, with the crossing axis coaxial with a cylindrical power grid. A camera photographs the pests. Pest types are identified with a deep learning model: VGG16 extracts features from the pest pictures, and Fast R-CNN performs category analysis and counting against the pest library given for each tree species. The final result is recorded as {tree species category, pest category, pest quantity}. Different coping schemes are established for different pest types so as to control both the pests and their larvae. Pests are killed by chemical injection: with a novel high-pressure injector, an agent with strong systemic, penetrating and fumigant action is injected through a hole drilled in the trunk, effectively controlling the pests. Finally, the prevention and control management scheme data for the different types are stored.
Suppose the pest situation of a certain forest section is to be monitored. First, the tree species categories present and their corresponding pest libraries are obtained. For example, the pests of the apple tree include spider mites, aphids, scale insects, moths and dung beetles, recorded as {apple tree: spider mite, aphid, scale insect, moth, dung beetle}. A capture device is then installed in the forest section, fixed to the ground with anchor bolts. After pests are captured, they are photographed and counted: the count is obtained with image segmentation, and the pest categories are identified with VGG16 and Fast R-CNN, a method verified by extensive experiments in which the model performed stably and well. An example record is {apple tree, aphid, 200}. The pests are then treated by injection; suitable agents include, for example, 10% pyrifenoxuron compound, 10% fluvian plus pyrifenoxuron emulsifiable concentrate, 2.5% deltamethrin emulsifiable concentrate, and 10% thiacloprid soluble concentrate. An appropriate amount of a penetrant such as organosilicone can be added at injection to improve the control effect. During the damage period of borer larvae, inspection of highly susceptible trees is strengthened, and 80% dichlorvos plus 40% dimethoate emulsifiable concentrate is injected into fresh bore holes with a syringe. An SGY paste, prepared from diatomite, a thickener, a humectant and other raw materials and compounded with 40% omethoate and 1.2% nicotine-matrine, can also be pressed into fresh bore holes with a paste gun with good control effect.
Compared with manual killing, chemical injection is convenient; compared with root-drench application, it pollutes the environment less, uses a smaller dose, and is safer. In addition, physical control can be implemented by strengthening seedling quarantine and by manually killing or trapping pests; chemical control and biological control, for example protecting predatory natural enemies or releasing parasitic natural enemies, can also be applied. The formulated pest control and management schemes are stored in a database.
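The {tree species: pest categories} library and the {tree species, pest category, count} monitoring record described above can be sketched as plain mappings. This is an illustrative sketch only; the names and the record layout are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the pest library and monitoring record of step 103.
pest_library = {
    "apple tree": ["spider mite", "aphid", "scale insect", "moth", "dung beetle"],
}

def make_monitoring_record(species, pest, count):
    """Validate a pest observation against the species' pest library."""
    if pest not in pest_library.get(species, []):
        raise ValueError(f"{pest!r} is not a known pest of {species!r}")
    return {"species": species, "pest": pest, "count": count}

record = make_monitoring_record("apple tree", "aphid", 200)
```

A record such as {apple tree, aphid, 200} from the worked example then maps directly onto one dictionary entry.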
Using the tree species categories and the pest library corresponding to each species as the training set, the pests of each tree species are identified and counted based on Fast R-CNN.
First, the tree species category is identified; the species categories and each species' pest library serve as the training set, recording the pest feature information corresponding to each species. VGG16 serves as the feature extraction network: the original image taken by the unmanned aerial vehicle is the input layer, a feature map is output after the pest feature information is extracted, and Fast R-CNN then performs category analysis and quantity statistics. The Fast R-CNN classifier consists of three layers of operations: an RPN layer, an ROI pooling layer and a softmax layer. The feature map is fed to the RPN layer, which outputs all rectangular candidate boxes containing pests together with their scores. The ROI pooling layer takes the feature map and the candidate boxes as input and extracts the feature information inside each box. Finally, the feature information in each candidate box is compared with the pest feature information of each species, the softmax layer classifies the pests in all candidate boxes, and the score of the category each box belongs to is output and recorded as {tree species category, pest category, score}. Given a score threshold, a result is considered trustworthy when the score of a candidate box is above the threshold. The number of candidate boxes of the same category among all trustworthy results is counted and taken as the number of that pest.
The classification result is recorded as {tree species, pest category, quantity}. For example, if the species is an apple tree, the pest library of the apple tree, i.e. the feature information of spider mites, aphids, scale insects, moths and dung beetles, is taken as the training set. After one photograph is processed by Fast R-CNN, five rectangular candidate boxes are output with the following categories and scores: {aphid, 0.692} {scale insect, 0.614} {scale insect, 0.309} {scale insect, 0.666} {aphid, 0.562}. With the chosen threshold score = 0.5, the trustworthy results are {aphid, 0.692} {scale insect, 0.614} {scale insect, 0.666} {aphid, 0.562}. By counting, there are two aphids and two scale insects.
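The threshold filtering and per-category counting of candidate boxes can be sketched as follows, using the worked example's detections. The threshold here is set to 0.5 so that the 0.309 box is rejected, which reproduces the counts stated in the example.

```python
# Score-threshold filtering and per-category counting of candidate boxes.
detections = [("aphid", 0.692), ("scale insect", 0.614), ("scale insect", 0.309),
              ("scale insect", 0.666), ("aphid", 0.562)]

def count_pests(detections, score_threshold):
    """Keep only trustworthy boxes (score above threshold) and count per category."""
    counts = {}
    for category, score in detections:
        if score > score_threshold:
            counts[category] = counts.get(category, 0) + 1
    return counts

counts = count_pests(detections, 0.5)  # {'aphid': 2, 'scale insect': 2}
```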
Step 104: predict the grade index of the next pest generation from the time required for the pests to enter the next generation and the size of the next generation, and identify the severity level of the next pest generation.
Referring to FIG. 2, the tree species category and pest category are first obtained. The developmental threshold temperature, the effective accumulated temperature and the mean air temperature are input, and the time required for the pest to enter the next generation is calculated as N = K/(T - C), where N is the number of days of development of the next generation, K is the effective accumulated temperature, C is the developmental threshold temperature of the pest, and T is the daily mean temperature. The size of the next generation is then calculated: the current-generation pest base, the mean egg-laying amount per female, the numbers of female and male pests, and the mean per-generation mortality are input, and P = P0 × E × (F/(M + F)) × (1 - m), where P is the size of the next generation, P0 is the current-generation base, E is the mean egg-laying amount per female, F is the number of females, F/(M + F) is the proportion of females, M is the number of males, m is the mean per-generation mortality, and (1 - m) is the mean per-generation survival rate. From the time N required to enter the next generation and the next-generation size P, the grade index of the next generation is predicted as T = P/N. Pest grading thresholds are set on a scale of 1 to 5, where 1 is asymptomatic and 5 is severe, graded in steps of 1. The pest severity level is identified against this grading standard and taken as the pest prediction result. The final result is recorded as {tree species category, pest severity level}.
For example, aphids are identified as a pest of the apple tree. With the aphid effective accumulated temperature K = 462.4, developmental threshold temperature C = 13.8 and daily mean temperature T = 19, N = K/(T - C) = 88.9, so the next aphid generation appears after 88.9 days. The size of the next aphid generation is then calculated: with the current-generation base P0 = 12, mean egg-laying amount E = 200, proportion of females F/(M + F) = 0.45 and mean per-generation survival rate (1 - m) = 0.4, P = P0 × E × (F/(M + F)) × (1 - m) = 432. The grade index of the next aphid generation is T = P/N = 4.86. With the grading thresholds (level 1, 0) (level 2, 5) (level 3, 10) (level 4, 20) (level 5, 40), the next-generation aphid severity level is 2, recorded as {apple tree, aphid, level 2}.
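The step 104 formulas, applied to the worked aphid example, can be sketched as below. The severity-level mapping, in which level k covers the interval above the (k-1)-th threshold, is one reading of the grading thresholds given in the example.

```python
def days_to_next_generation(K, T, C):
    """N = K / (T - C): effective accumulated temperature over the daily excess."""
    return K / (T - C)

def next_generation_size(P0, E, female_fraction, survival):
    """P = P0 * E * (F / (M + F)) * (1 - m)."""
    return P0 * E * female_fraction * survival

def severity_level(index, bounds=(0, 5, 10, 20, 40)):
    """Grade 1 (asymptomatic) to 5 (severe); count how many bounds the index exceeds."""
    return min(5, 1 + sum(index > b for b in bounds))

N = days_to_next_generation(K=462.4, T=19, C=13.8)                          # ~88.9 days
P = next_generation_size(P0=12, E=200, female_fraction=0.45, survival=0.4)  # 432
level = severity_level(P / N)                                               # index ~4.86, level 2
```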
Step 105: acquire moisture, air, illumination and soil data for the forest section.
Moisture, air, illumination and soil data of the forest section are recorded by various sensors as follows: temperature data of the forest section are acquired by a temperature sensor; humidity data by a humidity sensor; tree illumination data by a light sensor; and the soil pH is adjusted with a soil acidity regulator to keep the soil acid-base balance. Air humidity, air pollutant content, temperature, illumination intensity and soil pH are all objective factors affecting tree growth, and they must be recorded in order to subsequently predict tree species survival rates. Temperature affects the growth rate of trees, so the temperature sensor records the daily temperature variation of the forest section. No living thing can do without water: the soil humidity of the forest section is monitored in real time by the humidity sensor to prevent trees from dying of excessive dryness or wetness. Illumination drives photosynthesis and is an important growth factor; because some species are light-sensitive, too much or too little light can cause disease. The light sensor therefore records the illumination intensity, spectral composition, day length and so on within the forest section, which are used to manage tree growth. Finally, the soil acidity regulator keeps the soil pH within a reasonable range and improves tree survival.
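An environment record for one forest section might be kept as below; the field names and the "healthy" ranges are assumptions for the sketch, not values from the patent.

```python
# Illustrative sensor record and range check for step 105.
reading = {"temperature_c": 35, "humidity_rh": 0.25,
           "illumination_lux": 80000, "soil_ph": 5.0}

healthy_ranges = {"temperature_c": (-10, 40), "humidity_rh": (0.10, 0.90),
                  "illumination_lux": (1000, 100000), "soil_ph": (4.5, 7.5)}

def out_of_range(reading, ranges):
    """Return the sensor fields whose values fall outside their healthy range."""
    return [key for key, value in reading.items()
            if not ranges[key][0] <= value <= ranges[key][1]]

alerts = out_of_range(reading, healthy_ranges)  # empty list: all readings in range
```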
Step 106: construct a growth rate calculation model for each tree species, used to calculate tree species survival rates.
The growth rate of each tree species is calculated from the plant living-environment data and the pest data. The living-environment data comprise temperature, humidity, illumination, soil pH, pest count and measured growth rate, where the measured growth rate is (current tree height - previous tree height)/previous tree height. All data are normalized. A linear model is then constructed as the growth rate calculation model: score = ((X + Y + Z + K)/4 - J) × a, where X is temperature, Y is humidity, Z is illumination, K is soil pH, J is the number of captured pests, and a is the measured (relative) growth rate of the species. The species are ranked by this growth-rate score, and the score of each species is recorded. For example, to calculate the growth rate of maple with temperature 35 °C, humidity 25% RH, illumination 80000 lux, soil pH 5, pest count 20 and measured growth rate 0.25, the normalized data are 0.44661, 0.22695, 2.236068, 0.44762, 0.44711 and 0.44778 respectively, and the resulting maple growth rate is 0.09.
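One reading of the (garbled) linear model is score = ((X + Y + Z + K)/4 - J) × a. Both this grouping and the use of the raw, un-normalized a are assumptions; with them, the maple example evaluates to about 0.098, close to the 0.09 stated in the description.

```python
# Sketch of the step 106 growth-rate model under the assumptions stated above.
def growth_score(X, Y, Z, K, J, a):
    """X..J: normalized temperature, humidity, illumination, soil pH, pest count;
    a: measured relative growth rate."""
    return ((X + Y + Z + K) / 4 - J) * a

# Normalized maple values from the worked example, measured growth rate 0.25.
score = growth_score(0.44661, 0.22695, 2.236068, 0.44762, 0.44711, 0.25)
```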
Step 107: calculate tree species survival rates from each species' next-generation pest severity level and growth rate, and identify the dominant tree species.
Dominant species have high survival rates. First, the pest severity level of every species is estimated with each species' pest prediction model: the higher the severity level, the more pronounced the species' growth disadvantage, and the lower the level, the more pronounced its advantage. The growth rate of each species is then calculated. Given a species' next-generation pest severity level and growth rate, the survival rate serves as the dominance index of each species, where survival rate = growth rate of the species / pest severity level. The dominance indices of all species are ranked; the higher the survival rate, the more dominant the species, i.e. the species likely to have the largest coverage area in its forest zone in the future. For example, if the maple growth rate is 9 with pest severity level 2, and the oak growth rate is 6 with severity level 4, then the maple dominance index is 4.5 and the oak dominance index is 1.5, so maple will become the species with the largest coverage area in the forest zone.
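The dominance ranking of step 107 reduces to one division per species, as in this sketch using the maple and oak figures from the example:

```python
# Survival rate (dominance index) = growth rate / pest severity level.
def dominance_index(growth_rate, severity_level):
    return growth_rate / severity_level

species = {"maple": (9, 2), "oak": (6, 4)}  # (growth rate, severity level)
indices = {name: dominance_index(g, s) for name, (g, s) in species.items()}
dominant = max(indices, key=indices.get)  # maple: 4.5 beats oak's 1.5
```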
Step 108: predict the mutual coverage of the dominant species and the other species, and take the critical time at which dominant species alternate as the node determining when tree species identification is performed again.
Predicting the degree of mutual coverage among tree species mainly comprises: retrieving pictures of the dominant species, calculating the gaps among species, and building a prediction model based on a long short-term memory network (LSTM) with a self-attention layer. The retrieved pictures are the 10 forest-section photographs taken before the current one. The gaps among species, i.e. the uncovered areas between them, are calculated. The model takes as its input vectors the change in covered area between successive shots, the shooting times, and the growth rate of each species at each shot. The prediction model consists mainly of an LSTM layer connected to a self-attention layer. With this model, the next mutual-coverage critical point between species is found; the coverage area at the critical point is taken as the maximum coverage, and the time of the critical point is set as the time of the next tree species identification. Because the change in uncovered area depends strongly on the growth rates and on the interval between shots, the inter-species gaps (uncovered areas) and the species growth rates are used as input features. The growth rates of the species change nonlinearly, so an LSTM gives more accurate predictions in this case; and because time strongly affects the change in uncovered area, a time-series LSTM further improves model accuracy.
Furthermore, since 10 pictures are input, the input data are relatively large; to let the model adaptively learn weights over the input features of the different species, a self-attention layer is added to the model. For example, if the dominant species is maple and a forest-section picture is taken every 10 days, the 10 pictures preceding the current shot are retrieved, and for each shot the gaps between maples and the shooting time are recorded. Suppose the LSTM-with-self-attention model yields the critical point (70% coverage area, 360 days): maple then reaches its maximum coverage area after 360 days, the coverage at that time is 70%, the maple's dominance declines after 360 days, and the next dominant species emerges. That is therefore the time to identify the tree species again, judge the identification cycle, judge species survival rates and distribution, and formulate a targeted pest prevention scheme.
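Assembling the model's input sequence from the last 10 shots can be sketched as follows. The exact feature layout is an assumption; the patent only names the three quantities used as input vectors (coverage-area change between shots, shooting time, and per-species growth rates).

```python
# Build the input sequence for the LSTM + self-attention coverage model (step 108).
def build_sequence(shots):
    """shots: list of (day, covered_area, growth_rates) for the retrieved photos."""
    sequence = []
    for (d0, area0, _), (d1, area1, rates) in zip(shots, shots[1:]):
        # One element per pair of successive shots: area change, elapsed days,
        # and the growth rate of each species at the later shot.
        sequence.append([area1 - area0, d1 - d0] + list(rates))
    return sequence  # fed to the LSTM layer followed by the self-attention layer

shots = [(0, 0.40, [0.25, 0.10]),   # (day, covered area, growth rate per species)
         (10, 0.43, [0.26, 0.09]),
         (20, 0.47, [0.24, 0.11])]
sequence = build_sequence(shots)
```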

Claims (9)

1. A tree coverage prediction method based on angular second moment and deep learning, characterized by comprising the following steps:
acquiring forest section image data based on an unmanned aerial vehicle; learning tree species classification based on the angular second moment and texture features; determining and storing prevention and control management scheme data based on the pest identification results and pest monitoring results for each tree species, which specifically comprises: identifying and counting the pests of each tree species based on Fast R-CNN, using the species categories and the pest library corresponding to each species as the training set; predicting the grade index of the next pest generation from the time required for the pests to enter the next generation and the size of the next generation, and identifying the severity level of the next generation; acquiring moisture, air, illumination and soil data for the forest section; constructing a growth rate calculation model for each species, used to calculate species survival rates; calculating species survival rates from each species' next-generation pest severity level and growth rate, and identifying the dominant species; and predicting the mutual coverage of the dominant species and the other species, and taking the critical time at which dominant species alternate as the node determining when species identification is performed again.
2. The method of claim 1, wherein the acquiring of forest section image data based on an unmanned aerial vehicle comprises:
presetting a flight route in advance, the unmanned aerial vehicle patrolling the forest section with full coverage and no blind spots; taking one image every 100 meters, shooting from several different heights and angles, to obtain images of the current forest section; after acquisition, labeling each image with picture name, place name, outline and in-picture description; splicing the images into an image of the whole forest section; if RGB images are captured, converting them into grayscale images; and coarsely quantizing the gray levels of the image while keeping its texture features clear.
3. The method of claim 1, wherein the learning of tree species classification based on the angular second moment and texture features comprises:
the angular second moment is defined from the joint probability density of pixels at two positions; it reflects the brightness distribution characteristics and the positional distribution of pixels of equal or similar brightness, and is a second-order statistical feature of image brightness variation; it is the basis for defining a set of texture features; the texture features of the forest section image are built from the angular second moment and used as the input of a support vector machine (SVM); tree species classification is performed on the extracted texture feature vectors with the SVM; the SVM is a generalized linear classifier that performs binary classification of data by supervised learning, whose decision boundary is the maximum-margin hyperplane solved over the learning samples.
4. The method of claim 1, wherein the determining and storing of prevention and control management scheme data based on the pest identification results and pest monitoring results for each tree species comprises:
acquiring the tree species categories and the pest library corresponding to each species, recorded as {tree species category: pest categories}; pest monitoring mainly comprising: capturing pests, photographing them, counting them, identifying their categories against the species' pest library, killing them, and formulating a corresponding prevention and control strategy; the capture device being mainly a colored-board trap comprising two or more yellow boards installed crosswise, the cross axis being coaxial with a cylindrical electric grid; photographing the captured pests with a camera; identifying pest categories with a deep learning model, VGG16 extracting features from the pest photographs and Fast R-CNN performing category analysis and quantity statistics against the pest library given for each species; recording the final result as {tree species category, pest category, pest quantity}; formulating different coping schemes for different pest categories so as to control both the pests and their larvae; killing the pests by chemical injection, a high-pressure injector drilling the trunk and injecting an agent with strong systemic, penetrating and fumigant action to control the pests effectively; and finally storing the control and management scheme data for the different pest categories;
wherein the identifying and counting of the pests of each species based on Fast R-CNN, using the species categories and each species' pest library as the training set, specifically comprises:
first identifying the tree species category, taking the species categories and each species' pest library as the training set, and recording the pest feature information corresponding to each species; taking VGG16 as the feature extraction network, the original image taken by the unmanned aerial vehicle as the input layer, outputting a feature map after the pest feature information is extracted, and performing category analysis and quantity statistics with Fast R-CNN; the Fast R-CNN classifier consisting of three layers of operations: an RPN layer, an ROI pooling layer and a softmax layer; feeding the feature map to the RPN layer, which outputs all rectangular candidate boxes containing pests together with their scores; taking the feature map and the candidate boxes as input of the ROI pooling layer and extracting the feature information inside each box; finally comparing the feature information in each candidate box with the pest feature information of each species, classifying the pests in all candidate boxes with the softmax layer, and outputting the score of the category each box belongs to, recorded as {tree species category, pest category, score}; given a score threshold, considering a result trustworthy when the score of a candidate box is above the threshold; counting the number of candidate boxes of the same category among all trustworthy results and taking it as the number of that pest; and obtaining the classification result, recorded as {tree species, pest category, quantity}.
5. The method of claim 1, wherein the predicting of the grade index of the next pest generation from the time required for the pests to enter the next generation and the size of the next generation, and the identifying of the next-generation pest severity level, comprise:
first acquiring the tree species category and pest category; inputting the developmental threshold temperature, the effective accumulated temperature and the mean air temperature, and calculating the time required for the pest to enter the next generation as N = K/(T - C), where N is the number of days of development of the next generation, K is the effective accumulated temperature, C is the developmental threshold temperature of the pest, and T is the daily mean temperature; then calculating the size of the next generation: inputting the current-generation pest base, the mean egg-laying amount per female, the numbers of female and male pests and the mean per-generation mortality, and P = P0 × E × (F/(M + F)) × (1 - m), where P is the size of the next generation, P0 is the current-generation base, E is the mean egg-laying amount per female, F is the number of females, F/(M + F) is the proportion of females, M is the number of males, m is the mean per-generation mortality, and (1 - m) is the mean per-generation survival rate; predicting the grade index of the next generation as T = P/N from the time N required to enter the next generation and the next-generation size P; setting pest grading thresholds on a scale of 1 to 5, where 1 is asymptomatic and 5 is extremely severe, graded in steps of 1; identifying the pest severity level against this grading standard and taking it as the pest prediction result; and recording the final result as {tree species category, pest severity level}.
6. The method of claim 1, wherein the acquiring of moisture, air, illumination and soil data for the forest section comprises:
recording moisture, air, illumination and soil data of the forest section by various sensors as follows: acquiring temperature data of the forest section by a temperature sensor; acquiring humidity data of the forest section by a humidity sensor; acquiring tree illumination data by a light sensor; and adjusting the soil pH with a soil acidity regulator to keep the soil acid-base balance.
7. The method of claim 1, wherein the constructing of a growth rate calculation model for each tree species, used to calculate tree species survival rates, comprises:
calculating the growth rate of each species from the plant living-environment data and the pest data; the living-environment data comprising temperature, humidity, illumination, soil pH, pest count and measured growth rate, where the measured growth rate is (current tree height - previous tree height)/previous tree height; normalizing all data; constructing a linear model as the growth rate calculation model, score = ((X + Y + Z + K)/4 - J) × a, where X is temperature, Y is humidity, Z is illumination, K is soil pH, J is the number of captured pests, and a is the measured (relative) growth rate of the species; and ranking the species by growth rate and recording the growth-rate value of each species.
8. The method of claim 1, wherein the calculating of tree species survival rates from each species' next-generation pest severity level and growth rate, and the identifying of the dominant species, comprise:
dominant species having high survival rates; first estimating the pest severity level of every species with each species' pest prediction model, a higher severity level indicating a more pronounced growth disadvantage of the species and a lower level a more pronounced advantage; then calculating the growth rate of each species; inputting each species' next-generation pest severity level and growth rate; taking the survival rate as the dominance index of each species, where survival rate = growth rate of the species / pest severity level; and ranking the dominance indices of all species, the species with the highest survival rate being the dominant species, i.e. the species likely to have the largest coverage area in its forest zone in the future.
9. The method of claim 1, wherein the predicting of the mutual coverage of the dominant species and the other species, and the taking of the critical time at which dominant species alternate as the node determining when tree species identification is performed again, comprise:
predicting the degree of mutual coverage among species mainly comprising: retrieving pictures of the dominant species, calculating the gaps among species, and building a prediction model based on a long short-term memory network (LSTM) with a self-attention layer; the retrieved pictures being the 10 forest-section photographs taken before the current one; calculating the gaps among species, i.e. the uncovered areas between them; building the model with the change in covered area between successive shots, the shooting times and the growth rate of each species at each shot as its input vectors; the prediction model consisting mainly of an LSTM layer connected to a self-attention layer; finding the next mutual-coverage critical point between species with the prediction model; and taking the coverage area at the critical point as the maximum coverage and setting the time of the critical point as the time of the next tree species identification.
CN202210788208.9A 2022-07-06 2022-07-06 Tree coverage prediction method based on angular second moment and deep learning Active CN115358440B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210788208.9A CN115358440B (en) 2022-07-06 2022-07-06 Tree coverage prediction method based on angular second moment and deep learning

Publications (2)

Publication Number Publication Date
CN115358440A true CN115358440A (en) 2022-11-18
CN115358440B CN115358440B (en) 2023-04-18

Family

ID=84029996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210788208.9A Active CN115358440B (en) 2022-07-06 2022-07-06 Tree coverage prediction method based on angular second moment and deep learning

Country Status (1)

Country Link
CN (1) CN115358440B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117436656A (en) * 2023-11-02 2024-01-23 星景科技有限公司 Urban landscaping maintenance decision support method and system based on artificial intelligence

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3865764B1 (en) * 2006-08-15 2007-01-10 アルスマエヤ株式会社 Forest resource survey method and forest resource survey apparatus
US20180336418A1 (en) * 2017-05-22 2018-11-22 State Farm Mutual Automobile Insurance Company Systems and methods for determining building damage
CN109409429A (en) * 2018-10-28 2019-03-01 南京林业大学 A kind of tree species classification method based on laser radar point cloud data
CN109492563A (en) * 2018-10-30 2019-03-19 深圳大学 A kind of tree species classification method based on unmanned plane Hyperspectral imaging and LiDAR point cloud
CN113591766A (en) * 2021-08-09 2021-11-02 中国林业科学研究院资源信息研究所 Multi-source remote sensing tree species identification method for unmanned aerial vehicle
CN114201636A (en) * 2021-12-10 2022-03-18 安吉县自然资源和规划局 Forest pest disaster occurrence prediction method based on big data analysis
CN114332630A (en) * 2022-01-12 2022-04-12 成都理工大学 Multi-stage accurate screening fruit tree pest and disease monitoring system and method

Also Published As

Publication number Publication date
CN115358440B (en) 2023-04-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant