CN115358440B - Tree coverage prediction method based on angular second moment and deep learning - Google Patents
- Publication number
- CN115358440B CN115358440B CN202210788208.9A CN202210788208A CN115358440B CN 115358440 B CN115358440 B CN 115358440B CN 202210788208 A CN202210788208 A CN 202210788208A CN 115358440 B CN115358440 B CN 115358440B
- Authority
- CN
- China
- Prior art keywords
- tree species
- pests
- pest
- diseases
- tree
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 46
- 238000013135 deep learning Methods 0.000 title claims abstract description 10
- 241000894007 species Species 0.000 claims abstract description 236
- 241000607479 Yersinia pestis Species 0.000 claims abstract description 233
- 201000010099 disease Diseases 0.000 claims abstract description 176
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 claims abstract description 176
- 241000238631 Hexapoda Species 0.000 claims abstract description 66
- 230000012010 growth Effects 0.000 claims abstract description 66
- 230000004083 survival effect Effects 0.000 claims abstract description 36
- 239000002689 soil Substances 0.000 claims abstract description 29
- 238000005286 illumination Methods 0.000 claims abstract description 21
- 238000004364 calculation method Methods 0.000 claims abstract description 11
- 238000012706 support-vector machine Methods 0.000 claims description 26
- 239000003814 drug Substances 0.000 claims description 12
- 230000007246 mechanism Effects 0.000 claims description 11
- 230000002265 prevention Effects 0.000 claims description 11
- 238000011161 development Methods 0.000 claims description 10
- 238000009826 distribution Methods 0.000 claims description 10
- 238000012549 training Methods 0.000 claims description 10
- 230000008859 change Effects 0.000 claims description 9
- 230000000694 effects Effects 0.000 claims description 9
- 238000012544 monitoring process Methods 0.000 claims description 8
- 230000017448 oviposition Effects 0.000 claims description 7
- 238000004458 analytical method Methods 0.000 claims description 6
- 230000008901 benefit Effects 0.000 claims description 6
- 238000002347 injection Methods 0.000 claims description 6
- 239000007924 injection Substances 0.000 claims description 6
- 238000011176 pooling Methods 0.000 claims description 6
- 239000013598 vector Substances 0.000 claims description 6
- 239000003795 chemical substances by application Substances 0.000 claims description 5
- 238000010586 diagram Methods 0.000 claims description 4
- 238000002372 labelling Methods 0.000 claims description 4
- 239000002253 acid Substances 0.000 claims description 3
- 238000011217 control strategy Methods 0.000 claims description 3
- 230000003247 decreasing effect Effects 0.000 claims description 3
- 238000013136 deep learning model Methods 0.000 claims description 3
- 238000000605 extraction Methods 0.000 claims description 3
- 238000003958 fumigation Methods 0.000 claims description 3
- 230000035699 permeability Effects 0.000 claims description 3
- 230000001105 regulatory effect Effects 0.000 claims description 3
- 230000009885 systemic effect Effects 0.000 claims description 3
- 238000004080 punching Methods 0.000 claims description 2
- 230000035040 seed growth Effects 0.000 claims description 2
- 241001124076 Aphididae Species 0.000 description 18
- 241000196324 Embryophyta Species 0.000 description 15
- 241000208140 Acer Species 0.000 description 9
- 241000258937 Hemiptera Species 0.000 description 9
- 244000141359 Malus pumila Species 0.000 description 8
- 235000011430 Malus pumila Nutrition 0.000 description 8
- 235000015103 Malus silvestris Nutrition 0.000 description 8
- 230000018109 developmental process Effects 0.000 description 7
- 239000002674 ointment Substances 0.000 description 6
- 241000254173 Coleoptera Species 0.000 description 3
- 210000003608 fece Anatomy 0.000 description 3
- 241000219000 Populus Species 0.000 description 2
- 241001454295 Tetranychidae Species 0.000 description 2
- 239000006071 cream Substances 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000002474 experimental method Methods 0.000 description 2
- 239000011159 matrix material Substances 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 238000010606 normalization Methods 0.000 description 2
- 230000003071 parasitic effect Effects 0.000 description 2
- 238000013139 quantization Methods 0.000 description 2
- SNICXCGAKADSCV-JTQLQIEISA-N (-)-Nicotine Chemical compound CN1CCC[C@H]1C1=CC=CN=C1 SNICXCGAKADSCV-JTQLQIEISA-N 0.000 description 1
- HOKKPVIRMVDYPB-UVTDQMKNSA-N (Z)-thiacloprid Chemical compound C1=NC(Cl)=CC=C1CN1C(=N/C#N)/SCC1 HOKKPVIRMVDYPB-UVTDQMKNSA-N 0.000 description 1
- 239000005892 Deltamethrin Substances 0.000 description 1
- 239000005947 Dimethoate Substances 0.000 description 1
- 244000166124 Eucalyptus globulus Species 0.000 description 1
- ZSBXGIUJOOQZMP-UHFFFAOYSA-N Isomatrine Natural products C1CCC2CN3C(=O)CCCC3C3C2N1CCC3 ZSBXGIUJOOQZMP-UHFFFAOYSA-N 0.000 description 1
- ZSBXGIUJOOQZMP-JLNYLFASSA-N Matrine Chemical compound C1CC[C@H]2CN3C(=O)CCC[C@@H]3[C@@H]3[C@H]2N1CCC3 ZSBXGIUJOOQZMP-JLNYLFASSA-N 0.000 description 1
- 241000422846 Sequoiadendron giganteum Species 0.000 description 1
- VYPSYNLAJGMNEJ-UHFFFAOYSA-N Silicium dioxide Chemical compound O=[Si]=O VYPSYNLAJGMNEJ-UHFFFAOYSA-N 0.000 description 1
- 239000005940 Thiacloprid Substances 0.000 description 1
- 239000000809 air pollutant Substances 0.000 description 1
- 231100001243 air pollutant Toxicity 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000010485 coping Effects 0.000 description 1
- 229960002483 decamethrin Drugs 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- OWZREIFADZCYQD-NSHGMRRFSA-N deltamethrin Chemical compound CC1(C)[C@@H](C=C(Br)Br)[C@H]1C(=O)O[C@H](C#N)C1=CC=CC(OC=2C=CC=CC=2)=C1 OWZREIFADZCYQD-NSHGMRRFSA-N 0.000 description 1
- OEBRKCOSUFCWJD-UHFFFAOYSA-N dichlorvos Chemical compound COP(=O)(OC)OC=C(Cl)Cl OEBRKCOSUFCWJD-UHFFFAOYSA-N 0.000 description 1
- 229950001327 dichlorvos Drugs 0.000 description 1
- MCWXGJITAZMZEV-UHFFFAOYSA-N dimethoate Chemical compound CNC(=O)CSP(=S)(OC)OC MCWXGJITAZMZEV-UHFFFAOYSA-N 0.000 description 1
- 230000008034 disappearance Effects 0.000 description 1
- 229940079593 drug Drugs 0.000 description 1
- 238000003912 environmental pollution Methods 0.000 description 1
- 230000002349 favourable effect Effects 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 239000003102 growth factor Substances 0.000 description 1
- 239000003906 humectant Substances 0.000 description 1
- 238000003709 image segmentation Methods 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 229930014456 matrine Natural products 0.000 description 1
- LGZXYFMMLRYXLK-UHFFFAOYSA-N mercury(2+);sulfide Chemical compound [S-2].[Hg+2] LGZXYFMMLRYXLK-UHFFFAOYSA-N 0.000 description 1
- 229960002715 nicotine Drugs 0.000 description 1
- SNICXCGAKADSCV-UHFFFAOYSA-N nicotine Natural products CN1CCCC1C1=CC=CN=C1 SNICXCGAKADSCV-UHFFFAOYSA-N 0.000 description 1
- PZXOQEXFMJCDPG-UHFFFAOYSA-N omethoate Chemical compound CNC(=O)CSP(=O)(OC)OC PZXOQEXFMJCDPG-UHFFFAOYSA-N 0.000 description 1
- 230000029553 photosynthesis Effects 0.000 description 1
- 238000010672 photosynthesis Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 239000002994 raw material Substances 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 239000000243 solution Substances 0.000 description 1
- 230000003595 spectral effect Effects 0.000 description 1
- 238000005728 strengthening Methods 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
- 239000002562 thickening agent Substances 0.000 description 1
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/54—Extraction of image or video features relating to texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Economics (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Software Systems (AREA)
- Marketing (AREA)
- Medical Informatics (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Quality & Reliability (AREA)
- Mining & Mineral Resources (AREA)
- Development Economics (AREA)
- Remote Sensing (AREA)
- Primary Health Care (AREA)
- Marine Sciences & Fisheries (AREA)
- Animal Husbandry (AREA)
- Agronomy & Crop Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Game Theory and Decision Science (AREA)
- Operations Research (AREA)
- Entrepreneurship & Innovation (AREA)
- Catching Or Destruction (AREA)
Abstract
The invention provides a tree coverage prediction method based on the angular second moment and deep learning, which comprises the following steps: acquiring forest-stand image data with an unmanned aerial vehicle; learning tree species classification from the angular second moment and texture features; predicting the next-generation pest grade index from the time required for the pests to enter the next generation and the size of the next generation, and identifying the severity level of the next generation of pests; acquiring moisture, air, illumination and soil data for the forest section; constructing a growth-rate calculation model for each tree species to compute the tree-species survival rate; calculating each species' survival rate from its next-generation pest severity level and growth rate, and identifying the dominant tree species; and predicting the mutual coverage of the dominant species and the other species, taking the turnover critical time of the dominant species as the node that determines when tree species identification should be repeated. The invention identifies the survival rate of each tree species by predicting pest conditions and growth rates, and predicts which species will cover the area on a large scale in the future and the time period required.
Description
[ technical field ]
The invention relates to the field of information technology, and in particular to a tree coverage prediction method based on the angular second moment and deep learning.
[ background of the invention ]
In nature, many tree species grow side by side within a given stretch of geographical vegetation, but their growth rates differ: eucalyptus, poplar and Chinese red poplar, for example, each follow their own growth curve and rate. If slow-growing species are sown early on, faster-growing trees later overtake and cover them. Remote sensing technology can identify the multiple tree species in an area, but after a period of time the original vegetation becomes covered by the faster-growing trees, so the earlier identification result no longer holds. When tree species are identified from forest-section images captured by an airborne camera, tree growth therefore causes large differences between early and later identification results. To keep the identification accurate, tree species identification must be performed on schedule: the identification period must be judged, together with the survival rate of each species and its distribution probability, and targeted prevention of tree pests and diseases must be carried out accordingly. Before the forest zone is covered by a few large trees, the tree species should be identified several times, so that the species types can be better judged and observed by unmanned aerial vehicle, drone spraying and other prevention and management measures can be applied at the indicated positions, and the loss of understorey species, which can no longer be identified or managed once large trees cover them, is avoided.
[ summary of the invention ]
The invention provides a tree coverage prediction method based on angular second moment and deep learning, which mainly comprises the following steps:
acquiring forest-stand image data with an unmanned aerial vehicle; learning tree species classification from the angular second moment and texture features; determining and storing prevention and control management scheme data based on the pest identification and pest-condition monitoring results for the various trees; predicting the next-generation pest grade index from the time required for the pests to enter the next generation and the size of the next generation, and identifying the next generation's pest severity level; acquiring moisture, air, illumination and soil data for the forest section; constructing a growth-rate calculation model for each tree species to compute the tree-species survival rate; calculating each species' survival rate from its next-generation pest severity level and growth rate, and identifying the dominant tree species; and predicting the mutual coverage of the dominant species and the other species, taking the turnover critical time of the dominant species as the node that determines when tree species identification should be repeated.
further optionally, the acquiring forest stand image data based on the unmanned aerial vehicle includes:
A route is preset in advance, and the unmanned aerial vehicle patrols the forest section with full coverage and no blind spots. The drone shoots an image every 100 meters, from several different heights and angles, to obtain images of the current forest stand. After an image is obtained, it is labeled with the picture name, place name, picture outline and an in-picture description. An image of the whole forest section is formed by stitching. If RGB images are captured, they must be converted to grayscale. While keeping the texture features of the image clear, the gray levels of the image are coarsely quantized.
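The grayscale conversion and coarse gray-level quantization can be sketched as follows; the BT.601 luma weights and the 16-level target are assumptions (the embodiment section mentions quantizing 256 levels to 16), and the function names are illustrative:

```python
import numpy as np

def to_gray(rgb):
    """Convert an H x W x 3 RGB image to grayscale (ITU-R BT.601 weights)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def coarse_quantize(gray, levels=16):
    """Coarsely quantize 8-bit gray values (0-255) down to `levels` bins."""
    g = np.clip(gray, 0, 255).astype(np.uint16)
    return (g * levels // 256).astype(np.uint8)  # values in 0..levels-1

# A random stand image stands in for a stitched forest-section photo.
rgb = np.random.randint(0, 256, size=(64, 64, 3))
q = coarse_quantize(to_gray(rgb))
```

Coarse quantization shrinks the gray-level co-occurrence matrix used later from 256x256 to 16x16, which is what keeps the texture-feature computation tractable.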
Further optionally, the learning of the tree species classification based on the angular second moment and the texture feature includes:
The angular second moment is defined from the joint probability density of pixel pairs at two positions; it reflects the brightness distribution of the image and the positional distribution of pixels with the same or similar brightness, and is a second-order statistic of the image's brightness variation. It is the basis for defining a set of texture features. The angular second moment is used to build texture features of the forest-section image, which serve as the input to a Support Vector Machine (SVM); the extracted texture feature vectors are then classified by tree species with the SVM. A Support Vector Machine is a generalized linear classifier that performs binary classification of data under supervised learning; its decision boundary is the maximum-margin hyperplane solved from the training samples.
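As a concrete illustration of the angular second moment, here is a minimal sketch that builds a gray-level co-occurrence matrix (GLCM) for a single pixel offset and computes the ASM from it. The single offset and function names are illustrative simplifications, not the patent's exact procedure:

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Gray-level co-occurrence matrix for offset (dx, dy), normalized to probabilities."""
    h, w = img.shape
    P = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def angular_second_moment(P):
    """ASM = sum of squared joint probabilities; equals 1.0 for a perfectly uniform texture."""
    return float((P ** 2).sum())

flat = np.zeros((8, 8), dtype=int)  # constant image: maximal uniformity
asm_flat = angular_second_moment(glcm(flat, levels=4))  # 1.0
```

A uniform canopy texture yields a large ASM (probability mass concentrated in few GLCM cells), while a varied texture spreads the mass and lowers the ASM; a feature vector of such statistics is what the SVM classifies.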
Further optionally, determining and storing the prevention and control management scheme data based on the pest identification results and pest-condition monitoring results of the various trees comprises the following steps:
The tree species categories and the pest and disease database corresponding to each type of tree species are acquired and recorded as {tree species category: pest and disease types}. Pest-condition monitoring mainly comprises: capturing pests, photographing them, counting them, identifying the pest types against the tree-species pest database, killing the pests, and establishing a corresponding control strategy. The capture device is mainly a colored-plate trap, consisting of two or more yellow plates installed crosswise, with the crossing axis coaxial with a cylindrical power grid. The pests are photographed with a camera. Pest types are identified with a deep learning model: VGG16 extracts features from the pest pictures, and Fast R-CNN performs category analysis and counting against the pest database given for each tree species; the final result is recorded as {tree species category, pest number}. Different response schemes are established for different pest types so as to control the pests and their larvae. The killing method is chemical injection: using a new type of high-pressure injector, an agent with strong systemic, permeating and fumigating action is injected through holes drilled in the trunk, effectively controlling the pests. Finally, the control management scheme data for the different pest types are stored. The method comprises the following steps: taking the tree species and the pest database corresponding to each type of tree species as a training set, and identifying and counting the pests of each tree species based on Fast R-CNN.
the disease and pest database corresponding to the tree species and each type of tree species is used as a training set, and plant diseases and pest damage of the tree species are identified and counted based on FastR-CNN, and the method specifically comprises the following steps:
First, the tree species categories are identified, and the tree species categories together with the pest database for each type of tree species are used as the training set, recording the characteristic pest information for each tree species. VGG16 serves as the feature extraction network: the original image shot by the unmanned aerial vehicle is the input layer, a feature map is output after the pest feature information is extracted, and Fast R-CNN then performs category analysis and counting of the pests. The Fast R-CNN classification consists of three layers of operations: an RPN network layer, an ROI pooling layer and a softmax layer. The feature map is fed to the RPN layer, which outputs all rectangular candidate boxes containing pests together with their scores. The ROI pooling layer takes the feature map and the rectangular candidate boxes as input and extracts the feature information inside each box. Finally, the feature information in the candidate boxes is compared with the pest features for each tree species, the softmax layer classifies the pests in all candidate boxes, and the score of the category each box belongs to is output, recorded as {tree species category, pest category, score}. Given a score threshold, a result is considered trustworthy when a candidate box's score is above the threshold. The number of candidate boxes belonging to the same category among all trustworthy results is counted and taken as the number of pests of that category.
The tree species classification result is obtained and recorded as {tree species, pest category, number}.
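The final thresholding-and-counting step over the {tree species category, pest category, score} records might look like the following sketch; the detection triple format and the threshold value are assumptions, not specified in the patent:

```python
from collections import Counter

def count_pests(detections, score_thresh=0.8):
    """Count detections per (tree_species, pest_class) whose score clears the threshold.

    `detections` is an assumed list of (tree_species, pest_class, score) triples,
    standing in for the {tree species category, pest category, score} records
    emitted by the softmax layer.
    """
    kept = [(species, pest) for species, pest, score in detections
            if score >= score_thresh]
    return Counter(kept)

dets = [("maple", "aphid", 0.95), ("maple", "aphid", 0.91), ("maple", "mite", 0.40)]
counts = count_pests(dets)  # the low-score mite detection is discarded
```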
Further optionally, predicting the next-generation pest grade index from the time required for the pests to enter the next generation and the size of the next generation, and identifying the next-generation pest severity level, includes:
First, the tree species category and pest category are acquired. The development threshold temperature, the effective accumulated temperature and the daily mean air temperature are input, and the time required for the pest to enter the next generation is calculated with the formula N = K/(T - C), where N is the number of days of next-generation development, K is the effective accumulated temperature, C is the pest's development threshold temperature, and T is the daily mean temperature. Then the size of the next generation is calculated. Inputs are this generation's pest base number, the average egg-laying amount per female, the numbers of female and male pests, and the average per-generation mortality: P = P0 × E × (F/(M + F)) × (1 - m), where P is the size of the next generation, P0 is this generation's base number, E is the average egg-laying amount per female, F is the number of females, F/(M + F) is the female proportion, M is the number of males, m is the average per-generation mortality, and (1 - m) is the average per-generation survival rate. The next-generation pest grade index is predicted as T = P/N from the time N required to enter the next generation and the next-generation occurrence P. A pest grading threshold scale is established with levels 1 to 5, where level 1 is asymptomatic and level 5 is extremely serious, the levels increasing in steps of 1. The pest severity level is identified according to the grading standard and taken as the pest prediction result; the final result is recorded as {tree species category, pest severity level}.
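The two formulas and the grading step can be computed directly. Note that the grade thresholds below are illustrative assumptions, since the patent only specifies a 1-to-5 scale:

```python
def days_to_next_generation(K, T, C):
    """N = K / (T - C): effective accumulated temperature over the daily heat surplus."""
    return K / (T - C)

def next_generation_count(P0, E, F, M, mortality):
    """P = P0 * E * (F / (M + F)) * (1 - mortality)."""
    return P0 * E * (F / (M + F)) * (1 - mortality)

def severity_grade(index, thresholds=(1, 5, 20, 50)):
    """Map the grade index T = P / N onto levels 1 (asymptomatic) .. 5 (extremely serious).

    The threshold values are assumed for illustration; the patent states only the scale.
    """
    return 1 + sum(index > t for t in thresholds)

N = days_to_next_generation(K=200.0, T=25.0, C=15.0)                 # 20.0 days
P = next_generation_count(P0=100, E=10, F=60, M=40, mortality=0.5)   # 300.0 pests
grade = severity_grade(P / N)                                        # index 15.0 -> level 3
```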
Further optionally, the acquiring data of moisture, air, light and soil of the forest section comprises:
Forest-section moisture, air, illumination and soil data are recorded by various sensors. The data recording method comprises: acquiring forest-section temperature data with a temperature sensor; acquiring forest-section humidity data with a humidity sensor; acquiring tree illumination data with a light sensor; and adjusting the soil pH with a soil acidity regulating agent to keep the soil acid-base balance.
Further optionally, constructing the growth-rate calculation model of each tree species for computing the tree-species survival rate comprises:
The growth rate of each tree species is calculated from the plant living-environment data and the pest data. The living-environment data comprise temperature, humidity, illumination, soil pH, pest number and growth rate, where the growth rate is obtained as (height of the tree species at this measurement - height at the previous measurement) / height at the previous measurement. All data are normalized, and a linear model is constructed as the tree growth-rate calculation model: a = (X + Y + Z + K)/4 - J, where X is temperature, Y is humidity, Z is illumination data, K is soil pH, J is the number of captured pests, and a is the (relative) tree species growth rate. The growth rates of all tree species are ranked, and the value of each species' growth rate is recorded.
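A minimal sketch of the stated linear model follows; the min-max normalization ranges are assumptions, since the patent only says that all data are normalized:

```python
def minmax(v, lo, hi):
    """Min-max normalize a raw sensor reading into [0, 1] (ranges are assumed)."""
    return (v - lo) / (hi - lo)

def growth_rate_model(X, Y, Z, K, J):
    """a = (X + Y + Z + K) / 4 - J, on normalized inputs.

    X temperature, Y humidity, Z illumination, K soil pH, J pest count
    (all min-max normalized); a is the relative growth-rate estimate.
    """
    return (X + Y + Z + K) / 4 - J

# Hypothetical readings: 22 C, 60% RH, 800 lux, pH 6.5, 12 pests captured.
a = growth_rate_model(
    X=minmax(22, 0, 40), Y=minmax(60, 0, 100),
    Z=minmax(800, 0, 1000), K=minmax(6.5, 4, 9), J=minmax(12, 0, 100),
)
```

With favourable conditions contributing positively and the pest count subtracted, a larger `a` indicates a species growing faster in the current forest-section environment.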
Further optionally, the calculating the survival rate of the tree species according to the severity level and the growth rate of the next generation pest of the tree species and identifying the dominant tree species comprises:
Dominant tree species have a high survival rate. First, the pest severity level of every tree species is estimated with each species' pest prediction model: the higher the pest severity level, the more pronounced the species' growth disadvantage; the lower the level, the more pronounced its advantage. Then the growth rate of each species is calculated. With the species' next-generation pest severity level and growth rate as input, the survival rate is taken as the dominance index of each species, where survival rate = growth rate of the species / pest severity level. The dominance indices of the species are ranked; the higher the survival rate, the more dominant the species, that is, the species that may in future occupy the largest coverage area in its forest zone.
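The dominance ranking can be sketched as follows, using hypothetical species data (names, growth rates and severity levels are illustrative):

```python
def survival_rate(growth_rate, severity):
    """Survival rate = growth rate of the species / pest severity level (1-5)."""
    return growth_rate / severity

# Hypothetical stand: (species, relative growth rate, predicted next-generation severity level)
stand = [("poplar", 0.40, 2), ("maple", 0.30, 1), ("eucalyptus", 0.50, 5)]
ranked = sorted(stand, key=lambda s: survival_rate(s[1], s[2]), reverse=True)
dominant = ranked[0][0]  # highest survival rate -> dominant species
```

Note how a fast grower under heavy pest pressure (eucalyptus, 0.50/5 = 0.10) ranks below a slower grower that is nearly pest-free (maple, 0.30/1 = 0.30).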
The method for predicting the mutual coverage of the dominant tree species and the other species, and determining the time for repeating tree species identification with the turnover critical time of the dominant species as the node, comprises the following steps:
Predicting the degree of mutual coverage between tree species mainly comprises: retrieving pictures of the dominant species, calculating the gaps between tree species, and building a prediction model based on a long short-term memory network (LSTM) with a self-attention layer. The retrieved pictures are the 10 forest-section images captured before the current one. The gaps between tree species are the areas not covered by any species. The model takes as input vectors the change in coverage area between consecutive shots, the shooting time of each shot, and the growth rate of each species in each shot. The structure of the prediction model is mainly an LSTM layer connected to a self-attention mechanism. Based on this model of the mutual coverage degree between species, the next mutual-coverage critical point between species is found; the species coverage area at the critical point is taken as the maximum coverage, and the time of the critical point is set as the time for the next tree species identification.
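In place of the patent's LSTM-plus-self-attention model, the following simplified stand-in illustrates how the coverage critical point is located: fit a trend to the inter-species gap area over past surveys and extrapolate to where the gap closes.

```python
import numpy as np

def coverage_critical_time(times, gaps):
    """Estimate when the inter-species gap area closes (the coverage critical point).

    The patent fits an LSTM + self-attention model to the last 10 surveys; as a
    simplified stand-in, this fits a straight line to (survey time, gap area) and
    returns the time at which the extrapolated gap reaches zero.
    """
    slope, intercept = np.polyfit(np.asarray(times, float), np.asarray(gaps, float), 1)
    if slope >= 0:
        return None  # gap is not shrinking; no critical point ahead
    return -intercept / slope

# Gap area (m^2) shrinking linearly over monthly surveys -> closes at month 10.
t_crit = coverage_critical_time([0, 1, 2, 3], [100, 90, 80, 70])
```

The returned time plays the role of the critical-point node: it is when the dominant species' crowns are predicted to close over their neighbours, so the next drone-based species identification is scheduled before it.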
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
the method can identify the tree species and judge the distribution and growth rate of the tree species according to the classification of the tree species. The survival rate of the tree species is identified by predicting the pest and disease condition and the growth rate of the tree species, and the tree species which can cover the area in a large area in the future and the required time period are predicted. The unmanned aerial vehicle observation period prediction method is favorable for predicting future unmanned aerial vehicle observation periods and better managing forest sections.
[ description of the drawings ]
Fig. 1 is a flowchart of a tree coverage prediction method based on angular second moment and deep learning according to the present invention.
Fig. 2 is a schematic diagram of a tree coverage prediction method based on angular second moment and deep learning according to the present invention.
[ Detailed description of the embodiments ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flowchart of a tree coverage prediction method based on angular second moment and deep learning according to the present invention. As shown in fig. 1, the tree coverage prediction method based on the angular second moment and the deep learning in this embodiment may specifically include:
step 101, acquiring forest stand image data based on an unmanned aerial vehicle.
A route is preset in advance, and the forest section is patrolled by unmanned aerial vehicle with full coverage and no blind spots. The unmanned aerial vehicle shoots an image every 100 meters, from a plurality of different heights and different angles, to obtain images of the current forest stand; after the images are obtained, each is labeled with picture name, place name, outline and in-picture description; an image of the entire forest stand is formed by stitching. If shooting yields an RGB image, the RGB image needs to be converted into a grayscale image. Provided the texture features of the image remain clear, the gray levels of the image are coarsely quantized. For example, suppose the distribution of tree species in a certain forest section must be managed. The unmanned aerial vehicle first shoots the forest section at low altitude, taking photos from the three heights of 15 m, 20 m and 25 m; at each height, the aerial images are augmented by rotating the lens anticlockwise by 60°, 120°, 180°, 240° and 300°. After a large amount of image data of the current forest section is acquired, labeling and stitching are carried out to form a complete forest-section image. To allow texture features to be extracted subsequently, all images are converted to grayscale. Since the amount of acquired data is large, coarse quantization of the image gray levels should be performed to avoid excessive consumption of computing resources and to reduce calculation time. A typical coarse quantization maps 256 gray levels down to 16.
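The graying and coarse-quantization step above can be sketched as follows; this is a minimal illustration (function name and the standard luminance weights are the author's assumptions, not taken from the patent):

```python
import numpy as np

def to_gray_quantized(rgb, levels=16):
    """Convert an RGB image (H, W, 3, uint8) to grayscale and coarsely
    quantize its 256 gray levels down to `levels` (16 by default)."""
    # Standard luminance weights for RGB -> gray conversion.
    gray = rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114
    # Map [0, 255] onto the integer range {0, ..., levels - 1}.
    quantized = (gray / 256.0 * levels).astype(np.uint8)
    return quantized
```

Quantizing before texture extraction shrinks the co-occurrence matrices from 256×256 to 16×16, which is where the computation-time saving comes from.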
And 102, learning tree species classification based on the angular second moment and the texture features.
The angular second moment is defined from the joint probability density of pixels at two positions; it reflects not only the distribution characteristics of brightness but also the positional distribution between pixels of the same or similar brightness, and is a second-order statistical feature of image brightness variation. It is the basis for defining a set of texture features. The angular second moment is used to establish texture features of the forest-section image, which serve as the input of a Support Vector Machine (SVM). Tree species classification is then carried out on the extracted texture feature vectors by the SVM. A Support Vector Machine (SVM) is a generalized linear classifier that performs binary classification of data under supervised learning; its decision boundary is the maximum-margin hyperplane solved from the learning samples. For example, suppose the angular second moment of a texture image must be computed quickly. First, co-occurrence matrices in four directions are calculated, the four directions being 0°, 45°, 90° and 135°; the co-occurrence matrices are then normalized using a Gaussian function; finally, the squares of the joint probability densities are summed to obtain the angular second moment. Taking the texture features of the forest-section image established from the angular second moment as input, each tree is classified with the SVM. In classification tasks that take texture features as input, the SVM has been shown by extensive experiments to perform very well, with short computation time and high accuracy.
And 103, determining and storing prevention and control management scheme data based on the pest and disease identification results and the pest and disease monitoring results of various trees.
Tree species categories and the pest and disease library corresponding to each tree species are acquired and recorded as { tree species category: pest and disease categories }. Pest condition monitoring mainly comprises the following steps: capturing the diseases and pests, photographing them, counting them, identifying their categories against the tree species pest library, killing them, and establishing a corresponding prevention and control strategy. The device used for capturing diseases and pests is mainly a colored-plate trapping device, which mainly comprises two or more yellow plates installed crosswise, the cross axis being coaxial with a cylindrical electric grid. A camera is used to photograph the captured pests. Pest categories are identified with a deep learning model: VGG16 extracts the features in the pest pictures, and FastR-CNN performs category analysis and quantity statistics according to the pest library given for each tree species. The final result is recorded as { tree species category, pest category, pest quantity }. Different coping schemes are established for different categories of pests, so as to control both the pests and their larvae. The pests are killed by agent injection: using a novel high-pressure injector, the trunk is drilled and injected with an agent of strong systemic, permeating and fumigating action, which effectively controls diseases and insects. Finally, the control management scheme data for the different pest categories are stored.
Suppose the pest condition of a certain forest section is to be monitored. First, the currently monitored tree species and the corresponding pest library must be obtained. For example, the pests of apple trees include spider mites, aphids, scale insects, moths and dung beetles, recorded as { apple tree: spider mite, aphid, scale insect, moth, dung beetle }. Then a capture device is installed in the forest section, fixed to the ground with foundation bolts. After the pests are captured, they are photographed and counted. The count is obtained with an image segmentation technique, and the pest categories are identified with VGG16 and FastR-CNN; this approach has been validated by a large number of experiments, and the model performs stably and well. For example, { apple tree, aphid, 200 }. Then the agent is injected. For example, the agent can be 10% pyrifenoxuron plus pyrifenoxuron, 10% fluvian and pyrifenoxuron cream, 2.5% deltamethrin cream, 10% thiacloprid soluble agent, and the like. When injecting the agent, an appropriate amount of a penetrant such as organosilicone can be added to improve the control effect. During the damage period of borer larvae, inspection of highly susceptible trees is strengthened, and 80% dichlorvos and 40% dimethoate miscible oil are injected into fresh moth holes with an injector.
The SGY base ointment is prepared from raw materials such as diatomite, a thickening agent and a humectant; mixed with 40% omethoate and 1.2% nicotine-matrine and poured into fresh moth holes with an ointment gun, it gives a good control effect. Compared with manual killing, killing by agent injection has the advantage of convenience; compared with root application of agents, injection features little environmental pollution, small dosage and strong safety. Finally, physical control can be implemented by strengthening seedling quarantine, for example by manually killing or trapping pests, together with chemical control and biological control, for example controlling pests by protecting predatory natural enemies or releasing parasitic natural enemies. The formulated pest control management schemes are stored in a database.
And identifying and counting plant diseases and insect pests of the tree species based on FastR-CNN by taking the tree species and the pest database corresponding to each type of tree species as training sets.
First, tree species categories are identified; the tree species categories and the pest library corresponding to each species are taken as the training set, and the characteristic information of the pests corresponding to each species is recorded. VGG16 serves as the feature extraction network: the original image shot by the unmanned aerial vehicle is the input layer, a feature map is output after the pest feature information is extracted, and FastR-CNN performs category analysis and quantity statistics on the pests. The FastR-CNN classification consists of three layers of operations: an RPN network layer, an ROI pooling layer and a softmax layer. The feature map is taken as the input of the RPN layer, which outputs all rectangular candidate boxes containing pests together with the candidate-box scores. The ROI pooling layer takes the feature map and the rectangular candidate boxes as input and extracts the feature information inside each candidate box. Finally, the feature information in the rectangular candidate boxes is compared with the pest feature information of each tree species; the softmax layer classifies the pests in all candidate boxes and outputs the score of the category each box belongs to, recorded as { tree species category, pest category, score }. Given a score threshold, a result is considered trustworthy when the score of the rectangular candidate box is above the threshold. The number of candidate boxes belonging to the same category among all trustworthy results is counted and taken as the number of that pest.
The tree species classification result is obtained and recorded as { tree species, pest category, quantity }. For example, if the tree species is apple tree, the pest library corresponding to apple trees is taken as the training set, i.e., the characteristic information of the apple-tree pests is recorded: spider mites, aphids, scale insects, moths and dung beetles. After a photo is processed by FastR-CNN, 5 rectangular candidate boxes are output, with the following categories and scores: { aphid, 0.692 } { scale insect, 0.614 } { scale insect, 0.309 } { scale insect, 0.666 } { aphid, 0.562 }. With a selected threshold score = 0.5, the trusted results are { aphid, 0.692 } { scale insect, 0.614 } { scale insect, 0.666 } { aphid, 0.562 }. By statistics, the number of aphids is two and the number of scale insects is two.
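The thresholding-and-counting step in the example can be sketched as follows (the function name is the author's; the detector itself is out of scope here — only the post-processing of its { category, score } outputs is shown):

```python
from collections import Counter

def count_trusted_pests(detections, score_threshold):
    """Keep candidate boxes whose score exceeds the threshold and
    count how many boxes fall into each pest category.
    `detections` is a list of (pest_category, score) pairs as output
    by the detector's softmax layer."""
    trusted = [cat for cat, score in detections if score > score_threshold]
    return Counter(trusted)
```

Running it on the five candidate boxes of the apple-tree example reproduces the counts given in the text.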
And step 104, predicting the disease grade index of the next generation according to the time required by the diseases and pests for entering the next generation and the occurrence quantity of the diseases and pests of the next generation, and identifying the severity grade of the diseases and pests of the next generation.
Referring to FIG. 2, the tree species category and pest category are first obtained. The developmental threshold temperature, the effective accumulated temperature and the daily average air temperature of the pest are input, and the time required for the pest to enter the next generation is calculated by the formula N = K/(T − C), where N is the number of days to the next pest generation, K is the effective accumulated temperature, C is the developmental threshold temperature of the pest, and T is the daily average temperature. The occurrence amount of the next pest generation is then calculated, inputting the base number of the current generation, the average egg-laying amount per female, the numbers of female and male pests, and the average per-generation mortality: P = P0 × E × (F/(M + F)) × (1 − m), where P is the occurrence of the next generation, P0 is the base number of the current generation, E is the average egg-laying amount per female, F is the number of female pests, F/(M + F) is the female percentage, M is the number of male pests, m is the average per-generation mortality, and (1 − m) is the average per-generation survival rate. From the time N required to enter the next generation and the next-generation occurrence P, the next-generation grade index T = P/N is predicted. A pest grading threshold is established with levels 1 to 5, level 1 being asymptomatic and level 5 severe, the levels increasing in steps of 1. The pest severity level is identified according to this grading standard and taken as the pest prediction result.
The final result is recorded as { tree species category, pest severity level }. For example, aphids are identified as pests on apple trees. Inputting the effective accumulated temperature K = 462.4, the developmental threshold temperature C = 13.8 and the daily average temperature T = 19 for aphids, N = K/(T − C) = 462.4/(19 − 13.8) ≈ 88.9, so the next generation of aphids appears after about 88.9 days. The occurrence amount of the next aphid generation is then calculated: inputting the current-generation base number P0 = 12, the average egg-laying amount E = 200, the female percentage F/(M + F) = 0.45 and the average per-generation survival rate (1 − m) = 0.4 gives P = P0 × E × (F/(M + F)) × (1 − m) = 12 × 200 × 0.45 × 0.4 = 432. The next-generation aphid grade index is T = P/N = 432/88.9 ≈ 4.86. If the grading thresholds are (level 1, 0) (level 2, 5) (level 3, 10) (level 4, 20) (level 5, 40), each value being the upper bound of its level, the next-generation aphid severity level is 2, recorded as { apple tree, aphid, level 2 }.
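The forecast formulas and the grading lookup can be sketched as below. Function names are the author's; the upper-bound reading of the grading thresholds is an assumption made to match the aphid example (index 4.86 mapping to level 2).

```python
def next_generation_forecast(K, C, T_avg, P0, E, female_ratio, survival):
    """Effective-accumulated-temperature forecast of the next pest
    generation, following the formulas in the text:
      N = K / (T - C)                         days to the next generation
      P = P0 * E * female_ratio * survival    next-generation occurrence
      grade index = P / N
    """
    N = K / (T_avg - C)
    P = P0 * E * female_ratio * survival
    return N, P, P / N

def severity_level(index, bounds=((1, 0), (2, 5), (3, 10), (4, 20), (5, 40))):
    """Map a grade index to a severity level, reading each threshold as
    the upper bound of its level (author's assumption, consistent with
    the aphid example in the text)."""
    for level, bound in bounds:
        if index <= bound:
            return level
    return bounds[-1][0]  # indices beyond the last bound stay at level 5
```

With the aphid inputs (K = 462.4, C = 13.8, T = 19, P0 = 12, E = 200, female ratio 0.45, survival 0.4) this reproduces N ≈ 88.9, P = 432, grade index ≈ 4.86 and severity level 2.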
And 105, acquiring data of forest section moisture, air, illumination and soil.
Forest-section moisture, air, illumination and soil data are recorded through various sensors. The data recording method comprises: acquiring temperature data of the forest section through a temperature sensor; acquiring humidity data through a humidity sensor; acquiring tree illumination data through a light sensor; and acquiring soil pH data, the soil pH value being adjusted with a soil acidity regulator to keep the soil acid-base balance. Air humidity, air pollutant content, temperature, illumination intensity and soil pH value are all objective factors influencing tree growth, and in order to predict tree species survival rates later, these factors need to be recorded. Temperature influences the growth speed of trees, so the temperature sensor acquires the daily temperature change of the forest section. No living thing can do without water; it is a necessity for all organisms. The soil humidity of the forest section is monitored in real time through the humidity sensor, preventing trees from dying because conditions are too dry or too wet. Illumination affects the photosynthesis of trees and is an important growth factor; because some tree species are sensitive to light, excessive or insufficient illumination can make trees sicken. The light sensor therefore acquires the illumination intensity, spectral composition, day length and the like in the forest section, so as to manage tree growth. Finally, the soil acidity-alkalinity is kept in a reasonable range with the soil acidity regulator, improving the survival rate of the trees.
And 106, constructing a growth rate calculation model of each tree species for calculating the survival rate of the tree species.
The growth rate of each tree species is calculated from the plant living environment data and the pest data. The plant living environment data comprise temperature, humidity, illumination, soil pH value, pest count and growth rate, where the growth rate is obtained by the formula: (this tree height measurement − last tree height measurement)/last tree height measurement. All data are normalized, and a linear model is constructed as the tree growth rate calculation model. The model formula is: ((X + Y + Z + K)/4 − J) × a, where X represents temperature, Y humidity, Z illumination data, K soil pH, J the number of collected pests, and a the (relative) growth rate of the tree species. The growth rates of the various tree species are ranked, and each value is recorded. For example, the growth rate of maple is to be calculated, with temperature 35 ℃, humidity 25%RH, illumination 80000 lux, soil pH level 5, 20 pests and growth rate 0.25. After normalization the data are 0.44661, 0.22695, 2.236068, 0.44762, 0.44711 and 0.44778, respectively. The growth rate result of maple is 0.09.
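The linear model and the height-based growth rate can be sketched as follows. This is illustrative only: the patent does not fully specify its normalization scheme, so the usage numbers below are the author's, not the maple example's.

```python
def growth_rate(height_now, height_prev):
    """Relative growth rate between two tree-height measurements."""
    return (height_now - height_prev) / height_prev

def tree_growth_score(X, Y, Z, K, J, a):
    """Linear growth model from the text: ((X + Y + Z + K)/4 - J) * a.
    All inputs are assumed already normalized: X temperature, Y humidity,
    Z illumination, K soil pH, J pest count, a measured growth rate."""
    return ((X + Y + Z + K) / 4 - J) * a
```

For instance, a tree measured at 1.0 m and then 1.25 m has growth rate 0.25; with hypothetical normalized inputs X = Y = Z = K = 0.4, J = 0.1 and a = 0.25 the score is (0.4 − 0.1) × 0.25 = 0.075.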
And step 107, calculating the survival rate of the tree species according to the severity level and the growth rate of the next generation of plant diseases and insect pests of the tree species, and identifying the dominant tree species.
Dominant tree species have high survival rates. First, the pest severity levels of all tree species are estimated by each species' pest prediction model: the higher the severity level, the more obvious the species' growth disadvantage; the lower the severity level, the more obvious its advantage. The growth rate of each tree species is then calculated. The next-generation pest severity level and the growth rate of each tree species are input, and the survival rate is taken as the dominance index of each species, where survival rate = growth rate of the species / pest severity level. The dominance indices of the tree species are sorted; the higher the survival rate, the more dominant the species, i.e., the species that may become the one with the largest coverage area in its forest zone in the future. For example, suppose the growth rate of maple is 9 with a pest severity level of 2, and the growth rate of oak is 6 with a pest severity level of 4. The calculated dominance index of maple is 9/2 = 4.5 and that of oak is 6/4 = 1.5, so maple will become the tree species with the largest coverage area in the forest zone in the future.
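The dominance ranking can be sketched as below (the function name and dict layout are the author's):

```python
def rank_dominance(species):
    """Dominance index = growth rate / pest severity level; the species
    with the highest index is the dominant one.
    `species` maps name -> (growth_rate, pest_severity_level)."""
    indices = {name: g / sev for name, (g, sev) in species.items()}
    return sorted(indices.items(), key=lambda kv: kv[1], reverse=True)
```

With maple (growth rate 9, severity 2) and oak (growth rate 6, severity 4) the ranking puts maple first with index 4.5, matching the worked example.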
And step 108, predicting the mutual coverage of the dominant tree species and other tree species, and determining the time for identifying the tree species again by taking the alternate critical time of the dominant tree species as a node.
The prediction of the degree of mutual coverage among tree species mainly comprises the following steps: calling dominant tree species pictures, calculating the gaps among the tree species, and establishing a prediction model based on a long-short term memory network (LSTM) and a self-attention mechanism layer. The called pictures are the forest-section pictures taken in the 10 shots preceding the current one. The gaps among the tree species, i.e., the uncovered areas among them, are calculated. The model is established by taking the change of coverage area between every two shots, the shooting time, and the growth rate of each tree species in each shot as input vectors of the model. The structure of the prediction model is mainly an LSTM layer connected with a self-attention mechanism layer. Based on this prediction model of the degree of mutual coverage among tree species, the next mutual-coverage critical point is found; the tree species coverage area at the critical point is taken as the maximum coverage, and the time of the critical point is set as the next tree species identification time. Since the change in the uncovered area is closely related to the growth rates and to the shooting interval, the gaps among tree species, i.e., the uncovered areas, and the growth rates of the tree species are used as input features. Because the growth rates of the various tree species change nonlinearly, the LSTM model gives more accurate predictions in this situation; and because time bears strongly on the change of the uncovered area, an LSTM model built on the time series further improves the accuracy of the model.
Furthermore, with 10 input photos the input data are relatively large, and to let the model adaptively learn the weights among the input features of the different tree species, a self-attention mechanism layer is added to the model. For example, if the dominant tree species is maple, a forest-stand picture is taken every 10 days, the 10 shots preceding the current one are called, and for each shot the gaps between the maple trees and the shooting time are recorded. Suppose the LSTM and self-attention prediction model yields the critical-point value (70% coverage area, 360 days): this indicates that the maple reaches its maximum coverage after 360 days, at which time the coverage degree is 70%; after 360 days the dominance of maple declines and the next dominant species emerges, so that moment is the time to identify the tree species again, judge the identification period, judge tree species survival rates and distribution, and establish a targeted prevention scheme for tree pests and diseases.
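The self-attention weighting applied to the LSTM outputs can be illustrated with a minimal numpy sketch. The patent gives no layer dimensions or weights, so the shapes below (10 time steps for the 10 historical shots, hidden size 8) and the projection matrices are the author's assumptions; a real model would learn Wq, Wk, Wv jointly with the LSTM.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of LSTM hidden
    states H (T, d): each time step re-weights all the others, letting
    the model learn which of the 10 historical shots matter most."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    weights = softmax(Q @ K.T / np.sqrt(K.shape[-1]), axis=-1)
    return weights @ V, weights
```

Each row of `weights` sums to 1, so the layer outputs a convex combination of the per-shot features — exactly the adaptive weighting of input features the text describes.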
Claims (9)
1. A tree coverage prediction method based on angular second moment and deep learning is characterized by comprising the following steps:
acquiring forest stand image data based on an unmanned aerial vehicle; learning tree species classification based on the angular second moment and the texture features; based on the pest and disease identification result and the pest situation monitoring result of each tree species, the control management scheme data is determined and stored, and the method specifically comprises the following steps: identifying and counting plant diseases and insect pests of the tree species based on FastR-CNN by taking the tree species and a disease and insect library corresponding to each type of tree species as a training set; predicting the disease grade index of the next generation of diseases and insect pests according to the time required by the diseases and insect pests for entering the next generation and the occurrence quantity of the diseases and insect pests for the next generation, and identifying the severity grade of the diseases and insect pests for the next generation; acquiring data of forest section moisture, air, illumination and soil; constructing a growth rate calculation model of each tree species for calculating the survival rate of the tree species; calculating the survival rate of the tree species according to the serious level and the growth rate of the next generation of plant diseases and insect pests of the tree species, and identifying the dominant tree species; predicting the mutual coverage of the dominant tree species and other tree species, and determining the time for identifying the tree species again by taking the alternate critical time of the dominant tree species as a node;
calculating the growth rate of each tree species according to the plant living environment data and the pest and disease data; constructing a linear model as a tree growth rate calculation model; the model formula is as follows: ((X + Y + Z + K)/4 − J) × a, wherein X represents temperature, Y represents humidity, Z represents illumination data, K represents soil pH value, J represents the number of collected pests, and a represents tree species growth rate; wherein the growth rate is obtained by the formula: (this measured tree height − last measured tree height)/last measured tree height;
estimating the pest severity level of all the tree species according to the pest prediction model of each tree species; wherein survival = growth rate of individual tree species/pest severity level;
the prediction of the degree of the tree species mutual coverage mainly comprises the following steps: calling dominant tree species pictures, calculating gaps among the tree species, and establishing a prediction model based on a long-short term memory network and a self-attention mechanism layer; calculating gaps among the tree species, namely uncovered areas among the tree species; establishing a model by taking the change of the coverage area between every two times, the shooting time and the growth rate of each tree species shot at each time as input vectors of the model; the structure of the prediction model based on the LSTM and the self-attention mechanism layer is mainly the LSTM layer and is connected with the self-attention mechanism.
2. The method of claim 1, wherein the drone-based acquisition of forest segment image data comprises:
presetting a route in advance, and patrolling the forest section by unmanned aerial vehicle with full coverage and no blind spots; shooting an image every 100 meters by the unmanned aerial vehicle, and shooting from a plurality of different heights and different angles to obtain an image of the current forest stand; after the image is obtained, carrying out picture name, place name, outline and in-picture description labeling on the image; forming an image of the whole forest section by stitching; if an RGB image is obtained by shooting, the RGB image needs to be converted into a grayscale image; under the condition of ensuring the clearness of the texture features of the image, the gray level of the image is coarsely quantized.
3. The method of claim 1, wherein the learning of the treelike classification based on angular second moments and texture features comprises:
the angular second moment is defined by using the joint probability density of the pixels at two positions, reflects the distribution characteristic of the brightness as well as the position distribution characteristic among pixels with the same or similar brightness, and is a second-order statistical characteristic related to the brightness change of the image; it is the basis for defining a set of texture features; utilizing the angular second moment to establish the texture characteristics of the forest segment image, and taking the texture characteristics as the input of a Support Vector Machine (SVM); carrying out tree species classification on the extracted texture feature vectors based on the Support Vector Machine (SVM); a Support Vector Machine (SVM) is a generalized linear classifier for binary classification of data in a supervised learning mode, and its decision boundary is the maximum-margin hyperplane solved from the learning samples.
4. The method of claim 1, wherein determining and storing control management plan data based on pest identification results and pest monitoring results for various species of trees comprises:
acquiring tree species categories and a disease and pest library corresponding to each type of tree species, and recording as a tree species category: types of diseases and pests }; the pest and disease damage condition monitoring method mainly comprises the following steps: capturing diseases and pests, shooting the diseases and pests, calculating the number of the diseases and pests, identifying the types of the diseases and pests according to the tree seed disease and pest library, killing the diseases and pests, and establishing a corresponding prevention and control strategy; the device for capturing the diseases and the pests is mainly a color plate trapping device which mainly comprises more than two yellow plates, the two color plates are mutually installed in a crossed manner, and the crossed axis is coaxial with the cylindrical power grid; shooting pictures of the diseases and the pests by using a camera; the method for identifying the types of the diseases and the pests is based on a deep learning model, the VGG16 is used for extracting the characteristics in the pictures of the diseases and the pests, and FastR-CNN is used for carrying out category analysis and quantity statistics on the diseases and the pests according to a disease and pest library corresponding to each given type of tree; recording the final result as { tree species category, pest category and pest quantity }; different corresponding schemes are established for different types of diseases and pests so as to achieve the effect of preventing and treating the diseases and the pests and the larvae thereof; the method for killing the diseases and the pests is medicament injection; by adopting a novel high-pressure injector, a medicament with stronger systemic property, permeability and fumigation effect is injected into the trunk by punching, so that diseases and insects can be effectively prevented and controlled; finally, storing different types of pest control management scheme data; 
the method comprises the following steps: identifying and counting plant diseases and insect pests of the tree species based on FastR-CNN by taking the tree species and a disease and insect library corresponding to each type of tree species as a training set;
the disease and pest database corresponding to the tree species and each type of tree species is used as a training set, and plant diseases and pest damage of the tree species are identified and counted based on FastR-CNN, and the method specifically comprises the following steps:
first, the tree species categories are identified, the tree species categories and the pest library corresponding to each tree species category are taken as the training set, and the feature information of the pests corresponding to each tree species category is recorded; VGG16 is used as the feature extraction network, the original image captured by the unmanned aerial vehicle is the input layer, a feature map is output after the pest feature information is extracted, and Fast R-CNN then performs category analysis and quantity statistics on the pests; the Fast R-CNN classifier consists of three layers: an RPN layer, an ROI pooling layer and a softmax layer; the feature map is fed to the RPN layer, which outputs all rectangular candidate boxes containing pests together with their scores; the ROI pooling layer takes the feature map and the rectangular candidate boxes as input and extracts the feature information inside each box; finally, the feature information in each rectangular candidate box is compared with the pest feature information of each tree species category, the softmax layer classifies the pests in all candidate boxes and outputs the score of the category each box belongs to, recorded as { tree species category, pest category, score }; given a score threshold, a result is considered trustworthy when the score of the rectangular candidate box exceeds the threshold; the number of candidate boxes belonging to the same category among all trustworthy results is counted and taken as the pest quantity; the tree species classification result is obtained and recorded
as { tree species, pest category, quantity }.
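The thresholding-and-counting rule at the end of the step above can be sketched as follows. This is an illustrative post-processing sketch, not the patent's implementation: the `Detection` record, the names `count_pests` and the example species/pest labels and scores are all assumptions; the patent only fixes the rule "keep boxes whose score exceeds a threshold, then count boxes per category".

```python
from collections import Counter
from typing import NamedTuple

class Detection(NamedTuple):
    # One scored rectangular candidate box, as output by the softmax layer:
    # { tree species category, pest category, score }
    tree_species: str
    pest_category: str
    score: float

def count_pests(detections, score_threshold=0.5):
    """Keep only trustworthy boxes (score above threshold) and count
    boxes per (tree species, pest category), as the claim describes."""
    trusted = [d for d in detections if d.score > score_threshold]
    return Counter((d.tree_species, d.pest_category) for d in trusted)

dets = [
    Detection("pine", "bark beetle", 0.92),
    Detection("pine", "bark beetle", 0.81),
    Detection("pine", "aphid", 0.40),  # below threshold, discarded
]
print(count_pests(dets))  # Counter({('pine', 'bark beetle'): 2})
```

The score threshold 0.5 here is arbitrary; the claim leaves the threshold value open.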
5. The method of claim 1, wherein said predicting a next-generation pest severity index from the time required for the pests to enter the next generation and the next-generation pest occurrence, and identifying the next-generation pest severity level, comprises:
first, the tree species category and pest category are obtained; the developmental threshold temperature, the effective accumulated temperature and the daily mean air temperature of the pest are input, and the time required for the pest to enter the next generation is calculated as N = K / (T − C), where N is the number of development days of the next pest generation, K is the effective accumulated temperature, C is the developmental threshold temperature of the pest, and T is the daily mean temperature; then the next-generation occurrence of the pest is calculated; the current-generation pest base, the average egg-laying amount of a female pest, the numbers of female and male pests and the average per-generation mortality are input into P = P0 × E × (F / (M + F)) × (1 − m), where P is the next-generation pest occurrence, P0 is the current-generation pest base, E is the average egg-laying amount per female, F is the number of female pests, F / (M + F) is the proportion of females, M is the number of male pests, m is the average per-generation mortality, and (1 − m) is the average per-generation survival rate; the next-generation pest severity index is predicted as I = P / N from the time N required for the pest to enter the next generation and the next-generation occurrence P; pest grading thresholds are set on a scale of 1 to 5, where grade 1 is asymptomatic and grade 5 is extremely serious, the grades increasing in steps of 1; the pest severity level is identified against this grading standard and taken as the pest prediction result; the final result is recorded as { tree species category, pest severity level }.
6. The method of claim 1, wherein said obtaining forest zone moisture, air, light, soil data comprises:
the forest section moisture, air, illumination and soil data are recorded through various sensors, the data recording method being as follows: the temperature data of the forest section are acquired through a temperature sensor; the humidity data of the forest section are acquired through a humidity sensor; the tree illumination data are acquired through a light sensor; and the soil pH value is adjusted with a soil acidity regulator to keep the soil acid-base balance.
7. The method of claim 1, wherein said constructing a growth rate calculation model for each tree species for calculating tree species survival comprises:
the plant living-environment data comprise temperature, humidity, illumination, soil pH value, pest quantity and growth rate; all data are normalized; the tree species are ranked by growth rate, and the growth rate value of each tree species is recorded.
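The normalization and ranking step above can be sketched as follows, assuming min-max normalization (the claim does not name the scheme) and with invented species names and growth rate values.

```python
def min_max(values):
    """Scale a list of sensor readings to [0, 1] (min-max normalization)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

# Hypothetical per-species growth rates, ranked highest first.
growth_rate = {"pine": 0.8, "birch": 1.4, "oak": 1.1}
ranked = sorted(growth_rate, key=growth_rate.get, reverse=True)

print(min_max([10, 20, 30]))  # [0.0, 0.5, 1.0]
print(ranked)                 # ['birch', 'oak', 'pine']
```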
8. The method of claim 1, wherein said calculating tree species survival from the next-generation pest severity level and growth rate of each tree species and identifying the dominant tree species comprises:
a dominant tree species has a high survival rate; the higher the pest severity level, the more marked the growth disadvantage of the tree species, and the lower the pest severity level, the more marked its advantage; the growth rate of each tree species is then calculated; the next-generation pest severity level and the growth rate of each tree species are input; the survival rate is used as the dominance index of each tree species category, the dominance indices of the tree species are ranked, and the tree species with the highest survival rate becomes the dominant tree species, which will have the largest coverage area in its forest section in the future.
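The claim above combines the severity level (1 to 5) and the growth rate into a survival rate but gives no closed form; one illustrative choice that matches its stated monotonicity (higher severity lowers survival, higher growth raises it) is a simple ratio. The function name and all numbers below are made up.

```python
def survival_rate(growth_rate, severity_level):
    """Illustrative dominance index: grows with growth rate,
    shrinks with next-generation pest severity level (1..5)."""
    return growth_rate / severity_level

# Hypothetical (growth rate, severity level) per species.
species = {"pine": (0.8, 4), "birch": (1.4, 2), "oak": (1.1, 3)}
scores = {name: survival_rate(g, s) for name, (g, s) in species.items()}
dominant = max(scores, key=scores.get)
print(dominant)  # birch (0.7 vs 0.2 for pine, ~0.37 for oak)
```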
9. The method of claim 1, wherein said predicting the mutual coverage of the dominant tree species and the other tree species, and determining the time for re-identifying the tree species with the critical time of dominant-tree-species alternation as the node, comprises:
the retrieved pictures are the forest section pictures from the 10 shots preceding the current shot; based on a prediction model of the mutual coverage between tree species, the critical point of mutual coverage between successive tree species is found; the tree species coverage area at the critical point is taken as the maximum coverage, and the time of the critical point is set as the next tree species identification time.
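The critical-point search above can be sketched as follows, assuming coverage fractions per survey for the current dominant species and a challenger; the critical point is taken as the first survey at which the challenger's coverage reaches the dominant species' coverage. The function name and the coverage series are invented for illustration.

```python
def coverage_critical_point(dominant, challenger):
    """Return the index of the first survey at which the challenger's
    coverage fraction reaches that of the dominant species, i.e. the
    critical point of dominant-tree-species alternation; None if no
    crossover occurs within the series."""
    for i, (d, c) in enumerate(zip(dominant, challenger)):
        if c >= d:
            return i  # schedule re-identification at this survey
    return None

# Hypothetical coverage fractions over the last 5 surveys.
dom = [0.60, 0.58, 0.55, 0.51, 0.48]
cha = [0.30, 0.35, 0.42, 0.50, 0.53]
print(coverage_critical_point(dom, cha))  # 4
```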
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210788208.9A CN115358440B (en) | 2022-07-06 | 2022-07-06 | Tree coverage prediction method based on angular second moment and deep learning |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115358440A CN115358440A (en) | 2022-11-18 |
CN115358440B true CN115358440B (en) | 2023-04-18 |
Family
ID=84029996
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210788208.9A Active CN115358440B (en) | 2022-07-06 | 2022-07-06 | Tree coverage prediction method based on angular second moment and deep learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115358440B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117436656A (en) * | 2023-11-02 | 2024-01-23 | 星景科技有限公司 | Urban landscaping maintenance decision support method and system based on artificial intelligence |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3865764B1 (en) * | 2006-08-15 | 2007-01-10 | アルスマエヤ株式会社 | Forest resource survey method and forest resource survey apparatus |
CN109492563A (en) * | 2018-10-30 | 2019-03-19 | 深圳大学 | A kind of tree species classification method based on unmanned plane Hyperspectral imaging and LiDAR point cloud |
CN113591766A (en) * | 2021-08-09 | 2021-11-02 | 中国林业科学研究院资源信息研究所 | Multi-source remote sensing tree species identification method for unmanned aerial vehicle |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11361544B2 (en) * | 2017-05-22 | 2022-06-14 | State Farm Mutual Automobile Insurance Company | Systems and methods for determining building damage |
CN109409429B (en) * | 2018-10-28 | 2021-08-20 | 南京林业大学 | Tree species classification method based on laser radar point cloud data |
CN114201636A (en) * | 2021-12-10 | 2022-03-18 | 安吉县自然资源和规划局 | Forest pest disaster occurrence prediction method based on big data analysis |
CN114332630A (en) * | 2022-01-12 | 2022-04-12 | 成都理工大学 | Multi-stage accurate screening fruit tree pest and disease monitoring system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||