CN111638216A - Beet-related disease analysis method for unmanned aerial vehicle system for monitoring plant diseases and insect pests - Google Patents
Beet-related disease analysis method for unmanned aerial vehicle system for monitoring plant diseases and insect pests
- Publication number
- CN111638216A (application CN202010607427.3A)
- Authority
- CN
- China
- Prior art keywords
- layer
- beet
- unmanned aerial
- aerial vehicle
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N2021/8466—Investigation of vegetal material, e.g. leaves, plants, fruits
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8887—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Data Mining & Analysis (AREA)
- Chemical & Material Sciences (AREA)
- Artificial Intelligence (AREA)
- Pathology (AREA)
- Immunology (AREA)
- Biochemistry (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Analytical Chemistry (AREA)
- Biophysics (AREA)
- Computing Systems (AREA)
- Computational Linguistics (AREA)
- Molecular Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biomedical Technology (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Catching Or Destruction (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The invention discloses an unmanned aerial vehicle for monitoring plant diseases and insect pests and a method for analyzing beet-related diseases. The center of the fixed ring (2) penetrates the frame (1); the bottom end of the frame (1) is connected with the unmanned aerial vehicle intelligent monitoring module (6); the upper end of the fixed ring (2) is provided with a group of fixed frames (3), each of which is connected with one rotor (4); and the bottom end of the fixed ring (2) is provided with a group of support frames (5). The invention solves the problem of accurately and rapidly analyzing the growth stage and the degree of damage of crops.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicle monitoring, in particular to a beet related disease analysis method of an unmanned aerial vehicle system for monitoring plant diseases and insect pests.
Background
Diseases and insect pests are among the main disasters affecting the planting industry. They are not only as harmful and destructive as flood disasters, but also have the particularity of biological disasters, being long-term and difficult to control. Monitoring and preventing agricultural diseases and pests is an important component of national disaster-reduction projects and of agricultural work; vigorously carrying out disease and pest prevention in the planting industry can greatly improve the yield of agricultural products.
In most existing crop monitoring approaches, workers enter the farmland and observe the crops with the naked eye. This mode of monitoring cannot satisfy the requirements of comprehensive and rapid monitoring, cannot cover every stage of the growth process, and cannot accurately and rapidly determine the growth stage and the degree of damage of the crops.
Disclosure of Invention
The invention provides an unmanned aerial vehicle for monitoring plant diseases and insect pests and a method for analyzing beet-related diseases, which solve the problem that existing crop monitoring methods cannot accurately and rapidly analyze the growth stage and the degree of damage of crops.
The invention is realized by the following technical scheme:
an unmanned aerial vehicle system for monitoring plant diseases and insect pests is disclosed, wherein the unmanned aerial vehicle comprises a frame 1, a fixed ring 2, fixed frames 3, rotor wings 4, support frames 5 and an unmanned aerial vehicle intelligent monitoring module 6, the center of the fixed ring 2 penetrates through the frame 1, the bottom end of the frame 1 is connected with the unmanned aerial vehicle intelligent monitoring module 6, the upper end of the fixed ring 2 is also provided with a group of fixed frames 3, each fixed frame 3 is connected with one rotor wing 4, and the bottom end of the fixed ring 2 is provided with a group of support frames 5;
unmanned aerial vehicle intelligent monitoring module 6 includes unmanned aerial vehicle unit and image analysis unit, unmanned aerial vehicle unit is used for controlling unmanned aerial vehicle, carries out information acquisition and transmission simultaneously, image analysis unit is used for algorithm processing image and analysis image.
A method for analyzing beet-related diseases using the unmanned aerial vehicle system for monitoring plant diseases and insect pests, the analysis method comprising the following steps:
Step 1: collect the characteristics of beets in a normal state and in various pest and disease states of different degrees, using pictures of the normal growth state of the beets in each growth period and of beets damaged by various pests and diseases to different degrees;
Step 2: construct filter1, corresponding to the first convolution layer in the TensorFlow model, through the architecture module;
Step 3: use the filter1 set in step 2 to differentiate the 3-channel beet images in the normal state and in various pest and disease states of different degrees, convolving the 3-channel beet images into an array of depth 64;
Step 4: enter a pooling layer and obtain an array reflecting the information with a max-pooling strategy; the pooling layer in the convolutional network model reduces the size of the model, thereby speeding up computation and improving the robustness of the extracted features;
Step 5: take the array of step 4 as the input of this step's convolution layer, and perform deeper, finer-grained, higher-level unit differentiation and feature extraction on the image data set of beets in the normal state and in various pest and disease states of different degrees; tune the parameters of the convolution filter2 used in this layer with the TensorFlow model of step 2, finally selecting a filter2 with dimension parameters [5, 5, 64, 128] to filter the input;
Step 6: perform 2-D processing on the feature unit array extracted in step 5 (unit array S4), and output a 50 × 50 unit set of depth 128;
Step 7: take the 50 × 50 unit set of depth 128 from step 6 as the input of this step's convolution layer, and perform deeper, finer-grained, higher-level unit differentiation and feature extraction on the image data set of beets in the normal state and in various pest and disease states of different degrees; tune the parameters of the convolution filter3 applied in this layer with the TensorFlow model of step 2, finally selecting a filter3 with dimension parameters [25, 25, 25, 128] to filter the input;
Step 8: flatten and tile the result units after the filtering operation of step 7;
Step 9: use the first fully-connected layer to mark the reduced feature nodes for the first time;
Step 10: use the second fully-connected layer to mark the feature nodes marked in step 9, reducing the feature nodes a second time;
Step 11: adopt a dropout layer to reduce the degree of over-fitting and under-fitting of the twice-marked feature nodes obtained in step 10;
Step 12: adjust each parameter obtained in steps 2-11, taking cross entropy as the loss function;
Step 13: finally, sum the features obtained in step 12, assign corresponding weights to the sum to construct a classification function, and feed back through the model to output the identification result.
Further, in step 10, the output of the first fully-connected layer is used to perform the second reduction: this fully-connected layer narrows the 1024 feature nodes of the first reduction down to 512 features.
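The convolution and pooling steps above imply a concrete shape flow. The sketch below traces it in plain Python. The input size of 100 × 100 pixels and the 5 × 5 spatial extent of filter1 are not stated in the text and are assumptions here, as is 'same' padding; under those assumptions the shapes reproduce the 50 × 50, depth-128 unit set named in step 6.

```python
def conv_same(shape, filt):
    # 'same'-padded convolution keeps the spatial size;
    # depth becomes the filter's output-channel count.
    h, w, c_in = shape
    kh, kw, f_in, f_out = filt
    assert c_in == f_in, "filter input channels must match input depth"
    return (h, w, f_out)

def max_pool(shape, k=2):
    # Non-overlapping k x k max pooling divides each spatial dimension by k.
    h, w, c = shape
    return (h // k, w // k, c)

shape = (100, 100, 3)                      # hypothetical input size (not given in the text)
shape = conv_same(shape, (5, 5, 3, 64))    # step 3: 3-channel image -> depth-64 array
shape = max_pool(shape)                    # step 4: max-pooling layer
shape = conv_same(shape, (5, 5, 64, 128))  # step 5: filter2 with dimensions [5, 5, 64, 128]
print(shape)                               # (50, 50, 128), the unit set of step 6
```

Flattening that unit set in step 8 would yield 50 × 50 × 128 = 320,000 values, which the two fully-connected layers of steps 9-10 then reduce to 1024 and 512 feature nodes.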
The invention has the beneficial effects that:
According to the invention, by arranging the communication module and the information display APP, the beet planting conditions can be displayed in the APP and targeted fertilization, pesticide or irrigation suggestions can be given. By arranging the data module, the architecture module and the analysis module, the growth condition of each beet plant in the planting area can be analyzed. By arranging the automatic cruise module and the camera module, photos of the corresponding plants can be returned; following a deep-learning approach, image recognition technology is used to analyze the growth information of the plants, monitor whether drought, waterlogging, diseases or insect pests occur, provide this information to farmers in real time, and give corresponding improvement suggestions. By arranging the image analysis unit, models of beet-related diseases can be trained, improving farmers' understanding of the different degrees of damage to beets at each growth stage.
Drawings
FIG. 1 is a schematic structural view of the present invention.
FIG. 2 is a diagram of the improved Le-Net5 neural network of the present invention.
Fig. 3 is a flow chart of adjusting parameters in the present invention.
FIG. 4 is a flow chart of the present invention for analyzing and determining a neural network model using Le-Net 5.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An unmanned aerial vehicle system for monitoring plant diseases and insect pests is disclosed. The unmanned aerial vehicle comprises a frame 1, a fixed ring 2, fixed frames 3, rotors 4, support frames 5 and an unmanned aerial vehicle intelligent monitoring module 6. The center of the fixed ring 2 penetrates the frame 1; the bottom end of the frame 1 is connected with the unmanned aerial vehicle intelligent monitoring module 6; the upper end of the fixed ring 2 is provided with a group of fixed frames 3, each of which is connected with one rotor 4; and the bottom end of the fixed ring 2 is provided with a group of support frames 5;
the unmanned aerial vehicle intelligent monitoring module 6 comprises an unmanned aerial vehicle unit and an image analysis unit; the unmanned aerial vehicle unit controls the unmanned aerial vehicle while acquiring and transmitting information, and the image analysis unit processes and analyzes images algorithmically.
The unmanned aerial vehicle unit comprises an automatic cruise module, a camera module and a communication module;
the automatic cruise module is used for navigating the unmanned aerial vehicle; by setting a program in the module, the unmanned aerial vehicle flies along a set path;
the camera module is used for photographing the beets and the locations where the unmanned aerial vehicle is situated, and cooperates with the automatic cruise module to assist the flight of the unmanned aerial vehicle;
the communication module sends information to the staff of the unmanned aerial vehicle base station and can provide an information display APP for farmers to check the daily growth state of the beets;
the image analysis unit comprises a data module, an architecture module and an analysis module;
the data module is used for storing beet related data;
the architecture module is used for establishing a model for analyzing the image;
the analysis module processes and analyzes the image through the model;
the image analysis unit, by identifying and analyzing the images, determines through a background algorithm which kind of plant disease, insect pest, drought or waterlogging condition an image specifically corresponds to;
the unmanned aerial vehicle unit uses image recognition technology and relies on the camera module mounted on the unmanned aerial vehicle to transmit the conditions in the plant growing area back in real time; the unmanned aerial vehicle unit then analyzes the growth condition of each plant in the planting area through the algorithm.
A method for analyzing beet-related diseases using the unmanned aerial vehicle system for monitoring plant diseases and insect pests, the analysis method comprising the following steps:
Step 1: collect the characteristics of beets in a normal state and in various pest and disease states of different degrees, using pictures of the normal growth state of the beets in each growth period and of beets damaged by various pests and diseases to different degrees;
Step 2: construct filter1, corresponding to the first convolution layer in the TensorFlow model, through the architecture module;
Step 3: use the filter1 set in step 2 to differentiate the 3-channel beet images in the normal state and in various pest and disease states of different degrees, convolving the 3-channel beet images into an array of depth 64;
Step 4: enter a pooling layer and obtain an array reflecting the information with a max-pooling strategy; the pooling layer in the convolutional network model reduces the size of the model, thereby speeding up computation and improving the robustness of the extracted features. This layer extracts salient feature values of the beet contours in the normal state and in various pest and disease states of different degrees, expressed as feature units with dimensions [A, B, C, D];
Step 5: take the array of step 4 as the input of this step's convolution layer, and perform deeper, finer-grained, higher-level unit differentiation and feature extraction on the image data set of beets in the normal state and in various pest and disease states of different degrees; tune the parameters of the convolution filter2 used in this layer with the TensorFlow model of step 2, finally selecting a filter2 with dimension parameters [5, 5, 64, 128] to filter the input. After filtering by filter2, the picture data of step 4 is differentiated into units of depth 128;
Step 6: perform 2-D processing, using conventional data-processing methods, on the feature unit array extracted in step 5 (unit array S4), and output a 50 × 50 unit set of depth 128;
Step 7: take the 50 × 50 unit set of depth 128 from step 6 as the input of this step's convolution layer, and perform deeper, finer-grained, higher-level unit differentiation and feature extraction on the image data set of beets in the normal state and in various pest and disease states of different degrees; tune the parameters of the convolution filter3 applied in this layer with the TensorFlow model of step 2, finally selecting a filter3 with dimension parameters [25, 25, 25, 128] to filter the input;
Step 8: flatten and tile the result units after the filtering operation of step 7; this lays the groundwork for node marking of the beet disease features in the subsequent fully-connected layers, so that the features of beets in the normal state and in various pest and disease states of different degrees are deeply differentiated, and the features of beets in different states, including external contour and color features, are extracted;
Step 9: use the first fully-connected layer to mark the reduced feature nodes for the first time;
Step 10: use the second fully-connected layer to mark the feature nodes marked in step 9, reducing the feature nodes a second time;
Step 11: adopt a dropout layer to reduce the degree of over-fitting and under-fitting of the twice-marked feature nodes obtained in step 10;
Step 12: adjust each parameter obtained in steps 2-11, taking cross entropy as the loss function;
Step 13: finally, sum the features obtained in step 12, assign corresponding weights to the sum to construct a classification function, and feed back through the model to output the identification result.
Further, in step 10, the output of the first fully-connected layer is used to perform the second reduction: this fully-connected layer narrows the 1024 feature nodes of the first reduction down to 512 features.
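Steps 12 and 13 pair a cross-entropy loss with a weighted classification function. A minimal NumPy sketch of that loss is given below; the three-class setup (e.g. normal / mild disease / severe disease) and the example logit values are illustrative assumptions, not taken from the text:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis (step 13's
    # classification function, here with uniform weights).
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, labels):
    # Mean negative log-likelihood of the true classes (step 12's loss).
    p = softmax(logits)
    return float(-np.log(p[np.arange(len(labels)), labels]).mean())

# Illustrative logits for two beet images over three hypothetical classes.
logits = np.array([[4.0, 0.5, 0.1],
                   [0.2, 3.0, 0.3]])
labels = np.array([0, 1])  # ground-truth class indices
loss = cross_entropy(logits, labels)
print(loss)
```

Because both predictions already favor the true classes, the loss is small; training in step 12 would adjust each parameter from steps 2-11 (e.g. by gradient descent) to drive this value down further.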
Claims (3)
1. The unmanned aerial vehicle system for monitoring the diseases and insect pests is characterized in that the unmanned aerial vehicle comprises a rack (1), a fixed ring (2), fixed frames (3), rotors (4), support frames (5) and an unmanned aerial vehicle intelligent monitoring module (6), wherein the center of the fixed ring (2) penetrates through the rack (1), the bottom end of the rack (1) is connected with the unmanned aerial vehicle intelligent monitoring module (6), the upper end of the fixed ring (2) is also provided with a group of fixed frames (3), each fixed frame (3) is connected with one rotor (4), and the bottom end of the fixed ring (2) is provided with a group of support frames (5);
the unmanned aerial vehicle intelligent monitoring module (6) comprises an unmanned aerial vehicle unit and an image analysis unit, wherein the unmanned aerial vehicle unit is used for controlling the unmanned aerial vehicle, collecting information and transmitting information simultaneously, and the image analysis unit is used for processing images and analyzing the images through algorithms.
2. A method for analyzing beet-related diseases using the unmanned aerial vehicle system for monitoring plant diseases and insect pests of claim 1, the method comprising the following steps:
Step 1: collect the characteristics of beets in a normal state and in various pest and disease states of different degrees, using pictures of the normal growth state of the beets in each growth period and of beets damaged by various pests and diseases to different degrees;
Step 2: construct filter1, corresponding to the first convolution layer in the TensorFlow model, through the architecture module;
Step 3: use the filter1 set in step 2 to differentiate the 3-channel beet images in the normal state and in various pest and disease states of different degrees, convolving the 3-channel beet images into an array of depth 64;
Step 4: enter a pooling layer and obtain an array reflecting the information with a max-pooling strategy; the pooling layer in the convolutional network model reduces the size of the model, thereby speeding up computation and improving the robustness of the extracted features;
Step 5: take the array of step 4 as the input of this step's convolution layer, and perform deeper, finer-grained, higher-level unit differentiation and feature extraction on the image data set of beets in the normal state and in various pest and disease states of different degrees; tune the parameters of the convolution filter2 used in this layer with the TensorFlow model of step 2, finally selecting a filter2 with dimension parameters [5, 5, 64, 128] to filter the input;
Step 6: perform 2-D data processing on the feature unit array extracted in step 5 (unit array S4), and output a 50 × 50 unit set of depth 128;
Step 7: take the 50 × 50 unit set of depth 128 from step 6 as the input of this step's convolution layer, and perform deeper, finer-grained, higher-level unit differentiation and feature extraction on the image data set of beets in the normal state and in various pest and disease states of different degrees; tune the parameters of the convolution filter3 applied in this layer with the TensorFlow model of step 2, finally selecting a filter3 with dimension parameters [25, 25, 25, 128] to filter the input;
Step 8: flatten and tile the result units after the filtering operation of step 7;
Step 9: use the first fully-connected layer to mark the reduced feature nodes for the first time;
Step 10: use the second fully-connected layer to mark the feature nodes marked in step 9, reducing the feature nodes a second time;
Step 11: adopt a dropout layer to reduce the degree of over-fitting and under-fitting of the twice-marked feature nodes obtained in step 10;
Step 12: adjust each parameter obtained in steps 2-11, taking cross entropy as the loss function;
Step 13: finally, sum the features obtained in step 12, assign corresponding weights to the sum to construct a classification function, and feed back through the model to output the identification result.
3. The method for analyzing beet-related diseases according to claim 2, wherein in step 10 the output of the first fully-connected layer is used to perform the second reduction, this fully-connected layer narrowing the 1024 feature nodes of the first reduction down to 512 features.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010607427.3A CN111638216A (en) | 2020-06-30 | 2020-06-30 | Beet-related disease analysis method for unmanned aerial vehicle system for monitoring plant diseases and insect pests |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010607427.3A CN111638216A (en) | 2020-06-30 | 2020-06-30 | Beet-related disease analysis method for unmanned aerial vehicle system for monitoring plant diseases and insect pests |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111638216A true CN111638216A (en) | 2020-09-08 |
Family
ID=72332180
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010607427.3A Pending CN111638216A (en) | 2020-06-30 | 2020-06-30 | Beet-related disease analysis method for unmanned aerial vehicle system for monitoring plant diseases and insect pests |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111638216A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112370016A (en) * | 2020-11-04 | 2021-02-19 | 安晓玲 | Method for predicting health by fusing living environment information and body physiological parameter information |
CN114295614A (en) * | 2021-12-31 | 2022-04-08 | 湖北省农业科学院农业质量标准与检测技术研究所 | Tea tree pest and disease detection vehicle, detection device, detection system and detection method |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104794898A (en) * | 2015-04-30 | 2015-07-22 | 山东大学 | Special-region band-type private network transportation communication navigation monitoring and warning device and working method |
CN205940634U (en) * | 2016-08-15 | 2017-02-08 | 幻飞智控科技(上海)有限公司 | Environmental monitoring unmanned aerial vehicle |
CN206750135U (en) * | 2017-04-21 | 2017-12-15 | 黄鹏 | The dedicated unmanned machine of forestry pests & diseases monitoring preventing and treating |
CN108389354A (en) * | 2018-04-16 | 2018-08-10 | 南京森林警察学院 | A kind of method of unmanned plane joint ground micro robot detection forest ground fire |
CN208079019U (en) * | 2018-04-16 | 2018-11-09 | 哈尔滨哈程电气科技发展有限公司 | A kind of photovoltaic plant inspection device based on unmanned plane remote control and regulation |
CN109948693A (en) * | 2019-03-18 | 2019-06-28 | 西安电子科技大学 | Expand and generate confrontation network hyperspectral image classification method based on super-pixel sample |
CN209159986U (en) * | 2018-09-29 | 2019-07-26 | 比亚迪股份有限公司 | Unmanned plane |
CN209382263U (en) * | 2018-12-06 | 2019-09-13 | 胡良柏 | A kind of project planning remote sensing mapping aircraft |
CN110309762A (en) * | 2019-06-26 | 2019-10-08 | 扆亮海 | A kind of forestry health assessment system based on air remote sensing |
CN110427922A (en) * | 2019-09-03 | 2019-11-08 | 陈�峰 | One kind is based on machine vision and convolutional neural networks pest and disease damage identifying system and method |
CN210037304U (en) * | 2019-04-04 | 2020-02-07 | 哈尔滨跃渊环保智能装备有限责任公司 | Unmanned aerial vehicle system device for water quality collection |
CN111046793A (en) * | 2019-12-11 | 2020-04-21 | 北京工业大学 | Tomato disease identification method based on deep convolutional neural network |
CN111178121A (en) * | 2018-12-25 | 2020-05-19 | 中国科学院合肥物质科学研究院 | Pest image positioning and identifying method based on spatial feature and depth feature enhancement technology |
CN111178177A (en) * | 2019-12-16 | 2020-05-19 | 西京学院 | Cucumber disease identification method based on convolutional neural network |
CN111339921A (en) * | 2020-02-24 | 2020-06-26 | 南京邮电大学 | Insect disease detection unmanned aerial vehicle based on lightweight convolutional neural network and detection method |
CN112370016A (en) * | 2020-11-04 | 2021-02-19 | 安晓玲 | Method for predicting health by fusing living environment information and body physiological parameter information |
- 2020-06-30: Application CN202010607427.3A filed; published as CN111638216A (status: Pending)
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104794898A (en) * | 2015-04-30 | 2015-07-22 | 山东大学 | Special-region band-type private network transportation communication navigation monitoring and warning device and working method |
CN205940634U (en) * | 2016-08-15 | 2017-02-08 | 幻飞智控科技(上海)有限公司 | Environmental monitoring unmanned aerial vehicle |
CN206750135U (en) * | 2017-04-21 | 2017-12-15 | 黄鹏 | The dedicated unmanned machine of forestry pests & diseases monitoring preventing and treating |
CN108389354A (en) * | 2018-04-16 | 2018-08-10 | 南京森林警察学院 | A kind of method of unmanned plane joint ground micro robot detection forest ground fire |
CN208079019U (en) * | 2018-04-16 | 2018-11-09 | 哈尔滨哈程电气科技发展有限公司 | A kind of photovoltaic plant inspection device based on unmanned plane remote control and regulation |
CN209159986U (en) * | 2018-09-29 | 2019-07-26 | 比亚迪股份有限公司 | Unmanned plane |
CN209382263U (en) * | 2018-12-06 | 2019-09-13 | 胡良柏 | A kind of project planning remote sensing mapping aircraft |
CN111178121A (en) * | 2018-12-25 | 2020-05-19 | 中国科学院合肥物质科学研究院 | Pest image positioning and identifying method based on spatial feature and depth feature enhancement technology |
CN109948693A (en) * | 2019-03-18 | 2019-06-28 | 西安电子科技大学 | Hyperspectral image classification method based on superpixel sample expansion and generative adversarial networks |
CN210037304U (en) * | 2019-04-04 | 2020-02-07 | 哈尔滨跃渊环保智能装备有限责任公司 | Unmanned aerial vehicle system device for water quality collection |
CN110309762A (en) * | 2019-06-26 | 2019-10-08 | 扆亮海 | Forestry health assessment system based on aerial remote sensing |
CN110427922A (en) * | 2019-09-03 | 2019-11-08 | 陈�峰 | Pest and disease identification system and method based on machine vision and convolutional neural networks |
CN111046793A (en) * | 2019-12-11 | 2020-04-21 | 北京工业大学 | Tomato disease identification method based on deep convolutional neural network |
CN111178177A (en) * | 2019-12-16 | 2020-05-19 | 西京学院 | Cucumber disease identification method based on convolutional neural network |
CN111339921A (en) * | 2020-02-24 | 2020-06-26 | 南京邮电大学 | Insect disease detection unmanned aerial vehicle based on lightweight convolutional neural network and detection method |
CN112370016A (en) * | 2020-11-04 | 2021-02-19 | 安晓玲 | Method for predicting health by fusing living environment information and body physiological parameter information |
Non-Patent Citations (3)
Title |
---|
SHREYA GHOSAL et al.: "Rice Leaf Diseases Classification Using CNN With Transfer Learning", Proceedings of 2020 IEEE Calcutta Conference * |
PANG Hao et al.: "Deep learning models for diabetic retinopathy detection", Journal of Software * |
BI Xiuli et al.: "Image tampering detection algorithm based on cascaded convolutional neural networks", Journal of Electronics & Information Technology * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112370016A (en) * | 2020-11-04 | 2021-02-19 | 安晓玲 | Method for predicting health by fusing living environment information and body physiological parameter information |
CN114295614A (en) * | 2021-12-31 | 2022-04-08 | 湖北省农业科学院农业质量标准与检测技术研究所 | Tea tree pest and disease detection vehicle, detection device, detection system and detection method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109470299A (en) | A kind of plant growth information monitoring system and method based on Internet of Things | |
CN103336966B (en) | Weed image discrimination method applied to intelligent agricultural machinery | |
CN110455340A (en) | A kind of agricultural cultivation EMS based on big data | |
CN111638216A (en) | Beet-related disease analysis method for unmanned aerial vehicle system for monitoring plant diseases and insect pests | |
CN114818909B (en) | Weed detection method and device based on crop growth characteristics | |
Roldán-Serrato et al. | Automatic pest detection on bean and potato crops by applying neural classifiers | |
CN113034301A (en) | Crop growth management system and method | |
CN115861721B (en) | Livestock and poultry breeding spraying equipment state identification method based on image data | |
EP3626077A1 (en) | Pest control | |
CN114723667A (en) | Agricultural fine planting and disaster prevention control system | |
CN115294518B (en) | Intelligent monitoring method and system for precise greenhouse cultivation of horticultural plants | |
CN112116206A (en) | Intelligent agricultural system based on big data | |
CN113469112A (en) | Crop growth condition image identification method and system | |
CN114298615A (en) | Crop planting risk prevention method and device, storage medium and equipment | |
CN116300608A (en) | Intelligent agriculture remote monitoring system based on big data | |
FR3071644A1 (en) | METHOD AND DEVICE FOR CLASSIFYING PLANTS | |
CN113989689B (en) | Crop pest and disease damage identification method and system based on unmanned aerial vehicle | |
CN111045467B (en) | Intelligent agricultural machine control method based on Internet of things | |
CN113377141A (en) | Artificial intelligence agricultural automatic management system | |
KR102393265B1 (en) | System for detecting pests of shiitake mushrooms | |
Zhang et al. | Automatic counting of lettuce using an improved YOLOv5s with multiple lightweight strategies | |
CN116681929A (en) | Wheat crop disease image recognition method | |
CN115424151A (en) | Agricultural intelligent platform based on image processing | |
CN113344009A (en) | Light and small network self-adaptive tomato disease feature extraction method | |
Santhosh Kumar et al. | Review on disease detection of plants using image processing and machine learning techniques |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20200908 |