CN115115275A - AI-based unattended crop disease and pest monitoring and early warning system - Google Patents
- Publication number
- CN115115275A CN115115275A CN202210915784.5A CN202210915784A CN115115275A CN 115115275 A CN115115275 A CN 115115275A CN 202210915784 A CN202210915784 A CN 202210915784A CN 115115275 A CN115115275 A CN 115115275A
- Authority
- CN
- China
- Prior art keywords
- pest
- warning system
- early warning
- subsystem
- monitoring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06313—Resource planning in a project environment
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M5/00—Catching insects in fields, gardens, or forests by movable appliances
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M7/00—Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
- A01M7/0003—Atomisers or mist blowers
- A01M7/0014—Field atomisers, e.g. orchard atomisers, self-propelled, drawn or tractor-mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M2200/00—Kind of animal
- A01M2200/01—Insects
Abstract
The invention relates to the technical field of agricultural pest control, and in particular discloses an AI-based unattended crop disease and pest monitoring and early warning system. The system comprises a background management server with a human-machine interaction function, a plurality of autonomously cruising pest monitoring robots, and a plurality of soil environment monitoring devices, each paired with a pest monitoring robot. The background management server runs a background management and early warning system that dispatches the pest monitoring robots and receives detection data from the soil environment monitoring devices. Each pest monitoring robot contains a drive control board that controls the actuating components and an edge computing platform that runs the high-level business and algorithm functions. The robot's pest detection subsystem carries a deep neural network model that analyzes the acquired data, and a data communication subsystem uploads detection information and processing results to the background management and early warning system. The system solves the problem that traditional pest and disease monitoring technology is neither timely nor accurate.
Description
Technical Field
The present application relates to the technical field of agricultural pest control, and in particular discloses an AI-based unattended crop disease and pest monitoring and early warning system.
Background
Automatic monitoring of crop diseases and pests, pest prediction, early intervention, and targeted spraying are important research directions in agricultural modernization, with major significance for national food security and agricultural economic development. The research and development of agricultural artificial intelligence therefore needs to be accelerated, toward a new generation of agricultural robots that are highly adaptable, cost-effective, and capable of intelligent decision making.
In crop pest situation monitoring, many researchers in recent years have applied image recognition from machine learning to the detection and early warning of trapped and killed pests. However, these detection and counting approaches are improvements on a fixed-point, indirect monitoring mode and emphasize the image recognition algorithm itself; they cannot flexibly monitor pest conditions at every point in a farm, nor obtain representative pest species and population densities over a large area.
In crop disease monitoring, the traditional method is still for agricultural technicians to patrol manually and identify crop diseases by eye or with a mobile phone app, which is inefficient and error-prone. Some departments use drone-mounted infrared imaging or satellite remote sensing to detect crop diseases over large areas, but these techniques only detect diseases once they have spread across a large area of crops; they cannot perform fine-grained detection and lack flexibility. They are also costly, make it difficult to obtain comprehensive and accurate pest and disease information, and cannot support timely, targeted spraying and control.
In crop pest treatment, manual spot spraying or large-area drone spraying is generally used. Manual spot spraying is inefficient and far from timely, while large-area over-spraying by drones conflicts with the goal of green food production.
Therefore, the inventor provides an AI-based unattended crop pest monitoring and early warning system to solve the above problems.
Disclosure of Invention
The invention aims to solve the problem that traditional pest and disease monitoring technology is neither timely nor accurate.
In order to achieve the aim, the basic scheme of the invention provides an AI-based unattended crop disease and pest monitoring and early warning system, which comprises a background management server with a man-machine interaction function, a plurality of disease and pest monitoring robots capable of cruising autonomously, and a plurality of soil environment monitoring devices respectively matched with the disease and pest monitoring robots;
the background management server is provided with a background management early warning system which is used for scheduling the pest monitoring robot and receiving the detection data of the soil environment monitoring device;
the pest monitoring robot is internally provided with a driving control board for executing component control and an edge computing platform for operating high-level business and algorithm functions, and a control protocol for communication is arranged between the edge computing platform and the driving control board;
the pest monitoring robot is characterized in that a pest monitoring subsystem for collecting crop and environment data, a pest trapping and killing subsystem for trapping and killing pests, a pest detection subsystem for detecting pests and diseases, an autonomous cruise subsystem for controlling an autonomous cruise path of the pest monitoring robot, a driving subsystem for controlling driving of the pest monitoring robot, a targeted pesticide application subsystem for spraying pesticide, a data communication subsystem for data transmission and a remote management subsystem for remote communication and remote control are preset in an edge computing platform;
A deep neural network model for analyzing and calculating the data collected by the data acquisition subsystem is built into the pest detection subsystem, and the data communication subsystem uploads detection information and processing results to the background management and early warning system.
Further, the information collected by the soil environment monitoring devices includes soil moisture content, soil pH value, soil EC value, soil temperature, rainfall, and trace element content; the collected information is uploaded to the background management and early warning system through the paired pest monitoring robot.
Furthermore, the data acquisition subsystem comprises two high-definition cameras which are electrically connected with the edge computing platform and are used for respectively acquiring images of the crops and the trapped and killed pests.
Furthermore, the driving main control board is electrically connected with a wireless radio frequency module for receiving the information detected by the soil environment monitoring device, and the data acquisition subsystem collects the data received by the wireless radio frequency module and transmits it to the background management and early warning system.
Further, after the pest detection subsystem detects the target position of a crop pest or disease, the driving control board drives the targeted pesticide application subsystem to spray pesticide at that position.
Further, the pest trapping and killing subsystem comprises a pest trapping and killing lamp and an LED light supplementing lamp which are electrically connected with the driving control board.
Furthermore, the pest trapping and killing subsystem can also detect pest species and quantities through the deep neural network model and upload them to the background management and early warning system through the data communication subsystem.
Furthermore, a laser radar electrically connected with the edge computing platform is built into the autonomous cruise subsystem. The field environment can be scanned by the laser radar to build a point cloud map, and an autonomous navigation mode drives the pest monitoring robot in sequence to the closed-loop navigation points set by the background management and early warning system.
Further, the background management early warning system can provide visual historical data analysis, real-time pest and disease warning, pest and disease disaster early warning, robot configuration management, real-time remote video and robot remote control for an administrator.
Furthermore, the background management early warning system can classify and store data uploaded by each robot, and a disease and pest outbreak mathematical model for predicting the recent crop disease and pest occurrence range and probability is built.
The principles and effects of this basic scheme are as follows:
1. the system loads the pest trapping and killing subsystem on the mobile robot, and is matched with an AI pest detection technology, so that the data monitoring of the crop pest species and the population density is more flexible and large-range, and has more representative characteristics.
2. The disease and pest monitoring robot provided by the system is a mobile robot, and when the robot is cruising in a farmland, a high-definition camera and an AI disease and pest detection technology are used, so that the crop disease and pest monitoring has the effects of flexibility, low cost, high accuracy, high efficiency, unattended operation and timeliness, and the problem that the traditional disease and pest monitoring technology is untimely and inaccurate in monitoring is solved.
3. The pest and disease monitoring robot of the system can carry out targeted pesticide application when moving, so that crop pest and disease killing operation is timely, efficient and green.
4. The background management early warning system provided by the system carries out early warning on serious disease and insect disasters in the future through the disease and insect outbreak mathematical model based on the influence of meteorological factors, and shows good economical efficiency.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram illustrating an application scenario of an AI-based unattended crop pest monitoring and early warning system according to an embodiment of the present application;
fig. 2 shows a hardware block diagram of a pest monitoring robot of an AI-based unattended crop pest monitoring and early warning system according to an embodiment of the application;
fig. 3 shows a system software block diagram of an AI-based unattended crop pest monitoring and early warning system provided by the embodiment of the application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following is further detailed by the specific embodiments:
Embodiment 1 of the AI-based unattended crop disease and pest monitoring and early warning system is shown in fig. 3. The system consists of a background management server, a plurality of pest detection robots, and soil environment monitoring devices. A background management and early warning system runs on the background management server; one such system can simultaneously manage a plurality of pest detection robots and soil environment monitoring devices and receive the information they transmit. The background management and early warning system also provides human-machine interaction with the administrator.
When the system is deployed, the numbers of pest monitoring robots and soil monitoring devices can be determined by the planting range and density of the crops: the larger and denser the planting area, the more pest monitoring robots and paired soil monitoring devices are needed. In use, a plurality of pest monitoring robots connect to one background management and early warning system over a network and are managed and dispatched by it in a unified way. The network can be an ordinary mobile network (4G or 5G) or a wireless network (WLAN). When a pest monitoring robot is covered by a stable nearby wireless network, it connects to the background management and early warning system through the wireless network first, and otherwise falls back to the mobile network. Charging piles must also be provided for the pest monitoring robots, with at least one charging pile deployed within each robot's cruising range.
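The WLAN-first connection preference described above can be sketched as a small selection function. This is an illustrative sketch only: the patent does not define a signal threshold, so the -70 dBm "stable" cutoff and the function name are assumptions.

```python
from typing import Optional

def choose_uplink(wlan_available: bool, wlan_rssi_dbm: Optional[int],
                  cellular_available: bool) -> str:
    """Pick the uplink for the robot: prefer a stable WLAN, fall back to 4G/5G.

    A WLAN is treated as "stable" when its RSSI exceeds an assumed
    -70 dBm threshold (illustrative value, not from the patent).
    """
    if wlan_available and wlan_rssi_dbm is not None and wlan_rssi_dbm > -70:
        return "wlan"
    if cellular_available:
        return "cellular"
    return "offline"
```

In practice the robot would re-evaluate this choice periodically as it cruises in and out of WLAN coverage.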
The edge computing platform of the pest monitoring robot is preset with eight subsystems: a data acquisition subsystem for collecting crop and environment data, a pest trapping and killing subsystem for trapping and killing pests, a pest detection subsystem for detecting pests and diseases, an autonomous cruise subsystem that controls the robot's autonomous cruise path, a driving subsystem that controls the robot's travel, a targeted pesticide application subsystem for spraying pesticide, a data communication subsystem for data transmission, and a remote management subsystem for remote communication and remote control.
In this embodiment, the design of the hardware architecture of the pest monitoring robot is shown in fig. 2. The pest monitoring robot comprises two modules: a drive control board and an edge computing platform.
The driving main control board is connected by physical wiring to a motor, a temperature/humidity/illumination sensor, a wireless charging module, a wireless radio frequency module, a 6-degree-of-freedom pan-tilt, a pesticide spray gun (the actuator of the targeted pesticide application subsystem), an LED supplementary light, and a pest trapping and killing lamp (the actuator of the pest trapping and killing subsystem). The motor drives the wheels as the actuator of the driving subsystem. The 6-degree-of-freedom pan-tilt can lift, pitch, and rotate, and carries camera 1 and the pesticide spray gun. The wireless radio frequency module pairs and communicates with the external soil environment monitoring device. The driving main control board executes instructions from the edge computing platform (a control protocol is preset between the two) and performs query and control of the other connected components.
The pest monitoring robot uses an embedded edge computing platform (the upper computer) to run the robot's high-level business and algorithm functions, and an STM32 microcontroller as the driving main control board (the lower computer) to control the actuating components. The upper and lower computers communicate over a USB serial port according to the control protocol, and the upper computer can send query and execution instructions to the lower computer.
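The patent states only that a control protocol is preset between the upper and lower computers; it does not specify the framing. The sketch below is one assumed layout (0xAA 0x55 header, command byte, length byte, payload, 8-bit sum checksum), not the actual protocol.

```python
def build_frame(cmd: int, payload: bytes) -> bytes:
    """Frame a command for the STM32 drive board over the USB serial link.

    Header/length/checksum layout is an assumed example, not taken
    from the patent, which leaves the protocol unspecified.
    """
    body = bytes([cmd, len(payload)]) + payload
    checksum = sum(body) & 0xFF
    return b"\xaa\x55" + body + bytes([checksum])

def parse_frame(frame: bytes):
    """Validate a frame and return (cmd, payload); raise ValueError on corruption."""
    if frame[:2] != b"\xaa\x55":
        raise ValueError("bad header")
    cmd, length = frame[2], frame[3]
    payload = frame[4:4 + length]
    if (sum(frame[2:4 + length]) & 0xFF) != frame[4 + length]:
        raise ValueError("bad checksum")
    return cmd, payload
```

A query instruction (e.g. "report pesticide level") and its reply would both travel as such frames over the serial port.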
The edge computing platform is connected through USB interfaces to camera 1, camera 2, a BeiDou positioning module, a WLAN/4G/5G communication module, and a laser radar. The edge computing platform can be an NVIDIA Jetson series embedded artificial intelligence platform, such as the Jetson AGX Orin or Jetson Xavier NX series. The specific model and brand of laser radar can be chosen according to the configured cruising speed.
The business software architecture of the pest monitoring robot is shown in fig. 3. The business software runs on the Ubuntu operating system and is developed on the ROS framework. The eight subsystems of the pest monitoring robot can be programmed as eight nodes in ROS. The nodes communicate with each other using the "topic/subscription" or "request/service" mechanisms in ROS, and the robot's configuration parameters can be stored on the ROS parameter server.
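The decoupling that topic/subscription gives the eight nodes can be illustrated with a plain-Python stand-in. This is an illustration of the pattern only, not the rospy API: on the robot the bus would be ROS itself, and the topic name below is hypothetical.

```python
from collections import defaultdict

class TopicBus:
    """Minimal in-process stand-in for ROS topic/subscription messaging."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic: str, callback):
        """Register a callback for all future messages on `topic`."""
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg):
        """Deliver `msg` to every subscriber of `topic`."""
        for cb in self._subs[topic]:
            cb(msg)

# e.g. the acquisition node publishes frames that the detection node consumes
bus = TopicBus()
received = []
bus.subscribe("/camera1/image", received.append)
bus.publish("/camera1/image", "frame-001")
```

The publisher never needs to know which of the other seven subsystems consume its messages, which is what lets the subsystems be developed and replaced independently.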
In this embodiment, the data acquisition subsystem captures images with two general-purpose USB cameras and, once acquisition is complete, publishes the images to the pest trapping and killing subsystem and the pest detection subsystem via "topic/subscription". Under remote management, the acquisition subsystem also publishes video streams to the data communication subsystem for pushing live video to users. The data acquisition subsystem on the embedded edge computing platform also periodically queries the driving main control board for temperature, humidity, illumination, soil environment, and other data.
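The environment data relayed this way could be modeled as a small record type. The patent lists the measured quantities but defines no wire format, so the field names, units, and JSON envelope below are assumptions.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class SoilReading:
    """One sample from a soil environment monitoring device (field names assumed)."""
    moisture_pct: float
    ph: float
    ec_us_cm: float       # soil EC (electrical conductivity)
    temperature_c: float
    rainfall_mm: float
    trace_elements: dict  # e.g. {"N": ..., "P": ..., "K": ...}

def to_upload_packet(device_id: str, reading: SoilReading) -> str:
    """Serialize a reading for relay to the background system via the paired robot."""
    return json.dumps({"device_id": device_id, **asdict(reading)})
```

The paired robot would forward packets like this alongside its own detection results, so the background system can correlate soil conditions with pest activity.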
In this embodiment, the pest trapping and killing subsystem and the pest detection subsystem use a trained YOLOv5 multi-target detection model deployed on the edge computing platform to detect and classify pests and diseases. Detection results are published to the data communication subsystem via "topic/subscription". The YOLOv5 model is trained on actual pest and disease images collected in the field; the training data cover 220 common kinds of diseases and pests as far as possible, as well as varied scenes such as rainy days, backlight, fog, and night, to improve the model's generalization ability.
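The patent does not fix the shape of the detection output that gets published. A hedged sketch of the aggregation step: post-NMS detections are assumed here as (class_name, confidence, box) tuples, roughly what a YOLOv5 head yields, and the 0.5 confidence threshold is illustrative.

```python
from collections import Counter

def summarize_detections(detections, conf_threshold=0.5):
    """Aggregate raw detections into per-class pest counts.

    `detections`: list of (class_name, confidence, (x1, y1, x2, y2)) tuples
    (an assumed format); low-confidence boxes are dropped before counting.
    """
    counts = Counter()
    for cls, conf, _box in detections:
        if conf >= conf_threshold:
            counts[cls] += 1
    return dict(counts)
```

A summary like this, rather than raw boxes, is what the data communication subsystem would package and upload for pest-situation statistics.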
In this embodiment, the targeted pesticide application subsystem subscribes to the detection results published by the pest detection subsystem, publishes motion parameters to the driving subsystem according to the deviation between the detected pest position and the image center, and performs atomized pesticide application after the pan-tilt has aligned camera 1 and the spray gun with the target. The pesticide spray gun is connected to the pesticide tank by a small water pump through a flexible conduit, and the tank is fitted with a level detection device. When the edge computing platform queries the driving control board and finds the pesticide level insufficient, this state is sent to the background management and early warning system to notify the administrator.
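The patent only states that motion parameters are derived from the deviation between the detected position and the image center. One plausible realization maps that pixel deviation to pan/tilt corrections for the gimbal; the linear mapping and the field-of-view values below are assumptions.

```python
def aim_offsets(box_center, image_size, hfov_deg=60.0, vfov_deg=45.0):
    """Convert a detection's pixel offset from image center into pan/tilt
    angle corrections (degrees) for the 6-DOF gimbal.

    Uses a simple linear mapping from normalized deviation to half the
    field of view; FOV defaults are illustrative, not from the patent.
    """
    cx, cy = box_center
    w, h = image_size
    dx = (cx - w / 2) / (w / 2)   # normalized horizontal deviation, -1..1
    dy = (cy - h / 2) / (h / 2)   # normalized vertical deviation, -1..1
    return dx * hfov_deg / 2, dy * vfov_deg / 2
```

Spraying would only be triggered once repeated corrections bring both offsets near zero, i.e. the spray gun is pointed at the target.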
In this embodiment, the data communication subsystem periodically sends a heartbeat packet to the background management and early warning system over HTTP. The heartbeat packet carries the robot's device name, product serial number, device type, MAC address, packet, IP, signal value, battery level, positioning coordinates, operating mode, software version number, fault code, pesticide level, and other basic information. When the pest trapping and killing subsystem and the pest detection subsystem publish detection results to the data communication subsystem, it packages the results and uploads them to the background management and early warning system.
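A heartbeat body could be built as below. The fields mirror the list in the patent, but the JSON key names and value types are assumptions, since the patent does not define the serialization.

```python
import json, time

def heartbeat(robot: dict) -> str:
    """Build the periodic heartbeat body sent to the background system over HTTP.

    Keys follow the fields the patent lists; the exact names are assumed.
    """
    packet = {
        "device_name": robot["name"],
        "serial_number": robot["sn"],
        "device_type": "pest-monitoring-robot",
        "mac": robot["mac"],
        "ip": robot["ip"],
        "signal_dbm": robot["signal"],
        "battery_pct": robot["battery"],
        "position": robot["position"],       # e.g. BeiDou lat/lon pair
        "mode": robot["mode"],
        "sw_version": robot["version"],
        "fault_code": robot.get("fault", 0),
        "pesticide_pct": robot["pesticide"],
        "timestamp": int(time.time()),
    }
    return json.dumps(packet)
```

The background system keys its device table on `serial_number`, so that field doubles as the robot's identity across heartbeats.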
In this embodiment, the autonomous cruise subsystem uses the laser radar for scanning and mapping through open-source SLAM packages in ROS; the mapping method can be gmapping, hector, or karto. After the map is built, a sequence of closed-loop target points is set in the background management and early warning system and passed as parameters to the open-source ROS Navigation package to implement closed-loop path planning and navigation, which is the basis for unattended intelligent cruising.
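The "closed-loop" goal sequence can be sketched as a cycling iterator over the administrator-set points. In the real system each goal would be handed to the ROS navigation stack; here the goal type is simplified to a bare (x, y) pair, an assumption for illustration.

```python
from itertools import cycle

def waypoint_cycle(points):
    """Yield navigation goals for closed-loop cruising.

    `points` is the ordered list of goal poses the administrator sets in
    the background system; after the last point the sequence returns to
    the first, which is what makes the loop "closed".
    """
    if not points:
        raise ValueError("at least one waypoint is required")
    return cycle(points)

goals = waypoint_cycle([(0, 0), (10, 0), (10, 10), (0, 10)])
```

The cruise node would pull the next goal from this sequence each time the navigation stack reports the previous goal reached.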
In this embodiment, after receiving the response packet with which the background management and early warning system answers the heartbeat, the pest monitoring robot parses the ifwebsocket field in the response packet; if it is true, the robot actively initiates a WebSocket long connection to the background management and early warning system. Once the WebSocket connection is established, the background management and early warning system can send the robot commands for manual driving, manual spraying, configuration modification, restore, upgrade, live video streaming, and other functions.
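On the robot side, commands arriving over the long connection can be routed through a dispatch table. The command set comes from the patent, but the `{"cmd": ..., "args": ...}` JSON envelope and the handler names are assumed for illustration.

```python
import json

def handle_command(raw: str, handlers: dict) -> str:
    """Dispatch one command received on the WebSocket long connection.

    `handlers` maps command names (e.g. "manual_drive", "manual_spray")
    to callables taking an args dict; unknown commands get an error reply.
    """
    msg = json.loads(raw)
    handler = handlers.get(msg["cmd"])
    if handler is None:
        return json.dumps({"ok": False, "error": "unknown command"})
    result = handler(msg.get("args", {}))
    return json.dumps({"ok": True, "result": result})

# hypothetical handlers for two of the patent's command types
handlers = {
    "manual_drive": lambda a: f"driving {a.get('direction')}",
    "manual_spray": lambda a: "spraying",
}
```

Replies travel back over the same connection, so the background system can confirm each remote operation.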
In this embodiment, the software architecture of the background management and early warning system is shown in fig. 3. The system uses Nginx as a reverse proxy to handle HTTP requests from the robots and user terminals; the background service program is developed with the Python-based Django web framework, and the open-source uWSGI server processes the HTTP requests and forwards them to the web framework.
In this embodiment, the device management module of the background management and early warning system receives the heartbeat requests (status reports) periodically sent by all robots and stores each robot's status information in the MySQL database keyed by its serial number.
In this embodiment, the data storage and visualization module of the background management and early warning system receives the result packets for pest and disease detection, pest situation monitoring, and crop growth reported by all robots and stores them in the MySQL database by category. When a user views the related historical data at a terminal, the data are fetched through a data request interface and displayed in the page.
In this embodiment, key meteorological factors influencing pest outbreaks can be screened from nearly 70 years of meteorological data for Guangxi by means of a geographical detector model, autoregression, principal component analysis, and similar methods, and a regional pest prediction and forecasting model based on historical meteorological data is established. The parameters of the regional model are then fine-tuned with microclimate data from a small field weather station to obtain a pest disaster prediction model under microclimate conditions. The prediction results are presented in the pages of the data analysis module that use this model.
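The patent does not specify the form of the pest outbreak model. A logistic model over the screened weather factors is one plausible realization; the feature names and weights below are purely illustrative and not fitted to the Guangxi data the patent mentions.

```python
import math

def outbreak_probability(features: dict, weights: dict, bias: float) -> float:
    """Score near-term pest outbreak risk from screened meteorological factors.

    A simple logistic model: probability = sigmoid(bias + sum(w_k * x_k)).
    Factor names (e.g. "temp_c", "humidity_pct") and weights are assumed;
    in the patent they would come from the geographical-detector /
    autoregression / PCA screening and subsequent fitting.
    """
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))
```

Fine-tuning against the field station's microclimate data would correspond to adjusting `weights` and `bias` while keeping the screened factor set fixed.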
In this embodiment, the background management and early warning system further includes a configuration management module through which the pest monitoring robot's configuration information can be read. The system also includes a permission management module and a remote management module. Through the permission management module the administrator can add, delete, and modify groups and users; through the remote management module the administrator can use on-site video communication, map display, and remote operation.
When the system is deployed, the pest monitoring robot first connects to and registers with the background management early warning system, then scans the cruising environment with its laser radar to build a map. The administrator then sets closed-loop cruise waypoints in sequence on the site map through the background management early warning system, after which the pest monitoring robot can enter automatic cruise mode and begin working.
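The closed-loop cruising over administrator-set waypoints can be sketched as follows (a minimal geometric illustration, not the embodiment's actual navigation stack; coordinates and the tolerance are hypothetical):

```python
import math

# Cruise sketch: administrator-set waypoints on the site map, visited in
# sequence; the index wraps around so the route forms a closed loop.
waypoints = [(0.0, 0.0), (10.0, 0.0), (10.0, 8.0), (0.0, 8.0)]  # map points

def next_waypoint(index: int, position, tolerance: float = 0.5) -> int:
    """Advance to the next waypoint once the current one is reached
    (within `tolerance` metres), otherwise keep the current target."""
    gx, gy = waypoints[index]
    if math.hypot(position[0] - gx, position[1] - gy) <= tolerance:
        return (index + 1) % len(waypoints)
    return index
```

In practice the robot's pose would come from lidar-based localization on the point cloud map, and path planning between waypoints would be handled by the autonomous cruise subsystem.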
The background management early warning system can be deployed on cloud platforms such as Alibaba Cloud, Tencent Cloud, and Huawei Cloud, or on a physical server within a local area network. Through the system, the administrator receives field data and early warning information in real time and can, when necessary, switch to manual mode to remotely control the robot and view the real-time on-site video feed.
The system mounts the pest trapping and killing subsystem on a mobile robot and, combined with the AI pest detection technology, makes the monitoring of crop pest species and population density more flexible, wider in range, and more representative;
the pest and disease monitoring robot used by the system is mobile, and by cruising through farmland with a high-definition camera and AI pest and disease detection, it makes crop pest and disease monitoring flexible, low-cost, accurate, efficient, unattended, and timely;
the robot applies pesticide in a targeted manner while moving, so that crop pest and disease control is timely, efficient, and environmentally friendly;
the background management early warning system warns of severe future pest and disease disasters through a pest outbreak mathematical model based on meteorological factors, and shows good economy.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.
Claims (10)
1. An AI-based unattended crop disease and pest monitoring and early warning system is characterized by comprising a background management server with a man-machine interaction function, a plurality of disease and pest monitoring robots capable of cruising autonomously, and a plurality of soil environment monitoring devices respectively matched with the disease and pest monitoring robots;
the background management server is provided with a background management early warning system which is used for scheduling the pest monitoring robot and receiving the detection data of the soil environment monitoring device;
the pest monitoring robot is internally provided with a driving control board for executing component control and an edge computing platform for operating high-level business and algorithm functions, and a control protocol for communication is arranged between the edge computing platform and the driving control board;
the pest monitoring robot is characterized in that a pest monitoring subsystem for collecting crop and environment data, a pest trapping and killing subsystem for trapping and killing pests, a pest detection subsystem for detecting pests and diseases, an autonomous cruise subsystem for controlling an autonomous cruise path of the pest monitoring robot, a driving subsystem for controlling driving of the pest monitoring robot, a targeted pesticide application subsystem for spraying pesticide, a data communication subsystem for data transmission and a remote management subsystem for remote communication and remote control are preset in an edge computing platform;
a deep neural network model for analyzing and computing the data collected by the data acquisition subsystem is built into the pest detection subsystem, and the data communication subsystem uploads the detection information and processing results to the background management early warning system.
2. The AI-based unattended crop pest monitoring and early warning system of claim 1, wherein the information collected by the soil environment monitoring device includes soil moisture content, soil pH value, soil EC value, soil temperature, rainfall, and trace element content, and the collected information is uploaded to the background management early warning system through the paired pest monitoring robot.
3. The AI-based unattended crop pest monitoring and warning system according to claim 2, wherein the data acquisition subsystem comprises two high-definition cameras electrically connected to the edge computing platform for image acquisition of the crop and the trapped and killed pests respectively.
4. The AI-based unattended crop pest monitoring and early warning system according to claim 3, wherein the driving control board is electrically connected with a wireless radio frequency module for receiving the information detected by the soil environment monitoring device, and the data acquisition subsystem collects the data received by the wireless radio frequency module and transmits it to the background management early warning system.
5. The AI-based unattended crop pest monitoring and early warning system according to claim 2 or 4, wherein after the pest detection subsystem detects the target pest position on the crop, it drives the targeted pesticide application subsystem, via the driving control board, to spray pesticide at the target pest position.
6. The AI-based unattended crop pest monitoring and warning system according to claim 5, wherein the pest trapping and killing subsystem comprises a pest trapping and killing lamp and an LED light supplement lamp electrically connected with the driving control board.
7. The AI-based unattended crop pest monitoring and early warning system of claim 6, wherein the pest trapping and killing subsystem further detects the type and quantity of trapped pests through the deep neural network model and uploads them to the background management early warning system through the data communication subsystem.
8. The AI-based unattended crop pest monitoring and early warning system according to claim 2 or 6, wherein the autonomous cruise subsystem is provided with a laser radar electrically connected to the edge computing platform, scans the field environment through the laser radar to establish a point cloud map, and has an autonomous navigation mode that drives the pest monitoring robot to travel in sequence to the closed-loop navigation points set by the background management early warning system.
9. The AI-based unattended crop pest monitoring and early warning system according to claim 1, wherein the background management and early warning system can provide visual historical data analysis, real-time pest warning, pest disaster early warning, robot configuration management, real-time remote video and robot remote control to an administrator.
10. The AI-based unattended crop pest monitoring and early warning system according to claim 9, wherein the background management and early warning system is capable of classifying and storing data uploaded by each robot and building a pest outbreak mathematical model for predicting the recent crop pest occurrence range and probability.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210915784.5A CN115115275A (en) | 2022-08-01 | 2022-08-01 | AI-based unattended crop disease and pest monitoring and early warning system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210915784.5A CN115115275A (en) | 2022-08-01 | 2022-08-01 | AI-based unattended crop disease and pest monitoring and early warning system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115115275A true CN115115275A (en) | 2022-09-27 |
Family
ID=83334862
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210915784.5A Pending CN115115275A (en) | 2022-08-01 | 2022-08-01 | AI-based unattended crop disease and pest monitoring and early warning system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115115275A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116206202A (en) * | 2023-02-21 | 2023-06-02 | 北京甲板智慧科技有限公司 | Forestry plant diseases and insect pests intelligent monitoring management system based on AI automatic identification |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11889793B2 (en) | Internet-of-things management and control system for intelligent orchard | |
Singh et al. | An intelligent WSN-UAV-based IoT framework for precision agriculture application | |
Namani et al. | Smart agriculture based on IoT and cloud computing | |
CN112990262B (en) | Integrated solution system for monitoring and intelligent decision of grassland ecological data | |
US20170172077A1 (en) | Property landscape management apparatus and method | |
CN106332855A (en) | Automatic early warning system for pests and diseases | |
CN104764533A (en) | Intelligent agricultural system based on unmanned aerial vehicle image collecting and thermal infrared imager | |
CN105737897A (en) | Distributed large-field meteorological remote data monitoring system | |
CN110738195A (en) | poultry farm cultivation quantity recognition equipment based on image recognition | |
CN112101276A (en) | Forest resource distributed management system | |
CN112465109A (en) | Green house controlling means based on cloud limit is in coordination | |
CN114545833B (en) | Intelligent interactive processing system of facility agriculture based on internet of things | |
CN112528912A (en) | Crop growth monitoring embedded system and method based on edge calculation | |
CN115115275A (en) | AI-based unattended crop disease and pest monitoring and early warning system | |
CN113865640A (en) | Monitoring system based on Internet of things | |
CN116295662B (en) | Crop growth state monitoring method and device, electronic equipment and medium | |
CN114115020A (en) | Intelligent control system and control method for height of unmanned aerial vehicle | |
CN116938972A (en) | End cloud collaborative agricultural condition detection system based on 5G communication and machine learning | |
CN113349188B (en) | Lawn and forage precise weeding method based on cloud weeding spectrum | |
CN118427519A (en) | Unmanned aerial vehicle intelligence inspection system based on artificial intelligence | |
Khan et al. | Harnessing 5G Networks for Enhanced Precision Agriculture: Challenges and potential Solutions | |
CN116820109A (en) | Agricultural robot path position control system | |
CN114898361A (en) | Peach orchard fruit state identification and counting method and system | |
Li et al. | UAVs-Based Smart Agriculture IoT Systems: An Application-Oriented Design | |
Saif et al. | Smart e-agriculture monitoring systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||