CN113781551B - Tea garden plant state monitoring and management system and method based on visual perception - Google Patents

Tea garden plant state monitoring and management system and method based on visual perception

Info

Publication number
CN113781551B
Authority
CN
China
Prior art keywords
module
detection
algorithm
tea
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111039520.XA
Other languages
Chinese (zh)
Other versions
CN113781551A (en)
Inventor
杨春勇
刘宇航
倪文军
舒振宇
侯金
周城
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South Central Minzu University
Original Assignee
South Central University for Nationalities
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South Central University for Nationalities
Priority to CN202111039520.XA
Publication of CN113781551A
Application granted
Publication of CN113781551B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G25/00 Watering gardens, fields, sports grounds or the like
    • A01G25/09 Watering arrangements making use of movable installations on wheels or the like
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G25/00 Watering gardens, fields, sports grounds or the like
    • A01G25/16 Control of watering
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M21/00 Apparatus for the destruction of unwanted vegetation, e.g. weeds
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0089 Regulating or controlling systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computing arrangements using knowledge-based models
    • G06N5/02 Knowledge representation; Symbolic representation
    • G06N5/022 Knowledge engineering; Knowledge acquisition
    • G06N5/025 Extracting rules from data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

The invention discloses a tea garden plant state monitoring and management system and method based on visual perception, and relates to the field of intelligent monitoring and management of tea garden plant states. The system comprises a data acquisition module, an algorithm detection module and a robot module which communicate in sequence. The data acquisition module is embedded with a camera module, a preprocessing module and a data output module which interact in sequence; the algorithm detection module is embedded with a pest and disease detection module, a tea grass detection module and a tea state detection module; the robot module is embedded with an algorithm positioning module, a real-time display module and a decision execution module, where the three interact in sequence and the algorithm positioning module and the decision execution module also interact back and forth. The invention realizes high-precision intelligent monitoring and management of tea garden plants, greatly reduces manual labor, lowers the overall cost of the equipment, is applicable to the field of intelligent tea garden plant monitoring and management, and has broad prospects.

Description

Tea garden plant state monitoring and management system and method based on visual perception
Technical Field
The invention relates to the field of intelligent monitoring and management of plant states of tea gardens, in particular to a system and a method for monitoring and managing plant states of tea gardens based on visual perception.
Background
With the rapid expansion of the tea industry, problems such as pest control and weed control in China's tea production have become increasingly prominent and seriously affect tea yield. Traditional tea garden management relies on large amounts of manual labor for regular, periodic monitoring and management of tea garden plants, which is time-consuming and laborious. Tea garden plant monitoring technology based on image recognition can be built into a robot system platform that performs real-time identification of tea plant pests and diseases, weed identification, growth state monitoring and so on. Such a robot system helps staff manage the tea garden better, is beneficial to improving tea yield and quality, and is of great significance for building high-standard, healthy tea gardens.
At present, related research on tea garden plant monitoring technology has been carried out at home and abroad, and certain progress has been made, but many problems remain. For example, Li, Xie et al., in "Evaluating green tea quality based on multisensor data fusion combining hyperspectral imaging and olfactory visualization systems" (Journal of the Science of Food and Agriculture, 2019, 99(4): 1787-1794), proposed a green tea quality evaluation system based on multi-sensor data fusion and an olfactory visualization system; Cai, Wang et al., in "Using FTIR spectra and pattern recognition for discrimination of tea varieties" (International Journal of Biological Macromolecules, 2015, 78: 439-446), used near-infrared spectra to build a self-organizing neural network model for tea classification and improved the classification accuracy. However, most such approaches rely on costly peripheral equipment, and it is difficult to define clear detection standards for data acquired by infrared spectroscopy, hyperspectral imaging, sensors and the like. Wang Jian et al., in "Research on a tea image segmentation algorithm combining color and region growing" (Journal of Tea Science), proposed a tender-shoot segmentation and recognition method based on color and region growing to segment tender tea shoots. However, most such tests are conducted in laboratory environments with a single background and are therefore difficult to apply in natural environments.
In summary, domestic and foreign research on tea garden plant detection systems and methods mostly suffers from high equipment cost, data acquisition standards that are difficult to define, and single-background test environments, and cannot realize low-cost, high-efficiency real-time monitoring and management of tea garden plant states in natural environments.
Disclosure of Invention
The invention aims to overcome the above defects and shortcomings of the prior art by providing a tea garden plant state monitoring and management system and method based on visual perception, which reduce equipment cost and realize low-cost, high-efficiency real-time monitoring and management of tea garden plant states.
In order to achieve the above purpose, the technical scheme of the invention is as follows:
1. tea garden plant state monitoring and management system (system for short) based on visual perception
The system comprises a data acquisition module, an algorithm detection module and a robot module which communicate in sequence;
the data acquisition module is embedded with a camera module, a preprocessing module and a data output module which interact in sequence;
the algorithm detection module is embedded with a pest and disease detection module, a tea grass detection module and a tea state detection module which communicate with each other;
the robot module is embedded with an algorithm positioning module, a real-time display module and a decision execution module which interact in sequence.
2. Tea garden plant state monitoring and management method based on visual perception (the method for short)
S1, the data acquisition module acquires video stream information in real time through a camera, encodes it, performs format conversion and preprocessing, and finally sends the result to the algorithm detection module;
S2, the algorithm detection module selects an algorithm from the trained algorithm models according to different requirements, and sends the detection result to the robot module;
S3, the robot module receives the detection result, first performs multi-frame matching to further optimize it, and sends the processing result to the control module;
S4, the control module uses a Kalman filtering algorithm to fuse multi-sensor data in real time, and controls the remote sprinkler to move and perform range spraying.
Compared with the prior art, the invention has the following advantages and positive effects:
1. An algorithm detection module is designed
The invention designs different algorithm detection modules for the different characteristics of the various plants in a tea garden: a knowledge distillation target detection model based on attention transfer is constructed for pest and disease detection and tea-grass detection, and a fine-grained image classification model based on multiple attention modules is constructed for tea state detection, realizing efficient detection of the different plant characteristics in the tea garden;
2. An algorithm positioning module is constructed
The invention designs an algorithm positioning module for the detection results of the algorithm detection module: the multi-frame matching module achieves higher precision for the monitoring requirements of different scenarios, and the coordinate positioning module completes real-time positioning output of the monitored plants, providing a data basis for the decision execution module in the robot module;
3. The invention constructs a robot control module through multi-sensor fusion to realize all-terrain movement, and designs a coordinate positioning module to control the spraying device to accurately spray herbicide, pesticide and the like;
4. The invention realizes intelligent monitoring and management of tea garden plants, greatly reduces manual labor, lowers the overall cost of the equipment, is applicable to the field of intelligent tea garden plant monitoring and management, and has broad prospects.
Drawings
Fig. 1 is a block diagram of the structure of the present system, in which:
100 - data acquisition module;
110 - camera module,
120 - preprocessing module,
130 - data output module;
200 - algorithm detection module,
210 - pest and disease detection module,
220 - tea grass detection module,
230 - tea status detection module;
300 - robot module,
310 - algorithm positioning module,
320 - real-time display module,
330 - decision execution module.
Fig. 2 is a block diagram of the structure of the algorithm positioning module 310, in which:
311 - multi-frame matching module,
312 - coordinate positioning module,
313 - information output module.
Fig. 3 is a block diagram of the structure of the decision execution module 330, in which:
331 - control module,
332 - execution module;
fig. 4 is a step diagram of the present method.
Detailed Description
In order to make the technical scheme of the present invention more clear, the following detailed description is provided with reference to the accompanying drawings and examples:
1. system and method for controlling a system
1. Overall (L)
As shown in Fig. 1, the system comprises a data acquisition module 100, an algorithm detection module 200 and a robot module 300 which communicate in sequence;
the data acquisition module 100 is embedded with a camera module 110, a preprocessing module 120 and a data output module 130 which interact in sequence;
the algorithm detection module 200 is embedded with a pest and disease detection module 210, a tea grass detection module 220 and a tea status detection module 230;
the robot module 300 is embedded with an algorithm positioning module 310, a real-time display module 320 and a decision execution module 330; the three interact in sequence, and the algorithm positioning module 310 and the decision execution module 330 also interact back and forth.
The working mechanism is as follows:
the data acquisition module 100 encodes and compresses the video stream acquired in real time by a network camera or monitoring camera in the camera module 110 and transmits it to the preprocessing module 120; the preprocessing module 120 decodes the video stream, performs format conversion and picture preprocessing on the video frames, and transmits the data to the algorithm detection module 200 through the data output module 130; the algorithm detection module 200 detects with the selected one of the pest and disease detection module 210, the tea grass detection module 220 and the tea status detection module 230, and transmits the detection result to the robot module 300; the robot module 300 locates targets through the algorithm positioning module 310 and transmits the result to the real-time display module 320 for display, and the decision execution module 330 performs the mobile spraying operation, realizing real-time monitoring and processing of the tea garden plant state.
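This working mechanism can be summarized in a short orchestration sketch in Python. It is a minimal illustration under assumed interfaces: every callable below simply stands in for the corresponding module named above and is not an API defined by the patent.

    # Minimal sketch of the working mechanism; each callable is an assumed
    # stand-in for the corresponding module, not an API from the patent.
    def monitoring_loop(camera_frames, preprocess, detect, locate, display, executor):
        for raw in camera_frames:          # camera module 110
            frame = preprocess(raw)        # preprocessing module 120: decode, convert
            result = detect(frame)         # algorithm detection module 200
            target = locate(result)        # algorithm positioning module 310
            display(target)                # real-time display module 320
            if target is not None:
                executor.spray(target)     # decision execution module 330: move and spray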
2. Functional components
1) Data acquisition module 100
The data acquisition module 100 is a commonly used functional component; a camera development board, such as a Raspberry Pi, is selected;
a camera module 110, a preprocessing module 120 and a data output module 130 which interact in sequence are embedded.
(1) Camera module 110
The camera module 110 includes a conventional webcam or monitoring camera; its function is to acquire real-time image information of plant states in a real tea garden.
(2) Preprocessing module 120
The preprocessing module 120 includes a video stream information processing algorithm; its function is to perform real-time encoding and decoding operations and format conversion on the video stream information acquired by the camera module 110.
(3) Data output module 130
The data output module 130 includes a network transmission API; its function is to transmit the processed video frame information to the algorithm detection module 200 via IP/Wi-Fi.
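As an illustration of the acquisition chain (camera capture, preprocessing, network output), the following minimal Python sketch grabs frames with OpenCV, re-encodes them, and posts them over HTTP. The endpoint URL, camera id, resolution and JPEG transport are illustrative assumptions; the patent fixes none of them.

    # Sketch of camera capture -> preprocessing -> network send.
    # Endpoint, camera id and resolution are illustrative assumptions.
    import cv2
    import requests

    def acquire_and_send(endpoint="http://192.168.1.10:8000/frames", cam_id=0):
        cap = cv2.VideoCapture(cam_id)               # webcam or RTSP monitoring camera
        try:
            while True:
                ok, frame = cap.read()               # decoded BGR video frame
                if not ok:
                    break
                frame = cv2.resize(frame, (1920, 1080))   # unify resolution
                ok, buf = cv2.imencode(".jpg", frame)     # re-encode for transport
                if ok:
                    requests.post(endpoint, data=buf.tobytes(),
                                  headers={"Content-Type": "image/jpeg"})
        finally:
            cap.release()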
2) Algorithm detection module 200
The algorithm detection module 200 is a commonly used functional component, such as an NVIDIA Jetson series embedded development board;
embedded are a pest detection module 210, a tea detection module 220, and a tea status detection module 230.
(1) Pest and disease detection module 210
The pest and disease detection module 210 includes a knowledge distillation target detection model based on attention transfer trained on a real-world pest and disease dataset; its function is to perform pest and disease detection on the tea garden plant information collected by the data acquisition module 100.
(2) Tea grass detection module 220
The tea grass detection module 220 includes a knowledge distillation target detection model based on attention transfer trained on a real-world tea-grass dataset; its function is to perform tea-grass detection on the tea garden plant information collected by the data acquisition module 100;
(3) Tea status detection module 230
The tea status detection module 230 includes a multi-attention fine-grained image classification network model trained on a real-world tea status dataset; its function is to identify and judge the tea status in the tea garden information acquired by the data acquisition module 100.
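For illustration, a minimal PyTorch sketch of such a classifier is given below: an EfficientNet backbone with a squeeze-and-excitation-style channel-attention head. The attention design, the backbone variant (B0 here for brevity; the embodiment names B7) and the class count are assumptions, since the patent only names the model family.

    # Sketch of a multi-attention fine-grained classifier; attention head,
    # backbone variant and class count are illustrative assumptions.
    import torch.nn as nn
    from torchvision.models import efficientnet_b0

    class MultiAttentionClassifier(nn.Module):
        def __init__(self, num_classes=3, channels=1280):   # classes are assumed
            super().__init__()
            self.backbone = efficientnet_b0(weights=None).features  # feature extractor
            self.attn = nn.Sequential(                  # channel attention (SE-style)
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, channels // 16, 1), nn.ReLU(inplace=True),
                nn.Conv2d(channels // 16, channels, 1), nn.Sigmoid(),
            )
            self.head = nn.Linear(channels, num_classes)

        def forward(self, x):
            f = self.backbone(x)                        # (N, C, H, W) feature map
            f = f * self.attn(f)                        # reweight channels
            f = f.mean(dim=(2, 3))                      # global average pooling
            return self.head(f)                         # class logits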
3) Robot module 300
Referring to FIG. 1, a robot module 300 is a common functional component, such as a ZYNQ-based control system;
an algorithm positioning module 310, a real-time display module 320 and a decision execution module 330 are embedded;
the algorithm positioning module 310, the real-time display module 320 and the decision execution module 330 interact sequentially, and the algorithm positioning module 310 and the decision execution module 330 interact back and forth.
(1) Algorithm positioning module 310
As shown in fig. 2, the algorithm positioning module 310 includes a multi-frame matching module 311, a coordinate positioning module 312, and an information output module 313;
the multi-frame matching module 311, the coordinate positioning module 312 and the information output module 313 interact sequentially, and the multi-frame matching module 311 and the information output module 313 interact back and forth;
the main functions are as follows: the accuracy of the algorithm detection is further improved through the multi-frame matching module 311, and then information is output to the coordinate positioning module 312 for target positioning according to different detection results, and is output through the information output module 313.
(2) Real-time display module 320
The real-time display module 320 is a common functional component, such as a display using an HDMI/DP interface; its function is to display the coordinate information located by the algorithm positioning module 310 on the screen in real time so that its accuracy can be conveniently confirmed.
(3) Decision execution module 330
As shown in fig. 3, the decision execution module 330 includes a control module 331 and an execution module 332 that interact back and forth.
Its function is as follows: the control module 331 reads and writes each sensor channel in parallel through a control state machine in the FPGA of the ZYNQ chip; the sensor inputs include the relative coordinate information output by the algorithm positioning module 310. It also uses a Kalman filtering algorithm for real-time fusion of dynamic low-level redundant sensor data and integrates the data into the execution module 332, so that the robot module 300 achieves accurate outdoor navigation and target positioning and performs the corresponding spraying operation.
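The sensor-fusion idea can be illustrated with a minimal linear Kalman filter that fuses noisy position readings (for example, odometry corrected by the vision module's relative coordinates) into one estimate. The constant-velocity model, noise levels and scalar measurement are assumptions for illustration; the filter actually used in the control module is not specified at this level of detail.

    # Sketch of linear Kalman fusion of noisy 1-D position readings.
    # Model, noise levels and time step are illustrative assumptions.
    import numpy as np

    class KalmanFilter1D:
        def __init__(self, q=1e-3, r=1e-1):
            self.x = np.zeros(2)                      # state: [position, velocity]
            self.P = np.eye(2)                        # state covariance
            self.Q = q * np.eye(2)                    # process noise
            self.R = np.array([[r]])                  # measurement noise
            self.H = np.array([[1.0, 0.0]])           # we observe position only

        def step(self, z, dt=0.05):
            F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity transition
            self.x = F @ self.x                       # predict
            self.P = F @ self.P @ F.T + self.Q
            y = np.array([z]) - self.H @ self.x       # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
            self.x = self.x + (K @ y).ravel()         # update
            self.P = (np.eye(2) - K @ self.H) @ self.P
            return self.x[0]                          # fused position estimate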
2. Method
As shown in Fig. 4, the method comprises the following steps:
S1, the data acquisition module acquires video stream information in real time through a camera, encodes it, performs format conversion and preprocessing, and finally sends the result to the algorithm detection module;
S2, the algorithm detection module selects an algorithm from the trained algorithm models according to different requirements, and sends the detection result to the robot module;
S3, the robot module receives the detection result, first performs multi-frame matching to further optimize it, and sends the processing result to the control module;
S4, the control module uses a Kalman filtering algorithm to fuse multi-sensor data in real time, and controls the remote sprinkler to move and perform range spraying.
Specifically, the following steps are followed:
(1) The camera module 110 uses a network camera or monitoring camera; the acquired video stream is encoded and compressed in real time and transmitted to the preprocessing module 120; the preprocessing module 120 decodes the video stream and performs format conversion and picture preprocessing on the video frames; the data output module 130 transmits the received video frame information to the algorithm detection module 200;
(2) The algorithm detection module 200 first performs algorithm selection to determine the algorithm it will use (a dispatch sketch follows this list):
A. If the pest and disease detection module 210 is selected, the system invokes the pest and disease detection algorithm, which uses a knowledge distillation target detection model based on attention transfer (including but not limited to the YOLO series), trained on a real-world pest and disease dataset;
B. If the tea grass detection module 220 is selected, the system invokes the tea-grass detection algorithm, which uses a knowledge distillation target detection model based on attention transfer (including but not limited to the YOLO series), trained on a real-world tea-grass dataset;
C. If the tea status detection module 230 is selected, the system invokes the tea status detection algorithm, which uses a multi-attention fine-grained image classification network model (including but not limited to the EfficientNet series), trained on a real-world tea status dataset;
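A minimal sketch of this selection step is a dispatch table from task name to loaded model. The task keys, file names and the load_model callable are illustrative assumptions; the patent only specifies that one of three trained models is selected.

    # Sketch of the algorithm-selection step as a dispatch table.
    # Keys, file names and load_model are illustrative assumptions.
    def build_detector_registry(load_model):
        return {
            "pest":       load_model("pest_distilled_yolov5.pt"),      # module 210
            "tea_grass":  load_model("teagrass_distilled_yolov5.pt"),  # module 220
            "tea_status": load_model("tea_status_efficientnet.pt"),    # module 230
        }

    def detect(registry, task, frame):
        model = registry[task]    # pick the module requested for this run
        return model(frame)       # detection boxes or a status classification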
(3) The algorithm positioning module 310 receives the detection result from the algorithm detection module 200 and begins multi-frame matching in the multi-frame matching module 311;
if the detection result comes from the pest and disease detection module 210 or the tea grass detection module 220, the module takes 5 frames as the reference window and judges whether the detected and output target frames are consistent across them; if so, the detection result is output to the coordinate positioning module 312, otherwise nothing is output;
if the detection result comes from the tea status detection module 230 and indicates a water-shortage status, the module takes 150 frames as the reference window and judges whether more than half of the 150 classification results indicate water shortage; if so, the classification result is output to the information output module 313 (a sketch of these two matching rules follows);
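A sketch of the two matching rules in Python; the per-frame result format is an illustrative assumption, as the patent does not fix it.

    # Sketch of multi-frame matching: a 5-frame consistency check for detections
    # and a 150-frame majority vote for water shortage.
    from collections import Counter

    def detections_consistent(frame_results, window=5, conf_thresh=0.5):
        # frame_results: list of per-frame lists of (class_id, confidence, box)
        recent = frame_results[-window:]
        if len(recent) < window:
            return False
        class_sets = [sorted(c for c, conf, _ in dets if conf > conf_thresh)
                      for dets in recent]
        return bool(class_sets[0]) and all(s == class_sets[0] for s in class_sets)

    def water_shortage_by_majority(status_history, window=150):
        # status_history: list of per-frame labels from the status classifier
        recent = status_history[-window:]
        if len(recent) < window:
            return False
        return Counter(recent).get("water_shortage", 0) > window // 2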
(4) The coordinate positioning module 312 receives the information output by the multi-frame matching module 311, calculates the center-point coordinates of each detected target frame in the received video frame information, and outputs the center coordinates of each target frame to the information output module 313 in order from top to bottom, as in the sketch below;
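The center-point computation and top-to-bottom ordering reduce to a few lines; the box format (x1, y1, x2, y2) in pixel coordinates is an assumption.

    # Sketch of the coordinate positioning step: box centers, ordered top to bottom.
    def box_centers_top_to_bottom(boxes):
        # boxes: iterable of (x1, y1, x2, y2) pixel corners (assumed format)
        centers = [((x1 + x2) / 2.0, (y1 + y2) / 2.0) for x1, y1, x2, y2 in boxes]
        return sorted(centers, key=lambda c: c[1])   # smaller y = higher in the image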
(5) The information output module 313 transmits the information sent by the multi-frame matching module 311 or the coordinate positioning module 312 to the real-time display module 320 and the decision execution module 330 in turn;
(6) the real-time display module 320 displays the received detection result on the screen in real time;
(7) The control module 331 uses a Kalman filtering algorithm for real-time fusion of dynamic low-level redundant sensor data. If the control module 331 receives target-frame center coordinate information, it begins reading and writing each sensor channel, controls the spraying device in the execution module 332 to move to the corresponding position, and executes the pesticide or herbicide spraying operation; if it receives the information that more than half of the classification results indicate water shortage, it controls the remote sprinkler to perform range spraying.
3. Examples
The embodiment of the invention requires the following equipment: a Raspberry Pi camera module, an NVIDIA Jetson series embedded development board, and a ZYNQ-7020 chip;
Step 1: in this embodiment, the deep learning training sample set comes from image data collected in a tea garden in Enshi, Hubei; first, the constructed target detection algorithm and fine-grained image classification algorithm are trained on the collected dataset, and the trained models are finally deployed to the NVIDIA Jetson series development board;
Step 2: first, the camera module is calibrated to obtain its intrinsic and extrinsic parameters; the camera module 110 then invokes the Raspberry Pi camera module, encodes and compresses the acquired video stream in real time, and transmits it to the preprocessing module 120; the preprocessing module 120 decodes the video stream and performs format conversion on the video frames, converting BGRx-format frames to BGR format and the frame resolution to 1920×1080; the data output module 130 transmits the received video frame information to the algorithm detection module 200 (a calibration and format-conversion sketch follows);
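A minimal OpenCV sketch of this step is given below. The chessboard pattern size, image paths and the treatment of BGRx as a padded four-channel (BGRA-like) format are illustrative assumptions.

    # Sketch of step 2: chessboard calibration for intrinsics/extrinsics, then
    # per-frame BGRx -> BGR conversion and resizing. Paths and pattern are assumed.
    import glob
    import cv2
    import numpy as np

    def calibrate(pattern=(9, 6), images="calib/*.jpg"):
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)
        obj_pts, img_pts, size = [], [], None
        for path in glob.glob(images):
            gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_pts.append(objp)
                img_pts.append(corners)
                size = gray.shape[::-1]
        # returns the camera matrix and distortion (intrinsics) plus per-view
        # rotation/translation vectors (extrinsics)
        return cv2.calibrateCamera(obj_pts, img_pts, size, None, None)

    def preprocess(frame_bgrx):
        frame = cv2.cvtColor(frame_bgrx, cv2.COLOR_BGRA2BGR)  # drop padding channel
        return cv2.resize(frame, (1920, 1080))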
Step 3: the algorithm detection module 200 performs algorithm selection and determines the algorithm it will use, specifically as follows (a sketch of the attention-transfer distillation loss follows this list):
1) If the pest and disease detection module 210 is selected, the system invokes the pest and disease detection algorithm and uses the knowledge distillation target detection model based on attention transfer trained on the real-world pest and disease dataset; here YOLOv5l distills YOLOv5s through multiple CBAM attention modules to obtain a new model;
2) If the tea grass detection module 220 is selected, the system invokes the tea-grass detection algorithm and uses the knowledge distillation target detection model based on attention transfer trained on the real-world tea-grass dataset; here YOLOv5l distills YOLOv5s through multiple coordinate attention (CA) modules to obtain a new model;
3) If the tea status detection module 230 is selected, the system invokes the tea status detection algorithm and uses the multi-attention fine-grained image classification network model trained on the real-world tea status dataset, with EfficientNet-B7 as the backbone network;
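For illustration, a minimal PyTorch sketch of an attention-transfer distillation loss follows: the student (e.g. YOLOv5s) is trained to match the teacher's (e.g. YOLOv5l) spatial attention maps at matched layers, alongside its usual task loss. The layer pairing, normalization and weighting are assumptions; the patent does not publish its exact loss.

    # Sketch of an attention-transfer distillation loss; layer pairing and
    # weighting are illustrative assumptions.
    import torch.nn.functional as F

    def attention_map(feat):
        # collapse channels: mean of squared activations, then L2-normalize
        a = feat.pow(2).mean(dim=1).flatten(1)        # (N, H*W)
        return F.normalize(a, dim=1)

    def attention_transfer_loss(student_feats, teacher_feats):
        loss = 0.0
        for fs, ft in zip(student_feats, teacher_feats):
            if fs.shape[-2:] != ft.shape[-2:]:        # align spatial sizes if needed
                fs = F.interpolate(fs, size=ft.shape[-2:], mode="bilinear",
                                   align_corners=False)
            loss = loss + (attention_map(fs) - attention_map(ft)).pow(2).mean()
        return loss

    # total loss (sketch): detection_loss + beta * attention_transfer_loss(...)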
Step 4: the algorithm positioning module 310 receives the detection result from the algorithm detection module and optimizes the result, specifically as follows:
1) Multi-frame matching 311 is performed on the received detection results. If the result comes from the pest and disease detection module 210 or the tea grass detection module 220, the module takes 5 frames as the reference window and judges whether the detected and output target frames are consistent across them with confidence greater than 0.5; if so, the relative coordinates of the detected target frames are output to the coordinate positioning module 312, otherwise nothing is output. If the result comes from the tea status detection module 230, the module takes 150 frames as the reference window and judges whether more than half of the 150 classification results indicate water shortage with confidence greater than 0.5; if so, the classification result is output to the information output module 313;
2) The coordinate positioning module 312 receives the information output by the multi-frame matching module 311, calculates the center-point coordinates of each detected target frame in the received video frame information, and outputs the center coordinates of each target frame to the information output module 313 in order from top to bottom;
3) The information output module 313 transmits the information sent by the multi-frame matching module 311 or the coordinate positioning module 312 to the real-time display module 320 and the decision execution module 330 in turn;
Step 5: the camera's intrinsic and extrinsic parameters are written into the control program to reduce errors. The control module 331 uses a Kalman filtering algorithm for real-time fusion of dynamic low-level redundant sensor data; if the control module 331 receives target-frame center coordinate information, it begins reading and writing each sensor channel, controls the spraying device in the execution module 332 to move to the position corresponding to the target-frame center coordinates, and executes the pesticide or herbicide spraying operation; if it receives the information that more than half of the classification results indicate water shortage, it controls the remote spraying device to perform range spraying;
Step 6: the real-time display module 320 displays the detection result on the screen in real time, ensuring that execution is correct.
In this embodiment, the algorithm detection module improves detection efficiency, the algorithm positioning module performs result optimization and target positioning, and the robot module finally realizes real-time monitoring and management of the tea garden plant state. The invention uses low-cost embedded devices, sensors and the like, greatly reducing manual labor while lowering cost, and has broad market prospects and application value.

Claims (3)

1. A tea garden plant state monitoring and management system based on visual perception, characterized in that:
the system comprises a data acquisition module (100), an algorithm detection module (200) and a robot module (300) which communicate in sequence;
the data acquisition module (100) is embedded with a camera module (110), a preprocessing module (120) and a data output module (130) which interact in sequence;
the algorithm detection module (200) is embedded with a pest and disease detection module (210), a tea grass detection module (220) and a tea state detection module (230);
the robot module (300) is embedded with an algorithm positioning module (310), a real-time display module (320) and a decision execution module (330); the three interact in sequence, and the algorithm positioning module (310) and the decision execution module (330) interact back and forth;
the algorithm detection module (200) comprises:
the pest and disease detection module (210) comprises a knowledge distillation target detection model based on attention transfer trained on a real-world pest and disease dataset, and performs pest and disease detection on the tea garden plant information acquired by the data acquisition module (100);
the tea grass detection module (220) comprises a knowledge distillation target detection model based on attention transfer trained on a real-world tea-grass dataset, and performs tea-grass detection on the tea garden plant information acquired by the data acquisition module (100);
the tea state detection module (230) comprises a multi-attention fine-grained image classification network model trained on a real-world tea status dataset, and performs tea status identification and judgment on the tea garden tea information acquired by the data acquisition module (100);
in the robot module (300):
the algorithm positioning module (310) comprises a multi-frame matching module (311), a coordinate positioning module (312) and an information output module (313); the multi-frame matching module (311), the coordinate positioning module (312) and the information output module (313) interact in sequence, and the multi-frame matching module (311) and the information output module (313) interact back and forth;
the real-time display module (320) displays the coordinate information located by the algorithm positioning module (310) on a screen in real time so that its accuracy can be conveniently confirmed;
the decision execution module (330) comprises a control module (331) and an execution module (332) which interact back and forth;
the control module (331) reads and writes each sensor channel in parallel through a control state machine in the FPGA of the ZYNQ chip; the sensor inputs include the relative coordinate information output by the algorithm positioning module (310); it also uses a Kalman filtering algorithm for real-time fusion of dynamic low-level redundant sensor data and integrates the data into the execution module (332), so that the robot module (300) achieves accurate outdoor navigation and target positioning and performs the corresponding spraying operation;
the algorithm positioning module (310) receives the detection result from the algorithm detection module and optimizes the result, specifically as follows:
multi-frame matching (311) is performed on the received detection results: if the result comes from the pest and disease detection module (210) or the tea grass detection module (220), 5 frames are taken as the reference window and it is judged whether the detected and output target frames are consistent across them with confidence greater than 0.5; if so, the relative coordinates of the detected target frames are output to the coordinate positioning module (312), otherwise nothing is output; if the result comes from the tea state detection module (230), 150 frames are taken as the reference window and it is judged whether more than half of the 150 classification results indicate water shortage with confidence greater than 0.5; if so, the classification result is output to the information output module (313).
2. A tea garden plant state monitoring and management system based on visual perception as claimed in claim 1, wherein:
in the data acquisition module (100):
the camera module (110) comprises a network camera or a monitoring camera, and acquires real-time plant state image information of a real tea garden;
the preprocessing module (120) comprises a video stream information processing algorithm, and performs real-time encoding and decoding operations and format conversion on the video stream information acquired by the camera module (110);
the data output module (130) comprises a network transmission API, and transmits the processed video frame information to the algorithm detection module (200) via IP/Wi-Fi.
3. A tea garden plant state monitoring and management method based on the system of claim 1 or 2, characterized by comprising the following steps:
S1, the data acquisition module acquires video stream information in real time through a camera, encodes it, performs format conversion and preprocessing, and finally sends the result to the algorithm detection module;
S2, the algorithm detection module selects an algorithm from the trained algorithm models according to different requirements, and sends the detection result to the robot module;
S3, the robot module receives the detection result, first performs multi-frame matching to further optimize it, and sends the processing result to the control module;
S4, the control module uses a Kalman filtering algorithm to fuse multi-sensor data in real time, and controls the remote sprinkler to move and perform range spraying.
CN202111039520.XA 2021-09-06 2021-09-06 Tea garden plant state monitoring and management system and method based on visual perception Active CN113781551B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111039520.XA CN113781551B (en) 2021-09-06 2021-09-06 Tea garden plant state monitoring and management system and method based on visual perception

Publications (2)

Publication Number Publication Date
CN113781551A (en) 2021-12-10
CN113781551B (en) 2023-10-31

Family

ID=78841163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111039520.XA Active CN113781551B (en) 2021-09-06 2021-09-06 Tea garden plant state monitoring and management system and method based on visual perception

Country Status (1)

Country Link
CN (1) CN113781551B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109470299A (en) * 2018-10-19 2019-03-15 江苏大学 A kind of plant growth information monitoring system and method based on Internet of Things
CN110832597A (en) * 2018-04-12 2020-02-21 因美纳有限公司 Variant classifier based on deep neural network
CN110892484A (en) * 2018-07-11 2020-03-17 因美纳有限公司 Deep learning-based framework for identifying sequence patterns causing sequence-specific errors (SSEs)
CN112464959A (en) * 2020-12-12 2021-03-09 中南民族大学 Plant phenotype detection system and method based on attention and multiple knowledge migration
CN112990262A (en) * 2021-02-08 2021-06-18 内蒙古大学 Integrated solution system for monitoring and intelligent decision of grassland ecological data

Also Published As

Publication number Publication date
CN113781551A (en) 2021-12-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant