CN114514914A - Intelligent sensing fertilization and pesticide spraying method and device - Google Patents

Intelligent sensing fertilization and pesticide spraying method and device

Info

Publication number
CN114514914A
CN114514914A (application CN202111621352.5A)
Authority
CN
China
Prior art keywords
crop
nitrogen content
development board
camera
raspberry pi
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111621352.5A
Other languages
Chinese (zh)
Inventor
Su Wenhao (苏文浩)
Liu Boyuan (刘博远)
Wang Yahong (王亚虹)
Peng Yankun (彭彦昆)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Agricultural University
Original Assignee
China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Agricultural University filed Critical China Agricultural University
Priority to CN202111621352.5A priority Critical patent/CN114514914A/en
Publication of CN114514914A publication Critical patent/CN114514914A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0025 Mechanical sprayers
    • A01M7/0032 Pressure sprayers
    • A01M7/0042 Field sprayers, e.g. self-propelled, drawn or tractor-mounted
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C PLANTING; SOWING; FERTILISING
    • A01C21/00 Methods of fertilising, sowing or planting
    • A01C21/007 Determining fertilization requirements
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C PLANTING; SOWING; FERTILISING
    • A01C23/00 Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
    • A01C23/007 Metering or regulating systems
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C PLANTING; SOWING; FERTILISING
    • A01C23/00 Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
    • A01C23/04 Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0089 Regulating or controlling systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G06T5/70
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation
    • G06T2207/30188 Vegetation; Agriculture
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental Sciences (AREA)
  • Soil Sciences (AREA)
  • Insects & Arthropods (AREA)
  • Zoology (AREA)
  • Wood Science & Technology (AREA)
  • Pest Control & Pesticides (AREA)
  • Water Supply & Treatment (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Mechanical Engineering (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Geometry (AREA)
  • Catching Or Destruction (AREA)
  • Fertilizing (AREA)

Abstract

The invention relates to an intelligent sensing fertilization and pesticide spraying method and device. In the crop nitrogen content and disease detection process, the method positions each crop first and photographs it afterwards, which reduces computation time and greatly accelerates identification and detection. Speed optimizations in image acquisition, positioning and recognition make real-time detection, fertilization and pesticide spraying practical; at the same time, fertilization is combined with disease identification and pesticide spraying, realizing multi-faceted crop yield management.

Description

Intelligent sensing fertilization and pesticide spraying method and device
Technical Field
The invention relates to the technical field of agricultural intelligent sensing equipment, and in particular to an intelligent sensing fertilization and pesticide spraying method and device.
Background
During the growth stage of lettuce, fertilizer must be applied to supplement nutrition, and diseases must be detected so that fertilizer and pesticide can be sprayed in time. Topdressing is an important way to promote crop growth and accounts for more than one third of the total fertilizing amount. Mineral elements are essential to plant growth and development; once a mineral element is lacking, the plant shows the corresponding deficiency symptoms, yield drops, and production efficiency falls sharply. Disease detection is the most important task in the crop growth stage: if pesticide cannot be applied in time, yield is seriously affected. Common lettuce diseases include gray mold, soft rot and leaf blight. Advances in computer vision and deep learning now make it possible to detect diseases at fixed points and to apply pesticide at fixed points.
The national Ministry of Agriculture has promulgated fertilizer-reduction and pesticide-reduction policies, aimed at cutting chemical fertilizer use while improving its efficiency and reducing pesticide use while controlling crop damage. Traditional fertilization is quantified by region and cannot match the actual demand of each individual crop; conventional large-area fertilization wastes fertilizer and greatly harms the environment. Accurately locating each plant and fertilizing at fixed points is therefore indispensable for protecting the environment and conserving fertilizer.
Existing disease identification and nitrogen content prediction methods are mainly spectral, machine-vision based, or a combination of the two. Spectral detection collects leaf spectra and builds a mathematical model relating the spectra to the nitrogen content or to some element that reflects disease, then predicts disease or nitrogen content through that model. Spectral detection is slower than machine vision: it can only detect, and cannot keep up with spraying and fertilizing while detecting in real time in the field. In combined spectral and machine-vision approaches, for example, a hyperspectral suite acquires hyperspectral data of lettuce leaf samples at the early, middle and late stages of anthracnose, Sclerotinia rot and powdery mildew, as well as in the healthy state; the raw spectra are denoised and smoothed with a polynomial smoothing algorithm; characteristic wavelengths are selected from the preprocessed data with the successive projections algorithm; color and texture features of the sample images are extracted with first- to third-order moments and the LBP texture operator; and finally the color, texture and spectral feature values train an SVR prediction model that classifies the prediction-set samples. Colloquially, two prediction methods are combined to find the best answer. Combining them increases the complexity of the algorithm structure; the speed still cannot keep up, only detection is possible, and real-time detection with simultaneous spraying and fertilizing remains speed-limited.
Therefore, how to design a method and device that quickly locate lettuce coordinates, judge fertilizer demand and disease through intelligent sensing, and spray accurately has become a problem to be solved in this field.
Disclosure of Invention
The invention aims to provide an intelligent sensing fertilization and pesticide spraying method and device. With this method, accurate, quantitative fertilization can be achieved in the topdressing stage, fertilizer utilization is maximized, and fertilizer waste and environmental pollution are reduced; the health of crops can be monitored during the planting stage, large-area pesticide application is avoided, and accurate fixed-point spraying is performed.
In order to achieve the purpose, the invention provides the following scheme:
An intelligent sensing fertilization and pesticide spraying device, comprising: a crop positioning module, a crop disease and nitrogen content detection module, a spraying module and a vehicle body module.
Optionally, the crop positioning module includes a dark box 33, a grayscale camera 18, an optical filter, excitation LED lamps 25 and 26, a first Raspberry Pi development board 20, a vehicle speed sensor 34 and a power supply;
the vehicle speed sensor 34 is connected with the first Raspberry Pi development board 20, and its signal controls the shooting interval of the grayscale camera;
the dark box 33 is enclosed by shading cloth and shields external light, forming an image acquisition working space with constant illumination during image capture;
the excitation LED lamps 25 and 26 are arranged at the corners of the dark box and excite fluorescent signals;
the grayscale camera 18 is connected to a USB port of the first Raspberry Pi development board 20 and collects and transmits image information; the first Raspberry Pi development board 20 locates crop coordinates; the optical filter is mounted on the lens of the grayscale camera 18 and filters out the background color;
optionally, the crop disease and nitrogen content detection module comprises an RGB camera 17, a second Raspberry Pi development board 21, and daylight LED lamps 23 and 24;
the output serial port of the first Raspberry Pi development board 20 is connected with the RGB camera 17, which is in turn connected to a USB port of the second Raspberry Pi development board 21; the RGB camera 17 receives working-moment signals and inputs image information to the second Raspberry Pi development board 21, which detects crop diseases and nitrogen content;
a dark-box environment is built with the shading cloth, and the daylight LED lamps 23 and 24 provide constant illumination, reducing the influence of light-source changes on image acquisition.
Optionally, the spraying module comprises an Arduino single-chip microcomputer 22, a conical nozzle 16, a plurality of liquid tanks, a booster pump 14, a high-pressure liquid pipe 15, a motor and a plurality of electromagnetic valves;
the output serial port of the second Raspberry Pi development board 21 is connected with the input serial port of the Arduino single-chip microcomputer 22 and transmits crop state information to it;
the output serial port of the Arduino single-chip microcomputer 22 is connected with the motor and the electromagnetic valves, and the Arduino single-chip microcomputer 22 controls the rotation of the motor and the opening and closing of the electromagnetic valves;
each liquid tank is connected with an electromagnetic valve; the lower end of each tank is connected to the high-pressure liquid pipe 15, on which the booster pump 14 is installed, and the end of the pipe carries the conical nozzle 16; liquid delivery from each tank is controlled by its electromagnetic valve, and during delivery liquid is supplied through the high-pressure liquid pipe 15 to the booster pump 14; after pressurization, the fertilizer is sprayed onto the crops through the conical nozzle 16, realizing fertilization and pesticide spraying.
The excitation LED lamps 25 and 26 are arranged on a chassis beam at the front of the vehicle body in the direction of travel; the daylight LED lamps 23 and 24 are arranged on a chassis beam at the rear, and the dark box and constant illumination together ensure stable image acquisition.
Optionally, the vehicle body module includes a profile frame 27, a towing connecting mechanism 19, four wheels, a dark box fixing support 4 and a control box 32;
the control box 32 is fixed to the profile frame 27 through a plurality of positioning holes; the profile frame 27 is fixed above the four wheels through the dark box fixing support 4;
the towing connecting mechanism 19 couples the device to a mobile agricultural vehicle so that it can be moved through the field.
The first Raspberry Pi development board 20, the second Raspberry Pi development board 21 and the Arduino single-chip microcomputer 22 are arranged in the control box 32; the spatial arrangement is shown in figure 3.
The dark box 33 is formed by enclosing the profile frame 27 with shading cloth.
An intelligent sensing fertilization and pesticide spraying method utilizes the intelligent sensing fertilization and pesticide spraying device, and comprises the following steps:
s1, carrying out crop signal marking treatment on the plants;
s2, collecting image data of the crops by using a gray level camera provided with an optical filter to obtain image information, and processing the image information to obtain the positioning of the crops;
s3, detecting the crop diseases and the nitrogen content by adopting a crop disease and nitrogen content detection module;
s4, the spraying module performs fertilization and pesticide spraying according to the crop diseases and the nitrogen content classification signal and the disease signal of the nitrogen content detection module.
In the step S1, the crop signal is sprayed in advance on the crop, the crop signal is rhodamine B solution, and the concentration of the rhodamine B solution is 8 ug/ml.
In step S2, the processing of the image information includes: carrying out noise reduction, binarization, feature area contour extraction, small-area abandoning and feature area central point extraction on the image; the method comprises the following specific steps: inputting image information into a raspberry group development board I, performing Gaussian blur processing on the image in the raspberry group development board I, performing threshold segmentation on the image by using the Otsu method, and further extracting a characteristic region contour; and performing vertex extraction on the contour of the feature region and calculating the pixel area of the contour, wherein the pixel area of the contour is smaller than 5000 of the feature region, otherwise, calculating the pixel coordinate of the center point of the feature region and converting the pixel coordinate system into a camera coordinate system.
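The positioning pipeline described above (binarization with an Otsu threshold, rejection of regions below 5000 pixels, and center-point extraction from the bounding rectangle) can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: on the Raspberry Pi one would normally use OpenCV (cv2.GaussianBlur, cv2.threshold with THRESH_OTSU, cv2.findContours); pure NumPy is used here only to keep the sketch self-contained, and a single bright region per frame is assumed.

```python
import numpy as np

MIN_PIXEL_AREA = 5000  # regions smaller than this are discarded, per the text

def otsu_threshold(gray):
    """Return the Otsu threshold of an 8-bit grayscale image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = gray.size
    mu_total = np.dot(np.arange(256), hist) / total
    best_t, best_var = 0, 0.0
    cum_w = cum_mu = 0.0
    for t in range(256):
        cum_w += hist[t] / total        # weight of the background class
        cum_mu += t * hist[t] / total   # first-moment accumulator
        if cum_w <= 0.0 or cum_w >= 1.0:
            continue
        mu_b = cum_mu / cum_w
        mu_f = (mu_total - cum_mu) / (1.0 - cum_w)
        var_between = cum_w * (1.0 - cum_w) * (mu_b - mu_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def locate_feature(gray):
    """Binarize, reject small regions, return the center pixel (Cv, Ch)."""
    mask = gray > otsu_threshold(gray)
    if mask.sum() < MIN_PIXEL_AREA:
        return None                     # region too small: discard
    rows, cols = np.nonzero(mask)
    x1, x3 = cols.min(), cols.max()     # opposite vertices of the
    y1, y3 = rows.min(), rows.max()     # minimum bounding rectangle
    return ((x1 + x3) / 2, (y1 + y3) / 2)
```

The returned (Cv, Ch) pair corresponds to the center-point formula given below; it would then be converted to camera coordinates via the calibration step.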
Optionally, 20 is taken as the segmentation threshold;
during feature extraction, the minimum circumscribed rectangle in the image is taken as the feature region, its vertex coordinates are read, and the pixel coordinates of the center point are determined as follows:
C_v = (x_1 + x_3) / 2,  C_h = (y_1 + y_3) / 2
where C_v and C_h denote the pixel abscissa and ordinate of the crop; x_1 and x_3 denote the abscissas of the feature region vertices; y_1 and y_3 denote the ordinates of the feature region vertices.
The pixel coordinates of the feature region vertices are converted into camera coordinates by the Zhang Zhengyou calibration method, and the camera coordinates of the feature region vertices are transmitted as output to the crop disease and nitrogen content detection module.
Step S3 is specifically as follows:
The camera coordinates of the feature region vertex are input to the second Raspberry Pi development board of the crop disease and nitrogen content detection module, which calculates the shooting delay time of the RGB camera from the vehicle speed information acquired by the vehicle speed sensor, as follows:
t_1 = s_2 / v - t_0
where t_1 denotes the shooting delay time; s_2 denotes the distance between the RGB camera and the grayscale camera; t_0 denotes the shutter time; v denotes the vehicle speed acquired by the vehicle speed sensor;
and the two raspberry group development boards control the RGB cameras to shoot according to the shooting delay time, the pictures are returned to the second raspberry group development board, and the second raspberry group development board identifies and detects diseases and nitrogen content.
With the camera coordinates of the feature region vertex as the center, an ROI region is extracted from the image. After receiving the ROI region, the second Raspberry Pi development board performs disease and nitrogen content detection simultaneously with a trained M2Det deep learning model; when a disease is identified, it transmits a disease signal to the Arduino single-chip microcomputer. The detected nitrogen content is grouped into four classes (low, medium, high and excess), each corresponding to one of four signals that the second Raspberry Pi development board transmits to the Arduino single-chip microcomputer.
The ROI region is an image window of 512 × 512 pixels centered on the camera coordinates.
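Extracting the 512 × 512 ROI centered on the detected coordinate can be sketched as below. The boundary clamping is an assumption: the source does not say how plants near the edge of the frame are handled, so this sketch shifts the window inward rather than padding.

```python
import numpy as np

ROI_SIZE = 512  # per the text: 512 x 512 pixel window

def extract_roi(image, cx, cy, size=ROI_SIZE):
    """Return a size x size window centred on (cx, cy), clamped to the frame."""
    h, w = image.shape[:2]
    half = size // 2
    # shift the window so it stays inside the frame (assumed behaviour)
    x0 = min(max(cx - half, 0), max(w - size, 0))
    y0 = min(max(cy - half, 0), max(h - size, 0))
    return image[y0:y0 + size, x0:x0 + size]
```

The M2Det model then predicts only on this window, which is what keeps per-frame inference time low.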
The M2Det deep learning model is built through the following steps: dataset preparation, model training, model prediction and model verification;
the dataset is collected from the network, selecting a number of diseased leaves and a number of healthy leaves, and is expanded by image enhancement: flipping, translation and manually added noise.
Step S4 is as follows: the Arduino single-chip microcomputer receives the nitrogen content classification signal and the disease signal, controls the corresponding electromagnetic valves accordingly, and runs the booster pump; each liquid tank, controlled by its electromagnetic valve, sprays after the spray delay time.
Control of the electromagnetic valves and the booster pump is realized by the Arduino single-chip microcomputer 22. The low, medium and high nitrogen signals each control the opening and closing of the corresponding electromagnetic valve; when the nitrogen content is excessive, all electromagnetic valves connected to tanks holding liquid nitrogen fertilizer are closed. The disease signal controls the opening and closing of the electromagnetic valve connected to the tank holding pesticide. By opening and closing the electromagnetic valves, different solutions are selected for spraying, achieving accurate spraying.
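The signal-to-valve mapping described above can be sketched as a small decision function. Valve numbers follow the reference numerals in the drawings (5, 6, 7 for the low/medium/high nitrogen fertilizer tanks, 11 for the pesticide tank); the string encoding of the signals is an assumption, since the source does not specify the serial protocol between the boards.

```python
# assumed mapping of nitrogen class to fertiliser-tank valve number
NITROGEN_VALVES = {"low": 5, "medium": 6, "high": 7}

def valves_to_open(nitrogen_class, disease_detected):
    """Return the set of solenoid valves the Arduino should open."""
    open_valves = set()
    if nitrogen_class in NITROGEN_VALVES:   # low / medium / high nitrogen
        open_valves.add(NITROGEN_VALVES[nitrogen_class])
    # "excess" nitrogen: all fertiliser valves stay closed (no branch needed)
    if disease_detected:
        open_valves.add(11)                 # pesticide tank valve
    return open_valves
```

For example, a plant with low nitrogen and a detected disease would open valves 5 and 11, while an excess-nitrogen, healthy plant opens nothing.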
The signal of the vehicle speed sensor 34 is transmitted to the Arduino single-chip microcomputer 22, which calculates the spray delay from the data transmitted by the vehicle speed sensor 34:
t = s_3 / v
where t denotes the spray delay; s_3 denotes the longitudinal distance between the grayscale camera 18 and the conical nozzle 16; v denotes the vehicle speed acquired by the vehicle speed sensor 34.
The conical nozzle 16, controlled by the booster pump 14 and the electromagnetic valves 5, 6, 7 and 11, sprays the selected solution after the spray delay, so that each crop is sprayed individually with the minimum amount of solution.
The invention has the following beneficial effects: in the crop nitrogen content and disease detection process, the method positions each crop first and photographs it afterwards, which reduces computation time and greatly accelerates identification and detection. Speed optimizations in image acquisition, positioning and recognition make real-time detection, fertilization and pesticide spraying practical; at the same time, fertilization, disease identification and pesticide spraying are combined, realizing multi-faceted crop production management.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
The invention has the following drawings:
FIG. 1 is a schematic diagram of the overall structure of an apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the internal structure of the device according to the embodiment of the present invention;
FIG. 3 is a schematic diagram of the longitudinal layout of the apparatus according to the embodiment of the present invention;
FIG. 4 is a schematic view of a chassis arrangement provided by an embodiment of the present invention;
FIG. 5 is a flow chart of a method for locating a crop in an embodiment of the present invention;
FIG. 6 is a flow chart of a method for detecting crop diseases and nitrogen content in an embodiment of the present invention;
FIG. 7 is a flow chart of a spraying method in an embodiment of the present invention;
FIG. 8 is a schematic diagram of a method of iterative identification adjustment in accordance with an embodiment of the present invention;
1, 8, 9, 13 - control box fixing holes; 2, 3, 10, 12 - liquid tanks; 4 - dark box fixing support; 5, 6, 7, 11 - electromagnetic valves; 14 - booster pump; 15 - high-pressure liquid pipe; 16 - conical nozzle; 17 - RGB camera; 18 - grayscale camera; 19 - towing connecting mechanism; 20 - first Raspberry Pi development board; 21 - second Raspberry Pi development board; 22 - Arduino single-chip microcomputer; 23, 24 - daylight LED lamps; 25, 26 - excitation LED lamps; 27 - profile frame; 28, 29, 30, 31 - wheels; 32 - control box; 33 - dark box; 34 - vehicle speed sensor; 35 - mileage sensor.
Detailed Description
To make the objects, features and advantages of the present invention more comprehensible, the invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
An intelligent sensing fertilization and pesticide spraying method comprises the following steps:
s1, carrying out crop signal marking treatment on the plants;
the crop signal is sprayed on the crop in advance, and optionally, the crop signal is rhodamine B solution, and the concentration of the rhodamine B solution is 8 ug/ml.
S2, collecting image data of the crops by using a gray level camera provided with an optical filter to obtain image information, and processing the image information to obtain the positioning of the crops;
the processing of the image information includes: carrying out noise reduction, binaryzation, contour extraction, small-area region abandonment and feature region central point extraction on the image;
specifically, image information is input into a raspberry group development board I, Gaussian blur processing is carried out on the image in the raspberry group development board I, and an Otsu method is used for carrying out threshold segmentation on the image so as to extract the contour of a characteristic region; and (4) performing vertex extraction on the contour of the characteristic region and calculating the area of the contour, wherein the area of a pixel of the contour is less than 5000 of the characteristic region, otherwise, calculating the pixel coordinate of the center point of the characteristic region and converting the pixel coordinate system into a camera coordinate system.
Optionally, 20 is taken as the segmentation threshold;
during feature extraction, the minimum circumscribed rectangle in the image is taken as the feature region and its vertex coordinates are read (the feature region is a rectangle, and the vertex coordinates are the coordinates of its four vertices); the pixel coordinates of the center point are determined as follows:
C_v = (x_1 + x_3) / 2,  C_h = (y_1 + y_3) / 2
where C_v and C_h denote the pixel abscissa and ordinate of the crop; x_1 and x_3 denote the abscissas of the feature region vertices; y_1 and y_3 denote the ordinates of the feature region vertices.
C_v and C_h are pixel coordinates and can be converted into camera coordinates with the Zhang Zhengyou calibration method.
The pixel abscissas and ordinates of the feature region vertices are converted into camera coordinates with the Zhang Zhengyou calibration method, and the camera coordinates of the feature region vertices are transmitted as output to the crop disease and nitrogen content detection module.
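The pixel-to-camera-coordinate conversion mentioned above can be sketched with a pinhole back-projection. In practice the intrinsic matrix K would come from a Zhang Zhengyou checkerboard calibration (e.g. cv2.calibrateCamera in OpenCV); the intrinsic values and the fixed working distance Z below are assumptions for illustration only, plausible because the camera sits at a fixed height inside the dark box.

```python
import numpy as np

# assumed intrinsics from a prior calibration (fx, fy, cx, cy are illustrative)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
Z = 0.5  # assumed fixed camera-to-plant distance in metres

def pixel_to_camera(u, v, K=K, z=Z):
    """Back-project pixel (u, v) to camera coordinates at known depth z."""
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]
    return np.array([x, y, z])
```

A pixel at the principal point maps to (0, 0, Z) directly below the camera, which is what lets the downstream boards compute the spatial offset of each plant.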
The extracted coordinate points serve as input to the second Raspberry Pi development board 21, providing the photographing signal for the RGB camera 17 of the crop disease and nitrogen content detection module and the working signal for the lower computer;
s3, detecting crop diseases and nitrogen content, comprising the following steps:
The camera coordinates of the feature point are input to the second Raspberry Pi development board 21 of the crop disease and nitrogen content detection module, which calculates the shooting delay time of the RGB camera 17 from the vehicle speed information acquired by the vehicle speed sensor 34, as follows:
t_1 = s_2 / v - t_0
where t_1 denotes the shooting delay time; s_2 denotes the distance between the RGB camera and the grayscale camera; t_0 denotes the shutter time; v denotes the vehicle speed acquired by the vehicle speed sensor.
The method converts dynamic continuous shooting into dynamic fixed-point shooting: every image then has the same, easily segmented background, with the crop at the center of the image. This simplifies the identification process, reduces the loss value of the pre-trained deep learning model, and improves the accuracy and stability of the model, an innovation in the rapid identification of plant diseases.
The second Raspberry Pi development board 21 controls the RGB camera 17 to take a picture according to the shooting delay time; the picture is transmitted back to the second Raspberry Pi development board 21, which identifies and detects diseases and nitrogen content;
the identification and detection process adopts a deep learning algorithm: with the extracted camera coordinates as the center, an ROI region is extracted from the image, and the deep learning model predicts only on the ROI region; the ROI region is an image window of 512 × 512 pixels centered on the camera coordinates.
The identification and detection of diseases and nitrogen content comprise the following steps: data set preparation, model training, model prediction, model verification and detection;
the deep learning algorithm adopts the M2Det deep learning model, which realizes the identification of diseases and the determination of the fertilizing amount;
the data set was obtained by network collection: 11021 diseased leaves and 5463 healthy leaves; the data set was expanded through image augmentation (flipping, translation, manual noise addition, etc.), yielding a final data set of 15000 diseased leaves and 9850 healthy leaves;
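The augmentation modes listed above (flipping, translation, manual noise) can be sketched with NumPy as follows; the shift size and noise amplitude are illustrative choices, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img: np.ndarray) -> list:
    """Return flipped, translated and noise-added variants of a leaf image."""
    flipped = img[:, ::-1].copy()             # horizontal flip
    shifted = np.roll(img, shift=10, axis=1)  # 10-pixel horizontal translation
    noise = rng.integers(-20, 21, img.shape, dtype=np.int16)
    noisy = np.clip(img.astype(np.int16) + noise, 0, 255).astype(np.uint8)
    return [flipped, shifted, noisy]

leaf = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
variants = augment(leaf)   # three augmented copies per original image
```

Such simple geometric and noise transforms roughly account for the growth of the data set from 11021/5463 to 15000/9850 images.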
the processing process is completed by a second raspberry group development board 21, the second raspberry group development board 21 receives the ROI area and then simultaneously performs disease and nitrogen content detection operation through a trained M2Det deep learning model, and when a disease is identified, the second raspberry group development board 21 transmits a disease signal to an Arduino single-chip microcomputer 22; grouping the detected nitrogen content, classifying the detected nitrogen content into four types according to low, medium, high and excessive nitrogen content, respectively corresponding to four signals, and transmitting the four signals to an Arduino single chip microcomputer 22 through a raspberry group development board II 21;
S4, fertilizing and pesticide spraying, wherein the spraying process is as follows: the Arduino single-chip microcomputer 22 receives the nitrogen content classification signal and the disease signal; the electromagnetic valves 5, 6, 7 and 11 are controlled accordingly; the booster pump works; and the liquid tanks are controlled by the electromagnetic valves to spray according to the spraying delay.
The electromagnetic valves and the booster pump are controlled by the Arduino single-chip microcomputer 22. The low, medium and high nitrogen content signals control the opening and closing of electromagnetic valves 5, 6 and 7, respectively; when the excess signal is received, electromagnetic valves 5, 6 and 7 are all closed. The disease signal controls the opening and closing of electromagnetic valve 11. By opening and closing the electromagnetic valves, different solutions are selected for spraying, achieving the goal of precise spraying.
The signal of the vehicle speed sensor 34 is transmitted to the Arduino single-chip microcomputer 22, which calculates the spraying delay from this data using the following formula:
t = s3 / v
wherein t represents the spraying delay; s3 represents the longitudinal distance between the grayscale camera 18 and the conical nozzle 16; v represents the vehicle speed acquired by the vehicle speed sensor 34.
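Combining the spraying delay above with the valve mapping described earlier, the decision logic can be sketched as follows; the function and parameter names are illustrative assumptions, and the actual control runs on the Arduino single-chip microcomputer:

```python
# Nitrogen-level -> fertiliser valve, plus the pesticide valve, per the description.
NITROGEN_VALVE = {"low": 5, "medium": 6, "high": 7}   # "excess" opens none
PESTICIDE_VALVE = 11

def spray_plan(nitrogen_level: str, diseased: bool, s3_m: float, speed_mps: float):
    """Return (spraying delay t = s3 / v, list of valves to open)."""
    delay = s3_m / speed_mps
    valves = []
    if nitrogen_level in NITROGEN_VALVE:   # excess nitrogen: no fertiliser valve
        valves.append(NITROGEN_VALVE[nitrogen_level])
    if diseased:
        valves.append(PESTICIDE_VALVE)
    return delay, valves

# e.g. low nitrogen, diseased plant, nozzle 0.8 m behind the camera at 0.25 m/s:
delay, valves = spray_plan("low", True, s3_m=0.8, speed_mps=0.25)
```

The returned valve list corresponds to the solenoid valves the Arduino would energise after waiting out the delay.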
The conical nozzle 16, controlled by the booster pump 14 and the electromagnetic valves 5, 6, 7 and 11, sprays the selected solution onto the crop according to the spraying delay, realizing per-plant spraying with the minimum amount of solution.
Fertilization and pesticide spraying can be performed simultaneously, provided the applied pesticide and fertilizer produce no adverse reaction (such as toxicity to crops after mixing); if an adverse reaction exists, they cannot be performed simultaneously.
Example one
As shown in figs. 1-4, this example provides an intelligent sensing fertilization and pesticide spraying device for lettuce;
as shown in fig. 1, the overall structure of the device mainly comprises a control box 32, a dark box 33, wheels 28, 29, 30 and 31, and a towing connection mechanism 19;
specifically, in this example the control box 32 is mounted on the profile frame 27 through positioning holes 1, 8, 9 and 13; the dark box 33 is fixed on the vehicle body through the dark box fixing bracket 4; the towing connection mechanism 19 is used to connect a mobile agricultural vehicle.
As shown in figs. 2 and 3, the main internal structure of the device comprises a profile frame 27, liquid tanks 2, 3, 10 and 12, electromagnetic valves 5, 6, 7 and 11, a first Raspberry Pi development board 20, a second Raspberry Pi development board 21, an Arduino single-chip microcomputer 22, a booster pump 14, a high-pressure liquid pipe 15 and a conical nozzle 16; the liquid tanks 2, 3, 10 and 12 can be loaded with liquid nitrogen fertilizers of different concentrations and/or pesticides; the electromagnetic valves 5, 6, 7 and 11 control the opening and closing of the four liquid tanks respectively, thereby controlling the fertilization concentration and the pesticide application timing.
Specifically, in this example the first Raspberry Pi development board 20, the second Raspberry Pi development board 21 and the Arduino single-chip microcomputer 22 are fixed in the control box and connected to the grayscale camera 18, the RGB camera 17, the electromagnetic valves 5, 6, 7 and 11, the booster pump 14, the vehicle speed sensor 34 and the mileage sensor 35.
As shown in fig. 4, the chassis of the device mainly carries the grayscale camera 18, the RGB camera 17, excitation LED lamps 25 and 26, and daylight LED lamps 23 and 24;
specifically, in this example the grayscale camera 18 and the RGB camera 17 are arranged on the central chassis beam; the excitation LED lamps 25 and 26 are arranged on the front cross beam; the daylight LED lamps 23 and 24 are arranged on the rear cross beam of the vehicle.
Example two
As shown in figs. 5-8, this example also provides an intelligent sensing fertilization and pesticide spraying method for lettuce, comprising the following steps:
Step 1: FIG. 5 is a flow chart of the crop positioning method. A rhodamine B solution is sprayed on the crop in advance; with an optical filter mounted on the grayscale camera 18, the acquired image contains only crop information. The image undergoes a series of operations, including noise reduction, binarization, contour extraction, small-area region rejection and contour center-point extraction, to obtain the crop coordinate information. As shown in fig. 5, the grayscale camera 18 acquires the image, and the image information is input into the first Raspberry Pi development board 20, where the image is processed with Gaussian blur and threshold-segmented using the Otsu method to extract the feature region contour. Vertex extraction is performed on the feature region contour and its area is calculated; feature regions with a pixel area smaller than 5000 are discarded; otherwise, the pixel coordinates of the feature region center point are calculated, the pixel coordinate system is converted into the camera coordinate system, and the feature point coordinates in the camera coordinate system are input into the second Raspberry Pi development board 21. Combined with the method shown in fig. 6, the algorithm model is simplified; because the rhodamine B solution is sprayed in advance, the original image is easy to process and the processing speed is high.
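The positioning pipeline in step 1 can be sketched as follows (NumPy only; the Gaussian blur and contour-extraction steps are omitted for brevity, and a single bright region is assumed, so this is a simplification of the actual flow rather than the patented implementation):

```python
import numpy as np

MIN_AREA = 5000  # pixel-area rejection threshold from the description

def otsu_threshold(gray: np.ndarray) -> int:
    """Otsu's method: pick the threshold maximising between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = gray.size
    cum_w = np.cumsum(hist)                    # class-0 pixel counts
    cum_m = np.cumsum(hist * np.arange(256))   # class-0 intensity sums
    best_t, best_var = 0, -1.0
    for t in range(255):
        w0, w1 = cum_w[t], total - cum_w[t]
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_m[t] / w0
        m1 = (cum_m[-1] - cum_m[t]) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def locate_crop(gray: np.ndarray):
    """Binarise, reject small regions, return the foreground centre (col, row)."""
    binary = gray > otsu_threshold(gray)
    ys, xs = np.nonzero(binary)
    if xs.size < MIN_AREA:          # small-area region: discard
        return None
    return float(xs.mean()), float(ys.mean())

gray = np.zeros((300, 300), dtype=np.uint8)
gray[100:220, 50:170] = 200         # synthetic fluorescent crop region
centre = locate_crop(gray)
```

The centre point found here would then be converted from pixel to camera coordinates before being passed on to the detection module.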
Step 2: FIG. 6 is a flow chart of the crop disease and nitrogen content detection method. The M2Det model is trained on the collected data set, and the trained model is used for detection, avoiding the slow speed of continuous recognition and detection. As shown in fig. 6, the vehicle speed sensor 34 inputs data into the second Raspberry Pi development board 21, which calculates the camera shooting delay, controls the RGB camera 17 to acquire an image according to the shooting delay, and receives the acquired image back. The feature region is extracted according to the feature point camera coordinates, disease identification and nitrogen content detection are performed on the feature region with the trained model, and a pesticide spraying signal is output when a disease is identified; the nitrogen content is graded and a grading signal is output, and a nitrogen fertilizer spraying signal is output when the nitrogen content level is below the maximum threshold.
Step 3: FIG. 7 is a flow chart of the fertilization and pesticide spraying method. Spraying is decided according to the crop positioning information and the detection results; during spraying, the opening and closing of the liquid tanks are controlled by the electromagnetic valves, and the liquid is sprayed through the nozzle after being pressurized by the booster pump. As shown in fig. 7, the vehicle speed sensor 34 inputs data into the Arduino single-chip microcomputer 22, which calculates the nozzle spraying delay. After receiving a pesticide spraying signal, the Arduino single-chip microcomputer 22 controls the booster pump to work and opens electromagnetic valve 11; after receiving the nitrogen fertilizer spraying signal and the classification signal, it controls the booster pump to work and operates electromagnetic valve 5, 6 or 7 according to the classification signal, realizing the spraying of pesticide and nitrogen fertilizer.
This example also provides a solution to the problem of repeated identification, as shown in fig. 8. FIG. 8 shows the handling of repeated crops during crop positioning: the position of the vehicle relative to the origin O is obtained through the mileage sensor, the camera-frame crop coordinates are converted into coordinates relative to the fixed origin, and a threshold is set; a crop whose origin-frame coordinate distance to a recorded crop is smaller than the threshold is judged to be a repeated crop, so that secondary fertilization and pesticide spraying are avoided.
At position 1, the crops identified by the grayscale camera 18 are crop 1 and crop 2; at position 2, the crops identified by the grayscale camera 18 are crop 2 and crop 3. At position 1, the coordinates of crop 1 are (x4, y4, 0) and the coordinates of crop 2 are (x5, y5, 0); at position 2, the coordinates of crop 2 are (x'5, y'5, 0) and the coordinates of crop 3 are (x6, y6, 0).
ki = s1 + s4 + xi
K1 = ki - k(i-1)
K2 = ki - k(i-2)
wherein s1 represents the distance from the mileage sensor 35 to the origin O; s4 represents the distance between the mileage sensor 35 and the grayscale camera 18; xi is the x-axis coordinate of the crop in the camera coordinate system; K1 and K2 are decision values. A threshold K = 2 is set; when K1 > 2 or K2 > 2, the point is determined to be a valid crop point; otherwise, it is determined to be a repeated crop point.
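The repeat-detection rule above can be sketched as follows; it generalises the K1/K2 comparison to a check against all previously recorded crops, which is an assumption beyond the two-back comparison in the text, and the helper names are illustrative:

```python
K_THRESHOLD = 2.0   # distance threshold K from the description

def to_origin_frame(s1: float, s4: float, x_cam: float) -> float:
    """Map a camera-frame detection to the fixed origin O: k_i = s1 + s4 + x_i."""
    return s1 + s4 + x_cam

def filter_repeats(detections, s4: float):
    """detections: iterable of (s1_odometry, x_camera) pairs.
    Returns origin-frame positions of the crops kept as valid points."""
    recorded = []
    for s1, x_cam in detections:
        k = to_origin_frame(s1, s4, x_cam)
        if all(abs(k - prev) > K_THRESHOLD for prev in recorded):
            recorded.append(k)   # valid (new) crop point
        # otherwise: repeated crop point, skip to avoid double spraying
    return recorded

# Crop 2 is seen at two vehicle positions; it maps to nearly the same k
# both times and is therefore recorded only once:
crops = filter_repeats([(0.0, 1.0), (0.0, 4.0), (3.1, 1.0)], s4=0.5)
```

Keeping only origin-frame positions makes the de-duplication independent of where along the track the crop was first detected.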
The above embodiments are merely illustrative, and not restrictive, and those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the invention, and therefore all equivalent technical solutions also belong to the scope of the invention.
Those not described in detail in this specification are within the skill of the art.

Claims (10)

1. An intelligent sensing fertilization and pesticide spraying device, characterized by comprising: a crop positioning module, a crop disease and nitrogen content detection module, a spraying module and a vehicle body module; the crop positioning module, the crop disease and nitrogen content detection module and the spraying module are all arranged on the vehicle body module;
the crop positioning module comprises a dark box (33), a grayscale camera (18), an optical filter, excitation LED lamps (25, 26), a first Raspberry Pi development board (20), a vehicle speed sensor (34) and a power supply; the vehicle speed sensor (34) is connected with the first Raspberry Pi development board (20); the dark box (33) is enclosed by light-shielding cloth and is used for blocking external light; the excitation LED lamps (25, 26) are arranged at the corners of the dark box and are used for exciting the fluorescent signal; the grayscale camera (18) is connected to a USB port of the first Raspberry Pi development board (20) and is used for acquiring and transmitting image information; the first Raspberry Pi development board (20) is used for locating the crop coordinates; the optical filter is mounted on the lens of the grayscale camera (18) and is used for filtering the background color;
the crop disease and nitrogen content detection module comprises an RGB camera (17), a second Raspberry Pi development board (21) and daylight LED lamps (23, 24); the output serial port of the first Raspberry Pi development board (20) is connected to the RGB camera (17), and the RGB camera (17) is connected to a USB port of the second Raspberry Pi development board (21); the RGB camera (17) is used for receiving working-moment signals and inputting image information to the second Raspberry Pi development board (21); the second Raspberry Pi development board (21) is used for detecting crop diseases and nitrogen content; the daylight LED lamps (23, 24) are used for constant illumination;
the spraying module comprises: an Arduino single-chip microcomputer (22), a conical nozzle (16), a plurality of liquid tanks, a booster pump (14), a high-pressure liquid pipe (15), a motor and a plurality of electromagnetic valves; the output serial port of the second Raspberry Pi development board (21) is connected to the input serial port of the Arduino single-chip microcomputer (22); the output serial port of the Arduino single-chip microcomputer (22) is connected to the motor and the plurality of electromagnetic valves, and the Arduino single-chip microcomputer (22) controls the rotation of the motor and the opening and closing of the plurality of electromagnetic valves; each liquid tank is connected with an electromagnetic valve, the lower end of each liquid tank is connected with the high-pressure liquid pipe (15), the booster pump (14) is installed on the high-pressure liquid pipe (15), and the end of the high-pressure liquid pipe (15) is connected with the conical nozzle (16).
2. The device of claim 1, wherein: the vehicle body module comprises: a profile frame (27), a towing connection mechanism (19), four wheels, a dark box fixing bracket (4) and a control box (32); the control box (32) is fixed on the profile frame (27) through a plurality of positioning holes; the profile frame (27) is fixed above the four wheels through the dark box fixing bracket (4); the towing connection mechanism (19) is used for connection with a mobile agricultural vehicle; the first Raspberry Pi development board (20), the second Raspberry Pi development board (21) and the Arduino single-chip microcomputer (22) are arranged in the control box (32).
3. The device of claim 1, wherein: the excitation LED lamps (25, 26) are arranged on the chassis beam at the front in the vehicle body advancing direction; the daylight LED lamps (23, 24) are arranged on the chassis cross beam at the rear in the vehicle body advancing direction.
4. An intelligent sensing fertilization and pesticide spraying method, using the intelligent sensing fertilization and pesticide spraying device of any one of claims 1 to 3, characterized by comprising the following steps:
S1, performing crop signal marking treatment on the crops;
S2, acquiring image data of the crops with a grayscale camera fitted with an optical filter to obtain image information, and processing the image information to obtain the crop positions;
S3, detecting the crop diseases and nitrogen content with the crop disease and nitrogen content detection module;
S4, performing fertilization and pesticide spraying with the spraying module according to the nitrogen content classification signal and the disease signal from the crop disease and nitrogen content detection module.
5. The intelligent sensing fertilization and pesticide spraying method of claim 4, wherein: in step S1, the crop signal is sprayed on the crop in advance; the crop signal is a rhodamine B solution with a concentration of 8 μg/ml.
6. The intelligent sensing fertilization and pesticide spraying method of claim 4, wherein in step S2 the processing of the image information comprises: noise reduction, binarization, feature region contour extraction, small-area region rejection and feature region center-point extraction of the image; specifically: the image information is input into the first Raspberry Pi development board, where the image is processed with Gaussian blur and threshold-segmented using the Otsu method to extract the feature region contour; vertex extraction is performed on the feature region contour and its pixel area is calculated; feature regions whose contour pixel area is smaller than 5000 are discarded; otherwise, the pixel coordinates of the feature region center point are calculated and the pixel coordinate system is converted into the camera coordinate system.
7. The intelligent sensing fertilization and pesticide spraying method of claim 6, wherein: when feature extraction is performed on the image, the minimum circumscribed rectangle in the image is taken as the feature region, the vertex coordinates of the feature region are read, and the pixel coordinates are determined by the following expression:
Cv = (x1 + x3) / 2, Ch = (y1 + y3) / 2
wherein Cv and Ch respectively represent the pixel abscissa and pixel ordinate of the crop; x1 and x3 respectively represent the abscissas of the feature region vertices; y1 and y3 respectively represent the ordinates of the feature region vertices;
the pixel coordinates of the feature region vertices are converted into camera coordinates by the Zhang Zhengyou calibration method, and the camera coordinates of the feature region vertices are transmitted as output to the crop disease and nitrogen content detection module.
8. The intelligent sensing fertilization and pesticide spraying method as claimed in claim 7, wherein the step S3 is as follows:
inputting the camera coordinates of the feature region vertices into the second Raspberry Pi development board of the crop disease and nitrogen content detection module; the second Raspberry Pi development board calculates the shooting delay time of the RGB camera according to the vehicle speed information acquired by the vehicle speed sensor, using the following formula:
t1 = s2 / v - t0
wherein t1 represents the photographing delay time; s2 represents the distance between the RGB camera and the grayscale camera; t0 represents the shutter time; v represents the vehicle speed acquired by the vehicle speed sensor;
the second Raspberry Pi development board controls the RGB camera to take a picture according to the shooting delay time; the picture is transmitted back to the second Raspberry Pi development board, which identifies and detects diseases and nitrogen content.
9. The intelligent sensing fertilization and pesticide spraying method of claim 8, wherein: taking the camera coordinates of the feature region vertices as the center, an ROI region is extracted from the image; after receiving the ROI region, the second Raspberry Pi development board performs disease and nitrogen content detection simultaneously through the trained M2Det deep learning model; when a disease is identified, the second Raspberry Pi development board transmits a disease signal to the Arduino single-chip microcomputer; the detected nitrogen content is classified into four groups according to low, medium, high and excess nitrogen content, respectively corresponding to four signals, which are transmitted to the Arduino single-chip microcomputer by the second Raspberry Pi development board;
the ROI area is an image range with the camera coordinate as the center and the size of 512 pixels by 512 pixels;
the M2Det deep learning model is established through the following steps: preparing a data set, training a model, predicting the model and verifying the model;
the data set is obtained by network collection and comprises a plurality of diseased leaves and a plurality of healthy leaves; the data set is expanded through image augmentation by flipping, translation and manual noise addition.
10. The intelligent sensing fertilization and pesticide spraying method of claim 4, wherein the step S4 is as follows: the Arduino single-chip microcomputer receives the nitrogen content classification signal and the disease signal; the corresponding electromagnetic valves are controlled according to the signals; and the liquid tanks are controlled by the electromagnetic valves to spray according to the spraying delay;
the control of the electromagnetic valve and the booster pump is realized by an Arduino single chip microcomputer; the low, medium and high nitrogen content three-level signals respectively control the opening and closing of corresponding electromagnetic valves; when the nitrogen content is excessive, all electromagnetic valves connected with the liquid tank filled with the liquid nitrogen fertilizer are closed; the opening and closing of an electromagnetic valve connected with a liquid tank filled with pesticide are controlled by a disease signal;
the signal of the vehicle speed sensor is transmitted to the Arduino single-chip microcomputer, which calculates the spraying delay from this data using the following formula:
t = s3 / v
wherein t represents the spraying delay; s3 represents the longitudinal distance between the grayscale camera and the conical nozzle; v represents the vehicle speed acquired by the vehicle speed sensor.
CN202111621352.5A 2021-12-28 2021-12-28 Intelligent sensing fertilization and pesticide spraying method and device Pending CN114514914A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111621352.5A CN114514914A (en) 2021-12-28 2021-12-28 Intelligent sensing fertilization and pesticide spraying method and device


Publications (1)

Publication Number Publication Date
CN114514914A (en) 2022-05-20

Family

ID=81596299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111621352.5A Pending CN114514914A (en) 2021-12-28 2021-12-28 Intelligent sensing fertilization and pesticide spraying method and device

Country Status (1)

Country Link
CN (1) CN114514914A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115115823A (en) * 2022-08-25 2022-09-27 深圳市城市交通规划设计研究中心股份有限公司 Road disease positioning and correcting method, device and equipment and readable storage medium
CN117378341A (en) * 2023-09-15 2024-01-12 佳木斯大学 Intelligent irrigation system based on soybean growth information

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103959973A (en) * 2014-04-18 2014-08-06 浙江大学 Refined crop fertilization system and nitrogenous fertilizer fertilization method
CN104076711A (en) * 2014-06-25 2014-10-01 中国农业大学 On-line fixed point predicating method for target soil fertilizer injection at crop root area
CN105631884A (en) * 2016-01-06 2016-06-01 上海交通大学 Crops spike number field active measurement device and method
CN109673609A (en) * 2019-01-17 2019-04-26 北京农业智能装备技术研究中心 Pesticide spraying traceability system
CN110235882A (en) * 2019-06-28 2019-09-17 南京农业大学 A kind of accurate variable chemical application to fruit tree robot based on multisensor
CN111066442A (en) * 2019-12-04 2020-04-28 中国农业大学 Targeted variable fertilization method and device for corn and application
CN111587872A (en) * 2020-06-24 2020-08-28 重庆文理学院 Robot for spraying pesticide
CN113100207A (en) * 2021-04-14 2021-07-13 郑州轻工业大学 Accurate formula pesticide application robot system based on wheat disease information and pesticide application method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WEN-HAO SU et al.: "Development of a systemic crop signalling system for automated real-time plant care in vegetable crops", Biosystems Engineering *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination