CN112925310A - Control method, device, equipment and storage medium of intelligent deinsectization system - Google Patents

Control method, device, equipment and storage medium of intelligent deinsectization system Download PDF

Info

Publication number
CN112925310A
CN112925310A (application number CN202110086758.1A; granted as CN112925310B)
Authority
CN
China
Prior art keywords: image data, unmanned ground vehicle, distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110086758.1A
Other languages
Chinese (zh)
Other versions
CN112925310B (en)
Inventor
李致富
杜佳荣
曾俊海
王明
吴晋宇
Current Assignee
Guangzhou University
Original Assignee
Guangzhou University
Priority date
Application filed by Guangzhou University filed Critical Guangzhou University
Priority to CN202110086758.1A
Publication of CN112925310A
Application granted
Publication of CN112925310B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M - CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0089 - Regulating or controlling systems
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Pest Control & Pesticides (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Signal Processing (AREA)
  • Insects & Arthropods (AREA)
  • Artificial Intelligence (AREA)
  • Wood Science & Technology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a control method, device, equipment and storage medium for an intelligent pest control system. First image data of the road at a first distance ahead of an unmanned ground vehicle, in its direction of travel, is acquired through a first camera, and second image data of the road at the same first distance is acquired through a second camera. When the first distance is greater than a first threshold and less than a second threshold, the first image data and the second image data are fused to obtain fused image data; road curvature data at the first distance ahead of the vehicle is determined from the fused image data; a front wheel steering angle of the unmanned ground vehicle is determined from the road curvature data and the first distance based on a predictive control algorithm; and the unmanned ground vehicle is controlled to advance according to the front wheel steering angle. The method can effectively improve the stability of the unmanned ground vehicle on complex road sections in the intelligent pest control system. The application can be widely applied in the technical field of agricultural pest control.

Description

Control method, device, equipment and storage medium of intelligent deinsectization system
Technical Field
The application relates to the technical field of agricultural pest control, and in particular to a control method, device, equipment and storage medium of an intelligent pest control system.
Background
In recent years, with the rapid development of high-resolution remote sensing, machine vision and control technology, efficient, high-precision and low-cost crop health monitoring has become possible. For example, the related art includes schemes in which an unmanned vehicle automatically sprays pesticides, which greatly reduces farmers' workload and can increase crop yield to a certain extent.
However, in practice, controlling the travel of the unmanned vehicle proves difficult because of the constraints of the working environment. In some applications, the unmanned vehicle is operated by surveying and planning the path through the crop area in advance and programming the corresponding control information; this approach has a narrow application range, requires substantial labor cost, and yields low benefit. There is therefore a need to solve these technical problems in the related art.
Disclosure of Invention
The present application aims to solve at least one of the technical problems in the related art to some extent.
Therefore, an object of the embodiments of the present application is to provide a control method of an intelligent pest control system that can effectively improve the stability and operating efficiency of an unmanned vehicle performing pest control.
Another object of the embodiments of the present application is to provide a control device of an intelligent pest control system.
In order to achieve the technical purpose, the technical scheme adopted by the embodiment of the application comprises the following steps:
In a first aspect, an embodiment of the present application provides a control method for an intelligent pest control system, where the intelligent pest control system includes an unmanned ground vehicle, a first camera, a pest control device, and an unmanned aerial vehicle, the first camera and the pest control device are disposed on the unmanned ground vehicle, and the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle; the unmanned aerial vehicle is provided with a second camera;
the control method comprises the following steps:
acquiring first image data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the first camera, and acquiring second image data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the second camera;
when the first distance is greater than a first threshold and less than a second threshold, fusing the first image data and the second image data to obtain fused image data;
determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the fused image data;
determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
and controlling the unmanned ground vehicle to advance according to the steering angle of the front wheels.
In addition, the method according to the above embodiment of the present application may further have the following additional technical features:
further, in an embodiment of the present application, the fusing the first image data and the second image data to obtain fused image data includes:
grouping the first image data to obtain multiple groups of first image sub-data, and determining a first variance of each group of the first image sub-data;
grouping the second image data to obtain multiple groups of second image sub-data, and determining a second variance of each group of the second image sub-data;
determining a first optimal variance of the first image data according to the first variance, and determining a second optimal variance of the second image data according to the second variance;
determining a first weight corresponding to the first image data and a second weight corresponding to the second image data according to the first optimal variance and the second optimal variance;
and fusing the first image data and the second image data according to the first weight and the second weight to obtain fused image data.
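The grouping-and-variance steps above can be sketched in a few lines. This is a minimal illustration, assuming the "optimal variance" is the usual minimum-variance (inverse-variance) combination of the group variances; the patent gives its formula only as an equation image:

```python
from statistics import pvariance

def optimal_variance(samples, groups=3):
    """Split measurements into equal groups, take each group's variance,
    and combine them into one 'optimal' variance.  The inverse-variance
    combination below is an assumed reading of the patent's formula:
    1/var* = sum(1/var_i)."""
    n = len(samples) // groups
    group_vars = [pvariance(samples[i * n:(i + 1) * n]) for i in range(groups)]
    return 1.0 / sum(1.0 / v for v in group_vars)
```

For example, nine samples split into three groups whose variances are each 2/3 combine to an optimal variance of 2/9, smaller than any single group's variance, which is the point of the fusion.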
Further, in an embodiment of the present application, the first image data is grouped to obtain three groups of first image sub-data;
determining a first optimal variance of the first image data according to the first variance specifically comprises:
determining the first optimal variance by the minimum-variance combination of the three group variances (the explicit formula appears only as an equation image in the original; the form below is the combination consistent with the variance-based weighting used later):

1/σx*² = 1/σx1² + 1/σx2² + 1/σx3²

where σx*² is the first optimal variance of the first image data, σx1² is the first variance corresponding to the first group of first image sub-data, σx2² is the first variance corresponding to the second group of first image sub-data, and σx3² is the first variance corresponding to the third group of first image sub-data.
Further, in an embodiment of the present application, the method further includes the following steps:
determining road curvature data for a first distance from the unmanned ground vehicle in a direction of travel of the unmanned ground vehicle based on the first image data when the first distance is less than a first threshold.
Further, in an embodiment of the present application, the method further includes the following steps:
and when the first distance is larger than a second threshold value, determining road curvature data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle according to the second image data.
Further, in one embodiment of the present application, determining the front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance comprises:
acquiring running information of the unmanned ground vehicle; the running information comprises the vehicle speed of the unmanned ground vehicle, the mass of the unmanned ground vehicle, the yaw moment of inertia of the unmanned ground vehicle, and the cornering stiffness of the front wheels of the unmanned ground vehicle;
establishing a dynamic state equation of the unmanned ground vehicle according to the running information;
and determining the front wheel steering angle of the unmanned ground vehicle according to the dynamic state equation and a predictive control algorithm.
Further, in an embodiment of the present application, the method further includes the following steps:
performing Kalman filtering on the first image data and the second image data.
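A scalar sketch of such filtering follows. The patent does not give its filter model or parameters, so the constant-state model and the values of q and r below are illustrative assumptions:

```python
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter that could smooth a noisy per-frame
    road measurement (e.g. a lane-offset value) before fusion.
    q: process noise, r: measurement noise, x0/p0: initial state/covariance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                  # predict: process noise inflates covariance
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # correct with measurement z
        p = (1.0 - k) * p          # posterior covariance
        estimates.append(x)
    return estimates
```

Fed a constant measurement, the estimate converges to it while the gain settles to a small steady-state value, which is what suppresses frame-to-frame noise.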
In a second aspect, an embodiment of the present application provides a control device of an intelligent pest control system, where the intelligent pest control system includes an unmanned ground vehicle, a first camera, a pest control device, and an unmanned aerial vehicle, the first camera and the pest control device are disposed on the unmanned ground vehicle, and the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle; the unmanned aerial vehicle is provided with a second camera;
the control device includes:
the acquisition module is used for acquiring first image data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the first camera and acquiring second image data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the second camera;
the fusion module is used for fusing the first image data and the second image data to obtain fused image data when the first distance is greater than a first threshold and less than a second threshold;
a curvature determination module for determining road curvature data of a first distance from the unmanned ground vehicle in the direction of travel of the unmanned ground vehicle according to the fused image data;
a steering angle prediction module for determining a front wheel steering angle of the unmanned ground vehicle from the road curvature data and the first distance based on a predictive control algorithm;
and the output module is used for controlling the unmanned ground vehicle to advance according to the steering angle of the front wheels.
In a third aspect, an embodiment of the present application further provides a computer device, including:
at least one processor;
at least one memory for storing at least one program;
when the at least one program is executed by the at least one processor, the at least one program causes the at least one processor to implement the control method of the intelligent pest control system according to the first aspect.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing a program executable by a processor; when executed by the processor, the program implements the control method of the intelligent pest control system according to the first aspect.
Advantages and benefits of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application:
in the control method of the intelligent pest control system in the embodiment of the application, first image data of the road at a first distance ahead of the unmanned ground vehicle, in its direction of travel, is acquired through the first camera, and second image data of the road at the same first distance is acquired through the second camera; when the first distance is greater than a first threshold and less than a second threshold, the first image data and the second image data are fused to obtain fused image data; road curvature data at the first distance ahead of the unmanned ground vehicle is determined from the fused image data; a front wheel steering angle of the unmanned ground vehicle is determined from the road curvature data and the first distance based on a predictive control algorithm; and the unmanned ground vehicle is controlled to advance according to the front wheel steering angle. The method can effectively improve the stability of the unmanned ground vehicle on complex road sections in the intelligent pest control system, reduce the influence of complex road topography on the pest control process, improve pesticide spraying efficiency, and help increase crop yield.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the following description is made on the drawings of the embodiments of the present application or the related technical solutions in the prior art, and it should be understood that the drawings in the following description are only for convenience and clarity of describing some embodiments in the technical solutions of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic view of an intelligent pest control system according to an embodiment of the present application;
Fig. 2 is a schematic flowchart of a control method of an intelligent pest control system according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a control device of an intelligent pest control system according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application. The step numbers in the following embodiments are provided only for convenience of illustration, the order between the steps is not limited at all, and the execution order of each step in the embodiments can be adapted according to the understanding of those skilled in the art.
Precision agriculture developed in the late 1980s and is an important part of the modern agricultural revolution; in the future it will serve as a breakthrough for promoting rural agricultural modernization. It is estimated that 85% of future grain yield growth will come from crop yield optimization and improvements in farming technology. Specifically, precision agriculture builds on modern means such as the 3S technologies (remote sensing (RS), geographic information systems (GIS) and the Global Positioning System (GPS)), sensor technology and Internet-of-Things technology, and aims at precise control of the farming process: precisely monitoring crop growth, disasters and other conditions, precisely adjusting farming inputs according to what is monitored, and precisely tilling, irrigating, fertilizing, applying pesticide, sowing and harvesting, in short obtaining an equal or higher harvest with the least input. In this regard, health monitoring of crops is one of the foundations of precision agriculture and has received great attention from experts and farmers.
Based on these application requirements, an embodiment of the present application provides an intelligent pest control system. Referring to Fig. 1, the intelligent pest control system in this embodiment includes an unmanned ground vehicle 3, a first camera, a pest control device and an unmanned aerial vehicle 2. The first camera and the pest control device are arranged on the unmanned ground vehicle 3; the first camera photographs the road conditions ahead of the vehicle so that the background control system can guide the travel of the unmanned ground vehicle 3 accordingly. The pest control device sprays pesticide on the crops 1; in some embodiments, it may comprise a pesticide reservoir and a spray head. The unmanned aerial vehicle 2 tracks and flies above the unmanned ground vehicle 3; on the one hand it collects road information ahead of the unmanned ground vehicle 3 and feeds it back to the background control system, and on the other hand it collects the pest distribution on the crops 1 near the unmanned ground vehicle 3, so that the amount of pesticide sprayed can be controlled for more accurate and economical pest control. Specifically, the unmanned aerial vehicle 2 in this embodiment may be a rotor-type drone: it can take off and land vertically and hover in the air, making it suitable for working in complex environments, and it can carry a visible-light digital camera and a multispectral digital camera, both denoted here as the second camera.
The intelligent pest control system in the embodiment of the application further comprises a background overall control system for receiving various kinds of information and controlling the unmanned aerial vehicle 2, the unmanned ground vehicle 3 and the other components.
Referring to Fig. 2, an embodiment of the present application provides a control method of an intelligent pest control system; the control method is mainly used for controlling the travel of the unmanned ground vehicle and includes the following steps:
Step 110, acquiring, through the first camera, first image data of the road at a first distance from the unmanned ground vehicle in its direction of travel, and acquiring, through the second camera, second image data of the road at the same first distance;
In this embodiment, as described above, the unmanned ground vehicle carries a first camera that photographs the road conditions ahead of the vehicle, producing image data recorded as the first image data. The unmanned aerial vehicle carries a second camera that collects image data of the road ahead of the unmanned ground vehicle, recorded as the second image data. By adjusting the downward shooting angles of the first camera and the second camera, the two cameras can acquire image data of the road at the same distance from the unmanned ground vehicle, for example both at 10 m or both at 8 m ahead; this distance is recorded as the first distance. It can be understood that the length of the first distance may be adjusted flexibly: when the unmanned ground vehicle travels faster, the first distance may suitably be longer, so that the road conditions ahead are collected earlier and the vehicle can be controlled conveniently; conversely, when the vehicle travels slowly, the first distance may suitably be shorter. In some embodiments, Kalman filtering may be performed on the first image data and the second image data to improve the accuracy of the data and reduce the interference of noise.
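The speed-dependent choice of the first distance can be sketched as a clamped linear rule. The gain and bounds here are illustrative; the text only states that the distance should grow with vehicle speed:

```python
def lookahead_distance(speed_mps, gain=1.2, d_min=2.0, d_max=12.0):
    """Preview ('first') distance for image acquisition: proportional to
    speed, clamped so the cameras always view a usable stretch of road.
    gain, d_min and d_max are illustrative values."""
    return max(d_min, min(d_max, gain * speed_mps))
```

A vehicle crawling at 1 m/s still previews 2 m ahead, while at higher speeds the preview distance saturates at the upper bound.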
Step 120, when the first distance is greater than a first threshold and less than a second threshold, fusing the first image data and the second image data to obtain fused image data;
In this embodiment, the first camera on the unmanned ground vehicle and the second camera on the unmanned aerial vehicle shoot from different heights, so their reliability differs when acquiring image data of the road at different distances. Specifically, when the stretch of road being imaged is close to the unmanned ground vehicle, the first image data from the vehicle-mounted first camera is more reliable; when the stretch of road is far from the vehicle, the second image data from the drone-mounted second camera is more reliable. Two distance thresholds may therefore be set, a first threshold and a second threshold with the second greater than the first; for example, the first threshold may be 2 m and the second 5 m. When the first distance between the imaged road and the unmanned ground vehicle is less than the first threshold, the first image data is used to determine the curvature data of the road ahead, i.e. the curvature of the road at the first distance in the direction of travel; when the first distance is greater than the second threshold, the second image data is used instead. This improves the accuracy of road curvature identification, making the subsequent control of the vehicle's advance, turning and so on more accurate.
When the first distance between the imaged road and the unmanned ground vehicle is greater than the first threshold and less than the second threshold, the first image data and the second image data can be fused, and the curvature data of the road ahead is determined from the fused image data, improving the accuracy of curvature identification. Because the fused image data combines data acquired by different image acquisition devices, the two sources complement each other and yield image data of higher precision.
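The threshold logic above reduces to a three-way selection. A minimal sketch, using the 2 m and 5 m example thresholds from the text (the return labels are illustrative):

```python
def choose_curvature_source(first_distance, t1=2.0, t2=5.0):
    """Pick which image stream drives road-curvature estimation."""
    if first_distance < t1:
        return "first"     # near road: vehicle-mounted camera is more reliable
    if first_distance > t2:
        return "second"    # far road: drone-mounted camera is more reliable
    return "fused"         # in between: fuse the two image streams
```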
In this embodiment, when the first image data and the second image data are fused to obtain fused image data, the first image data is first grouped into several groups of first image sub-data and the first variance of each group is determined; likewise, the second image data is grouped into several groups of second image sub-data and the second variance of each group is determined. A first optimal variance of the first image data and a second optimal variance of the second image data are then determined; from these, a first weight corresponding to the first image data and a second weight corresponding to the second image data are derived, and the first and second image data are fused according to the weights. Specifically, taking the division of the first image data into three groups as an example, each group is averaged and the variance of each group is determined from its mean, giving σx1², the first variance of the first group of first image sub-data, σx2², the first variance of the second group, and σx3², the first variance of the third group; the first optimal variance σx*² of the first image data is then determined by the minimum-variance combination (the explicit formula appears only as an equation image in the original):

1/σx*² = 1/σx1² + 1/σx2² + 1/σx3²

Similarly, the second image data may be divided into three groups and its second optimal variance determined by the same procedure. The first weight corresponding to the first image data then follows from the two optimal variances (again shown as an equation image in the original; the variance-ratio form below is consistent with the symbols defined there):

W1 = σz² / (σx² + σz²)

where W1 is the first weight, σz is the standard deviation corresponding to the second optimal variance, and σx is the standard deviation corresponding to the first optimal variance. Subtracting W1 from 1 gives the second weight W2 = 1 - W1. According to the first weight and the second weight, the first image data and the second image data are summed with these weights to obtain the fused image data.
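The weighting step can be sketched as follows, taking the variance-ratio reading of the weight formula at face value (the original gives it only as an image):

```python
def fuse_measurements(x, z, var_x, var_z):
    """Fuse a ground-camera measurement x and an aerial-camera measurement z
    of the same road feature.  w1 = var_z / (var_x + var_z) gives more weight
    to whichever source has the smaller optimal variance."""
    w1 = var_z / (var_x + var_z)   # first weight (ground-camera data)
    w2 = 1.0 - w1                  # second weight (aerial-camera data)
    return w1 * x + w2 * z
```

Equal variances give the plain average; shrinking var_x pulls the fused value toward x, exactly the "trust the more reliable source" behavior described above.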
Step 130, according to the fused image data, determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle;
step 140, determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
in the embodiment of the application, according to the fused image data, the road curvature data at the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle can be determined. And based on the road curvature data, the front wheel steering angle of the unmanned ground vehicle may be determined by a predictive control algorithm.
Specifically, after the road curvature data ahead of the unmanned ground vehicle is obtained, a dynamic state equation of the vehicle can be established from its real-time driving information and structural parameters. The equation has the standard lateral-dynamics form (its explicit matrix entries appear only as an equation image in the original):

ẋ = A·x + B·δ + D·cR

where x is the state vector of the unmanned ground vehicle, comprising the controlled distance error, controlled speed error, controlled angle error and controlled angular-acceleration error; δ is the front wheel steering angle; cR is the road curvature input; m is the mass of the unmanned ground vehicle, in kg; vx is the speed of the unmanned ground vehicle, in m/s; Iz is the yaw moment of inertia of the unmanned ground vehicle, in kg·m²; Cαf is the cornering stiffness of the front wheels, in N/rad; and lf is the distance from the center of gravity of the unmanned ground vehicle to the front axle, in m. The matrices A, B and D are built from the combined terms σ1 = 2(Cαf + Cαr) and σ2 = -2(lf·Cαf - lr·Cαr), together with a third term σ3 defined by an equation image in the original, where Cαr is the cornering stiffness of the rear wheels, in N/rad, and lr is the distance from the center of gravity to the rear axle, in m.
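The combined tire terms can be computed directly. σ1 and σ2 follow the text; the form of σ3 is an assumption (the usual bicycle-model term, since the original shows it only as an image):

```python
def stiffness_terms(c_af, c_ar, l_f, l_r):
    """sigma1, sigma2 as defined in the text; sigma3 assumed to be the
    standard 2*(l_f^2*C_af + l_r^2*C_ar) bicycle-model term."""
    s1 = 2.0 * (c_af + c_ar)
    s2 = -2.0 * (l_f * c_af - l_r * c_ar)
    s3 = 2.0 * (l_f ** 2 * c_af + l_r ** 2 * c_ar)
    return s1, s2, s3
```

For Cαf = Cαr = 80 000 N/rad, lf = 1.2 m, lr = 1.6 m this gives σ1 = 320 000, σ2 = 64 000 (positive because the rear axle sits further from the center of gravity) and σ3 = 640 000.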
Applying optimal control theory, an optimal input steering angle δ is to be found that minimizes the following evaluation function:

$$J = \sum_{k=0}^{\infty} \left[ x^T(k)\,Q\,x(k) + \delta^T(k)\,R\,\delta(k) \right]$$

In the formula, k is a discrete sampling time point, and Q and R are the state and input weight matrices, respectively.
Discretizing the dynamic state equation yields the discrete state equation

$$x(k+1) = A x(k) + B\delta(k) + D c_R(k)$$

The path-tracking problem is then converted into an augmented linear-quadratic problem by merging the road curvature data ahead of the unmanned ground vehicle into the state vector of the system, giving the augmented state

$$X(k) = \begin{bmatrix} x(k) \\ C_R(k) \end{bmatrix}, \qquad C_R(k) = [c_R(k), c_R(k+1), \ldots, c_R(k+N)]^T$$

where $c_R(k)$ is the road curvature at sampling instant k and N is the preview horizon. The evaluation function becomes

$$J = \sum_{k=0}^{\infty} \left[ X^T(k)\,\bar{Q}\,X(k) + \delta^T(k)\,R\,\delta(k) \right], \qquad \delta_{\min} \le \delta \le \delta_{\max}$$

wherein the augmented system and weight matrices are

$$\bar{A} = \begin{bmatrix} A & D e_1^T \\ 0 & S \end{bmatrix}, \qquad \bar{B} = \begin{bmatrix} B \\ 0 \end{bmatrix}, \qquad \bar{Q} = \begin{bmatrix} Q & 0 \\ 0 & 0 \end{bmatrix}$$

with $e_1^T = [1, 0, \ldots, 0]$ selecting the current curvature $c_R(k)$ and S the shift matrix that advances the preview curvature vector by one sampling step.
according to the predictive control theory, an optimal control input steering angle can be obtained:
δ*(k)=-Kbx(k)-KfCR(k)
wherein: delta*(k) Indicating the optimum steering angle, Kb=(R+BTPB)-1BTPA;
Figure BDA0002911036970000088
Or: kf,i=(R+BTPB)-1BTξi-1PD;
The matrices P and ζ can be obtained from the following equations:
Figure BDA0002911036970000089
ζ=AT(I+PBR-1BT)-1
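A minimal numerical sketch of this gain computation (not part of the original disclosure): the Riccati equation is solved here by plain fixed-point iteration rather than a dedicated solver, and the discrete matrices A, B, D and the weights Q, R are assumed to be given:

```python
import numpy as np

def preview_gains(A, B, D, Q, R, N, iters=2000):
    """Compute feedback gain K_b and preview gains K_f for
    delta*(k) = -K_b x(k) - K_f C_R(k), with P from the discrete Riccati
    equation  P = A'PA - A'PB (R + B'PB)^-1 B'PA + Q  (fixed-point iteration)."""
    n = A.shape[0]
    P = np.eye(n)
    for _ in range(iters):
        S = R + B.T @ P @ B
        P = A.T @ P @ A - A.T @ P @ B @ np.linalg.solve(S, B.T @ P @ A) + Q
    S = R + B.T @ P @ B
    K_b = np.linalg.solve(S, B.T @ P @ A)
    zeta = A.T @ np.linalg.inv(np.eye(n) + P @ B @ np.linalg.solve(R, B.T))
    # K_f,i = (R + B'PB)^-1 B' zeta^(i-1) P D, for i = 1 .. N+1
    K_f = np.hstack([
        np.linalg.solve(S, B.T @ np.linalg.matrix_power(zeta, i) @ P @ D)
        for i in range(N + 1)
    ])
    return K_b, K_f
```

The steering command is then `delta = -K_b @ x - K_f @ C_R`, clipped to the interval [δmin, δmax] before being applied.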
Step 150, controlling the unmanned ground vehicle to travel according to the front wheel steering angle.
Through the above steps, an appropriate front-wheel steering angle can be calculated and used to control the unmanned ground vehicle, so that the vehicle adjusts itself automatically, quickly and accurately when facing a complex road environment and maintains stable travel, thereby improving the working efficiency and stability of deinsectization.
The following describes a control device of the intelligent deinsectization system according to an embodiment of the present application in detail with reference to the accompanying drawings.
Referring to fig. 3, an embodiment of the present application provides a control device of an intelligent deinsectization system. The intelligent deinsectization system is the same as that described above and is not described here again. The control device includes:
the acquisition module 101 is used for acquiring first image data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through a first camera and acquiring second image data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through a second camera;
the fusion module 102 is configured to fuse the first image data and the second image data to obtain fused image data when the first distance is greater than a first threshold and smaller than a second threshold;
the curvature determining module 103 is used for determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the fused image data;
a steering angle prediction module 104 for determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm based on the road curvature data and the first distance;
and the output module 105 is used for controlling the unmanned ground vehicle to travel according to the steering angle of the front wheels.
It is to be understood that the contents of the above method embodiments are all applicable to the present apparatus embodiment, the functions specifically implemented by the present apparatus embodiment are the same as the above method embodiments, and the advantageous effects achieved by the present apparatus embodiment are also the same as the advantageous effects achieved by the above method embodiments.
Referring to fig. 4, an embodiment of the present application further provides a computer device, including:
at least one processor 201;
at least one memory 202 for storing at least one program;
when the at least one program is executed by the at least one processor 201, the at least one processor 201 is caused to implement the control method of the intelligent deinsectization system described above.
Similarly, the contents in the foregoing method embodiments are all applicable to this computer apparatus embodiment, the functions specifically implemented by this computer apparatus embodiment are the same as those in the foregoing method embodiments, and the beneficial effects achieved by this computer apparatus embodiment are also the same as those achieved by the foregoing method embodiments.
The present application also provides a computer-readable storage medium in which a program executable by the processor 201 is stored; when the program is executed by the processor 201, the control method of the intelligent deinsectization system described above is performed.
Similarly, the contents in the above method embodiments are all applicable to the computer-readable storage medium embodiments, the functions specifically implemented by the computer-readable storage medium embodiments are the same as those in the above method embodiments, and the beneficial effects achieved by the computer-readable storage medium embodiments are also the same as those achieved by the above method embodiments.
In alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of the present application are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed and in which sub-operations described as part of larger operations are performed independently.
Furthermore, although the present application is described in the context of functional modules, it should be understood that, unless otherwise stated to the contrary, one or more of the functions and/or features may be integrated in a single physical device and/or software module, or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion regarding the actual implementation of each module is not necessary for an understanding of the present application. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be understood within the ordinary skill of an engineer, given the nature, function, and internal relationship of the modules. Accordingly, those skilled in the art can, using ordinary skill, practice the present application as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative of and not intended to limit the scope of the application, which is defined by the appended claims and their full scope of equivalents.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the foregoing description of the specification, reference to the description of "one embodiment/example," "another embodiment/example," or "certain embodiments/examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: numerous changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
While the present application has been described with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. The control method of the intelligent deinsectization system is characterized in that the intelligent deinsectization system comprises an unmanned ground vehicle, a first camera, a deinsectization device and an unmanned aerial vehicle, wherein the first camera and the deinsectization device are arranged on the unmanned ground vehicle, and the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle; the unmanned aerial vehicle is provided with a second camera;
the control method comprises the following steps:
acquiring first image data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the first camera, and acquiring second image data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the second camera;
when the first distance is larger than a first threshold and smaller than a second threshold, fusing the first image data and the second image data to obtain fused image data;
determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the fused image data;
determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm from the road curvature data and the first distance;
and controlling the unmanned ground vehicle to advance according to the steering angle of the front wheels.
2. The method of claim 1, wherein fusing the first image data and the second image data to obtain fused image data comprises:
grouping the first image data to obtain multiple groups of first image subdata, and determining a first variance of each group of the first image subdata;
grouping the second image data to obtain a plurality of groups of second image subdata, and determining a second variance of each group of the second image subdata;
determining a first optimal variance of the first image data according to the first variance, and determining a second optimal variance of the second image data according to the second variance;
determining a first weight corresponding to the first image data and a second weight corresponding to the second image data according to the first optimal variance and the second optimal variance;
and fusing the first image data and the second image data according to the first weight and the second weight to obtain fused image data.
3. The method of claim 2, wherein the first image data is grouped to obtain three sets of first image sub-data;
determining a first optimal variance of the first image data according to the first variance, which specifically comprises:
by the formula

$$\sigma_x^2 = \frac{\sigma_1^2\,\sigma_2^2\,\sigma_3^2}{\sigma_1^2\sigma_2^2 + \sigma_2^2\sigma_3^2 + \sigma_1^2\sigma_3^2}$$

determining a first optimal variance of the first image data;

in the formula, $\sigma_x^2$ is the first optimal variance of the first image data, $\sigma_1^2$ is the first variance corresponding to the first group of first image sub-data, $\sigma_2^2$ is the first variance corresponding to the second group of first image sub-data, and $\sigma_3^2$ is the first variance corresponding to the third group of first image sub-data.
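The optimal-variance fusion of claims 2 and 3 can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: it assumes the fused variance follows the inverse-variance rule $1/\sigma_x^2 = \sum_i 1/\sigma_i^2$ (equivalent to the formula above for three groups) and that the two images are weighted inversely to their optimal variances:

```python
import numpy as np

def optimal_variance(variances):
    """Fused (optimal) variance of independent estimates:
    1/sigma_x^2 = sum_i 1/sigma_i^2."""
    return 1.0 / sum(1.0 / v for v in variances)

def fuse(first_data, second_data, groups=3):
    """Inverse-variance weighted fusion of two same-shape image-data arrays."""
    var1 = optimal_variance([np.var(g) for g in np.array_split(first_data, groups)])
    var2 = optimal_variance([np.var(g) for g in np.array_split(second_data, groups)])
    w1 = var2 / (var1 + var2)   # lower variance -> higher weight
    w2 = var1 / (var1 + var2)
    return w1 * first_data + w2 * second_data
```

Since the two weights sum to one, fusing two identical inputs returns the input unchanged, and the more reliable (lower-variance) source dominates otherwise.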
4. The method of claim 1, further comprising the steps of:
determining road curvature data for a first distance from the unmanned ground vehicle in a direction of travel of the unmanned ground vehicle based on the first image data when the first distance is less than a first threshold.
5. The method of claim 1, further comprising the steps of:
and when the first distance is larger than a second threshold value, determining road curvature data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle according to the second image data.
6. The method of claim 1, wherein determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm from the road curvature data and the first distance comprises:
acquiring running information of the unmanned ground vehicle; the driving information comprises a vehicle speed of the unmanned ground vehicle, a mass of the unmanned ground vehicle, a yaw moment of inertia of the unmanned ground vehicle, and a turning stiffness of a front wheel of the unmanned ground vehicle;
establishing a dynamic state equation of the unmanned ground vehicle according to the running information;
and determining the front wheel steering angle of the unmanned ground vehicle according to the dynamic state equation and a predictive control algorithm.
7. The method according to any one of claims 1-6, further comprising the steps of:
performing Kalman filtering on the first image data and the second image data.
8. The control device of the intelligent deinsectization system is characterized in that the intelligent deinsectization system comprises an unmanned ground vehicle, a first camera, a deinsectization device and an unmanned aerial vehicle, wherein the first camera and the deinsectization device are arranged on the unmanned ground vehicle, and the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle; the unmanned aerial vehicle is provided with a second camera;
the control device includes:
the acquisition module is used for acquiring first image data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the first camera and acquiring second image data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle through the second camera;
the fusion module is used for fusing the first image data and the second image data to obtain fused image data when the first distance is greater than a first threshold and smaller than a second threshold;
a curvature determination module for determining road curvature data of a first distance from the unmanned ground vehicle in the direction of travel of the unmanned ground vehicle according to the fused image data;
a steering angle prediction module for determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm based on the road curvature data and the first distance;
and the output module is used for controlling the unmanned ground vehicle to advance according to the steering angle of the front wheels.
9. A computer device, comprising:
at least one processor;
at least one memory for storing at least one program;
wherein the at least one program, when executed by the at least one processor, causes the at least one processor to implement the method of any one of claims 1-7.
10. A computer-readable storage medium having stored therein instructions executable by a processor, wherein the processor-executable instructions, when executed by the processor, implement the method of any one of claims 1-7.
CN202110086758.1A 2021-01-22 2021-01-22 Control method, device, equipment and storage medium of intelligent deinsectization system Active CN112925310B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110086758.1A CN112925310B (en) 2021-01-22 2021-01-22 Control method, device, equipment and storage medium of intelligent deinsectization system

Publications (2)

Publication Number Publication Date
CN112925310A true CN112925310A (en) 2021-06-08
CN112925310B CN112925310B (en) 2023-08-08

Family

ID=76164594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110086758.1A Active CN112925310B (en) 2021-01-22 2021-01-22 Control method, device, equipment and storage medium of intelligent deinsectization system

Country Status (1)

Country Link
CN (1) CN112925310B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342024A (en) * 2021-06-24 2021-09-03 湘潭大学 Fixed-point cruise control method of four-rotor aircraft based on predictive control

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104186451A (en) * 2014-08-19 2014-12-10 西北农林科技大学 Insect-killing weeding pesticide spraying robot based on machine vision
CN104714547A (en) * 2013-12-12 2015-06-17 赫克斯冈技术中心 Autonomous gardening vehicle with camera
CN107807632A (en) * 2016-09-08 2018-03-16 福特全球技术公司 Condition of road surface is perceived from the sensing data of fusion
CN108765284A (en) * 2018-05-04 2018-11-06 哈尔滨理工大学 A kind of autonomous driving vehicle vehicle-mounted unmanned aerial vehicle image processing method and device
CN110398969A (en) * 2019-08-01 2019-11-01 北京主线科技有限公司 Automatic driving vehicle adaptive prediction time domain rotating direction control method and device
CN110798848A (en) * 2019-09-27 2020-02-14 国家电网有限公司 Wireless sensor data fusion method and device, readable storage medium and terminal
CN111665845A (en) * 2020-06-24 2020-09-15 北京百度网讯科技有限公司 Method, device, equipment and storage medium for planning path
CN111742344A (en) * 2019-06-28 2020-10-02 深圳市大疆创新科技有限公司 Image semantic segmentation method, movable platform and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Shengxiang et al.: "Service Robot Navigation Method Based on Multi-Sensor Information Fusion", Microcontrollers & Embedded Systems *


Also Published As

Publication number Publication date
CN112925310B (en) 2023-08-08

Similar Documents

Publication Publication Date Title
US11763400B2 (en) Updating execution of tasks of an agricultural prescription
EP3566556B1 (en) Method of planning a path for a vehicle having a work tool and a vehicle path planning system
US11134602B2 (en) System and method for controlling the speed of an agricultural implement
CN110708948A (en) System and method for irrigation management using machine learning workflows
US11406052B2 (en) Cartridges to employ an agricultural payload via an agricultural treatment delivery system
US11526997B2 (en) Targeting agricultural objects to apply units of treatment autonomously
US11812681B2 (en) Precision treatment of agricultural objects on a moving platform
US20220171411A1 (en) Systems and methods for traversing a three dimensional space
Vrochidou et al. Computer vision in self-steering tractors
CN112925310B (en) Control method, device, equipment and storage medium of intelligent deinsectization system
CN116839570B (en) Crop interline operation navigation method based on sensor fusion target detection
CN114265409A (en) Track information processing method and device and ground equipment
US20210185942A1 (en) Managing stages of growth of a crop with micro-precision via an agricultural treatment delivery system
AU2020405272A1 (en) Precision treatment of agricultural objects on a moving platform
Rovira-Más et al. Robotics for precision viticulture
Wang et al. Modelling and control methods in path tracking control for autonomous agricultural vehicles: A review of state of the art and challenges
CN109814551A (en) Cereal handles automated driving system, automatic Pilot method and automatic identifying method
US11653590B2 (en) Calibration of systems to deliver agricultural projectiles
CN116048115A (en) Control method for unmanned aerial vehicle, group cooperation system and processor
Grimstad et al. Thorvald II configuration for wheat phenotyping
US11672203B2 (en) Predictive map generation and control
US20230040430A1 (en) Detecting untraversable soil for farming machine
US20230011864A1 (en) Advanced movement through vegetation with an autonomous vehicle
US20200242923A1 (en) Method for generating a visual range collection and visual range collecting device
CN112766178A (en) Method, device, equipment and medium for positioning pests based on intelligent pest control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant