CN112925310B - Control method, device, equipment and storage medium of intelligent deinsectization system - Google Patents

Info

Publication number
CN112925310B
CN112925310B (application number CN202110086758.1A)
Authority
CN
China
Prior art keywords: ground vehicle, unmanned ground, image data, distance, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110086758.1A
Other languages
Chinese (zh)
Other versions
CN112925310A (en)
Inventor
李致富
杜佳荣
曾俊海
王明
吴晋宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou University
Original Assignee
Guangzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou University
Priority to CN202110086758.1A
Publication of CN112925310A
Application granted
Publication of CN112925310B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0089 Regulating or controlling systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof

Abstract

The application discloses a control method, device, equipment and storage medium for an intelligent deinsectization system. First image data at a first distance from an unmanned ground vehicle, in its traveling direction, is acquired through a first camera, and second image data at the same first distance is acquired through a second camera. When the first distance is greater than a first threshold and smaller than a second threshold, the first image data and the second image data are fused to obtain fused image data. Road curvature data at the first distance from the unmanned ground vehicle in its traveling direction is then determined from the fused image data; a front wheel steering angle of the unmanned ground vehicle is determined from the road curvature data and the first distance based on a predictive control algorithm; and the unmanned ground vehicle is controlled to travel according to that steering angle. The method can effectively improve the stability of the intelligent deinsectization system when the unmanned ground vehicle faces complex road sections. The application can be widely applied in the technical field of agricultural deinsectization.

Description

Control method, device, equipment and storage medium of intelligent deinsectization system
Technical Field
The application relates to the technical field of agricultural deinsectization, in particular to a control method, a device, equipment and a storage medium of an intelligent deinsectization system.
Background
In recent years, the rapid development of high-resolution remote sensing, machine vision and control technology has enabled efficient, high-precision and low-cost crop health monitoring. For example, technical schemes in which an unmanned vehicle automatically sprays pesticides have appeared in the related art; these greatly reduce the workload of farmers and can increase crop yield to a certain extent.
However, in practical applications it has been found that, due to the limitations of the working environment, controlling the travel of the unmanned vehicle is a major challenge. In some applications, the operation of the unmanned vehicle is controlled by measuring and planning the path through the crop area in advance and programming that path into corresponding control information; such a method has a narrow application range, requires a large amount of labor, and yields low benefit. In view of the above, the technical problems in the related art need to be solved.
Disclosure of Invention
The present application aims to solve at least one of the technical problems existing in the related art to a certain extent.
Therefore, an object of the embodiments of the present application is to provide a control method of an intelligent deinsectization system, which can effectively improve the stability and the operation efficiency when deinsectization is performed by an unmanned vehicle.
Another object of the embodiment of the application is to provide a control device of an intelligent deinsectization system.
In order to achieve the technical purpose, the technical scheme adopted by the embodiment of the application comprises the following steps:
in a first aspect, an embodiment of the present application provides a control method of an intelligent deinsectization system, where the intelligent deinsectization system includes an unmanned ground vehicle, a first camera, a deinsectization device and an unmanned aerial vehicle; the first camera and the deinsectization device are disposed on the unmanned ground vehicle, the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle, and a second camera is mounted on the unmanned aerial vehicle;
the control method comprises the following steps:
acquiring first image data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the first camera, and acquiring second image data of the first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the second camera;
when the first distance is larger than a first threshold value and smaller than a second threshold value, fusing the first image data and the second image data to obtain fused image data;
determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the fused image data;
determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
and controlling the unmanned ground vehicle to travel according to the front wheel steering angle.
In addition, the method according to the above embodiment of the present application may further have the following additional technical features:
further, in an embodiment of the present application, the fusing the first image data and the second image data to obtain fused image data includes:
grouping the first image data to obtain a plurality of groups of first image sub-data, and determining a first variance of each group of first image sub-data;
grouping the second image data to obtain a plurality of groups of second image sub-data, and determining a second variance of each group of second image sub-data;
determining a first optimal variance of the first image data according to the first variance, and determining a second optimal variance of the second image data according to the second variance;
determining a first weight corresponding to the first image data and a second weight corresponding to the second image data according to the first optimal variance and the second optimal variance;
and fusing the first image data and the second image data according to the first weight and the second weight to obtain the fused image data.
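The grouping, variance and weighting steps above can be sketched as follows. This is a minimal sketch under stated assumptions: three equal-size groups per source, inverse-variance combination of the group variances, and a weight proportional to the other source's variance; the function names and sample numbers are illustrative, not from the application.

```python
def optimal_variance(samples, groups=3):
    """Split the samples into equal groups, compute each group's variance
    about its own mean, and combine them into one 'optimal' variance by
    inverse-variance weighting."""
    n = len(samples) // groups
    inv_sum = 0.0
    for i in range(groups):
        chunk = samples[i * n:(i + 1) * n]
        mean = sum(chunk) / len(chunk)
        var = sum((v - mean) ** 2 for v in chunk) / len(chunk)
        inv_sum += 1.0 / max(var, 1e-12)   # guard against a zero variance
    return 1.0 / inv_sum

def fuse(first, second, var_first, var_second):
    """Element-wise weighted fusion: the lower-variance (more reliable)
    source receives the larger weight, W1 = var2 / (var1 + var2)."""
    w1 = var_second / (var_first + var_second)
    w2 = 1.0 - w1
    return [w1 * a + w2 * b for a, b in zip(first, second)]

var_x = optimal_variance([1.0, 2.0, 1.5, 2.5, 1.2, 2.2])  # ground camera
var_z = optimal_variance([1.1, 2.1, 1.4, 2.6, 1.0, 2.4])  # aerial camera
fused = fuse([10.0, 12.0], [11.0, 13.0], var_x, var_z)
```

The fused values land between the two sources, closer to whichever source has the smaller optimal variance.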
Further, in one embodiment of the present application, the first image data is grouped to obtain three groups of first image sub-data;
determining a first optimal variance of the first image data according to the first variance, specifically by the formula

$$\sigma_x^2 = \left( \frac{1}{\sigma_{x1}^2} + \frac{1}{\sigma_{x2}^2} + \frac{1}{\sigma_{x3}^2} \right)^{-1}$$

where $\sigma_x^2$ is the first optimal variance of the first image data, and $\sigma_{x1}^2$, $\sigma_{x2}^2$ and $\sigma_{x3}^2$ are the first variances corresponding to the first, second and third groups of first image sub-data, respectively.
Further, in one embodiment of the present application, the method further comprises the steps of:
and when the first distance is smaller than a first threshold value, determining road curvature data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the first image data.
Further, in one embodiment of the present application, the method further comprises the steps of:
and when the first distance is greater than a second threshold value, determining road curvature data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the second image data.
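The distance-based source selection in these steps, together with the fusion case in between, can be sketched as a simple rule. The function name and return labels are illustrative; the default thresholds follow the 2 m and 5 m example values used in the detailed description.

```python
def select_curvature_source(first_distance, t1=2.0, t2=5.0):
    """Pick the data source for road-curvature estimation by distance.

    t1, t2: the first and second distance thresholds in meters
    (2 m and 5 m in the example given in the description).
    """
    if first_distance < t1:
        return "first_image"    # close range: ground camera is more reliable
    if first_distance > t2:
        return "second_image"   # long range: aerial camera is more reliable
    return "fused_image"        # in between: fuse the two sources
```

At 1 m ahead this selects the ground camera's data, at 8 m the aerial camera's, and at 3.5 m the fused data.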
Further, in one embodiment of the present application, the determining the front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance includes:
acquiring the driving information of the unmanned ground vehicle; the driving information comprises the speed of the unmanned ground vehicle, the mass of the unmanned ground vehicle, the yaw moment of inertia of the unmanned ground vehicle and the turning rigidity of the front wheels of the unmanned ground vehicle;
establishing a dynamic state equation of the unmanned ground vehicle according to the driving information;
and determining the front wheel steering angle of the unmanned ground vehicle according to the dynamic state equation and a predictive control algorithm.
Further, in one embodiment of the present application, the method further comprises the steps of:
and performing Kalman filtering on the first image data and the second image data.
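The Kalman filtering step can be sketched with a minimal one-dimensional filter for a nearly constant quantity; the noise variances and sample values below are illustrative assumptions, not values from the application.

```python
def kalman_smooth(measurements, q=1e-3, r=0.5):
    """Minimal 1-D Kalman filter for a (nearly) constant quantity.

    q: process-noise variance, r: measurement-noise variance (both
    illustrative assumptions). Returns the filtered estimates.
    """
    x = measurements[0]   # initialize at the first reading
    p = 1.0               # initial estimate variance
    estimates = []
    for z in measurements:
        p = p + q                 # predict: uncertainty grows slightly
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update: blend prediction and reading
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings around 10.0 are pulled toward a smooth estimate.
smoothed = kalman_smooth([10.4, 9.7, 10.2, 9.9, 10.1, 10.0])
```

With these settings the later estimates settle close to 10.0, damping the measurement noise.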
In a second aspect, an embodiment of the present application provides a control device of an intelligent deinsectization system, where the intelligent deinsectization system includes an unmanned ground vehicle, a first camera, a deinsectization device and an unmanned aerial vehicle; the first camera and the deinsectization device are disposed on the unmanned ground vehicle, the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle, and a second camera is mounted on the unmanned aerial vehicle;
the control device includes:
the acquisition module is used for acquiring first image data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the first camera, and acquiring second image data of the first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the second camera;
the fusion module is used for fusing the first image data and the second image data to obtain fused image data when the first distance is larger than a first threshold value and smaller than a second threshold value;
the curvature determining module is used for determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the fused image data;
the steering angle prediction module is used for determining the front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
and the output module is used for controlling the unmanned ground vehicle to travel according to the front wheel steering angle.
In a third aspect, embodiments of the present application further provide a computer device, including:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the method of controlling a smart insect control system of the first aspect described above.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium having stored therein a processor executable program for implementing the control method of the intelligent pest killing system of the first aspect, when the processor executable program is executed by a processor.
The advantages and benefits of the present application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the present application.
According to the control method of the intelligent deinsectization system in the embodiments of the present application, first image data at a first distance from the unmanned ground vehicle in its traveling direction is acquired through the first camera, and second image data at the same first distance is acquired through the second camera; when the first distance is greater than a first threshold and smaller than a second threshold, the first image data and the second image data are fused to obtain fused image data; road curvature data at the first distance from the unmanned ground vehicle in its traveling direction is determined according to the fused image data; a front wheel steering angle of the unmanned ground vehicle is determined based on a predictive control algorithm according to the road curvature data and the first distance; and the unmanned ground vehicle is controlled to travel according to the front wheel steering angle. The method can effectively improve the stability of the unmanned ground vehicle in the intelligent deinsectization system when facing complex road sections, reduce the influence of complex landforms on the deinsectization process, and improve the spraying efficiency of pesticides, which helps improve crop yield.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the following description refers to the accompanying drawings. It should be understood that the drawings below only describe some embodiments of the technical solutions of the present application for convenience and clarity, and that those skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an intelligent deinsectization system provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of a control method of the intelligent deinsectization system provided in the embodiment of the present application;
fig. 3 is a schematic structural diagram of a control device of an intelligent deinsectization system according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application. The step numbers in the following embodiments are set for convenience of illustration only, and the order between the steps is not limited in any way, and the execution order of the steps in the embodiments may be adaptively adjusted according to the understanding of those skilled in the art.
Precision agriculture emerged in the late 1980s and is an important component of the modern agricultural revolution; in the future it will serve as a breakthrough for promoting the modernization of rural agriculture. It is estimated that 85% of future grain yield increases will come from crop yield optimization and improved agricultural practices. Specifically, precision agriculture builds on modern means such as 3S technology (remote sensing (RS), geographic information systems (GIS) and the global positioning system (GPS)), sensor technology and Internet of Things technology, and aims to achieve precise control over the cultivation process: accurately monitoring crop growth, disaster conditions and other aspects, adjusting cultivation inputs according to that monitoring, and realizing precise cultivation, irrigation, fertilization, pesticide application, seeding and harvesting, that is, achieving an equal or higher harvest with minimum input. Among these, health monitoring of crops is one of the foundations of precision agriculture and has gained great attention from experts and farmers.
Based on these application requirements, an intelligent deinsectization system is provided in the embodiments of the present application. Referring to fig. 1, the intelligent deinsectization system includes an unmanned ground vehicle 3, a first camera, a deinsectization device and an unmanned aerial vehicle 2. The first camera and the deinsectization device are arranged on the unmanned ground vehicle 3, and the first camera is used for shooting the road conditions in front of the unmanned ground vehicle 3, so that a background control system can guide the traveling of the unmanned ground vehicle 3 according to those road conditions. The deinsectization device is used for spraying pesticide on the crops 1, and in some embodiments may include a pesticide storage device and spray heads. The unmanned aerial vehicle 2 tracks and flies above the unmanned ground vehicle 3. On the one hand, the unmanned aerial vehicle 2 can also collect road information ahead of the unmanned ground vehicle 3 and feed it back to the background control system of the unmanned ground vehicle 3; on the other hand, the unmanned aerial vehicle 2 can collect the pest distribution of the crops 1 near the unmanned ground vehicle 3, so that the spray volume of pesticide can be controlled conveniently, achieving more accurate and economical deinsectization. Specifically, the unmanned aerial vehicle 2 in the embodiments of the present application may be a rotary-wing unmanned aerial vehicle, which can take off and land vertically and hover in the air, and is suitable for working in complex environments; it may carry a visible light digital camera and a multispectral digital camera, both denoted as the second camera.
The intelligent deinsectization system in the embodiment of the application further comprises a background overall control system which is used for receiving various information and controlling the unmanned aerial vehicle 2, the unmanned ground vehicle 3 and other parts to work.
Referring to fig. 2, an embodiment of the present application provides a control method of an intelligent deinsectization system; the control method is mainly used for controlling the travel of the unmanned ground vehicle and includes the following steps:
Step 110, acquiring first image data at a first distance from the unmanned ground vehicle in its traveling direction through a first camera, and acquiring second image data at the same first distance through a second camera;
In this embodiment, as described above, a first camera is provided on the unmanned ground vehicle and may be used to capture road conditions in front of it; the corresponding image data is recorded as first image data. A second camera is provided on the unmanned aerial vehicle and may be used to collect image data in front of the unmanned ground vehicle; this image data is recorded as second image data. By adjusting the pitch angles of the first camera and the second camera, the two cameras can acquire image data at the same distance from the unmanned ground vehicle; for example, both may acquire road image data 10 m ahead of the unmanned ground vehicle, or 8 m ahead of it, and so on. The distance between the imaged road and the unmanned ground vehicle is recorded as the first distance. It can be understood that in the embodiments of the present application the first distance may be adjusted flexibly: for example, when the unmanned ground vehicle travels faster, the first distance may be suitably longer, so that the road conditions ahead are collected earlier and the vehicle can be controlled in time; conversely, when the unmanned ground vehicle travels slowly, the first distance may be suitably shorter. In some embodiments, the first image data and the second image data may further be subjected to Kalman filtering, so as to improve the accuracy of the data and reduce the interference of noise.
Step 120, when the first distance is greater than the first threshold and less than the second threshold, fusing the first image data and the second image data to obtain fused image data;
In the embodiments of the present application, since the first camera on the unmanned ground vehicle and the second camera on the unmanned aerial vehicle shoot from different heights, their credibility differs when collecting image data of roads at different distances. Specifically, when the imaged road is close to the unmanned ground vehicle, the first image data obtained by the first camera on the unmanned ground vehicle is more reliable; when the imaged road is far from the unmanned ground vehicle, the second image data obtained by the second camera on the unmanned aerial vehicle is more reliable. Thus, two distance thresholds may be set, denoted as a first threshold and a second threshold, the second threshold being larger than the first; for example, the first threshold may be set to 2 m and the second threshold to 5 m. When the first distance between the imaged road and the unmanned ground vehicle is smaller than the first threshold, the first image data is used to determine the curvature data of the road ahead, i.e. the curvature of the road at the first distance from the unmanned ground vehicle in its traveling direction; when the first distance is greater than the second threshold, the second image data is used instead. This improves the accuracy of road curvature identification, so that subsequent control of the unmanned ground vehicle in advancing, turning and so on can be achieved accurately and conveniently.
And when the first distance is greater than the first threshold and smaller than the second threshold, the first image data and the second image data can be fused to obtain fused image data, and the curvature data of the road ahead is determined from the fused image data, improving the accuracy of curvature identification. Because the fused image data comes from image data acquired by different acquisition devices, combining and complementing the two sources yields image data with higher precision, featuring strong timeliness, small influence from atmospheric radiation, high spatial resolution and rich data content.
In the embodiments of the present application, when the first image data and the second image data are fused to obtain fused image data, the first image data is first grouped to obtain a plurality of groups of first image sub-data, and a first variance of each group is determined; the second image data is likewise grouped to obtain a plurality of groups of second image sub-data, and a second variance of each group is determined. Taking the case where the first image data is divided into three groups as an example: the mean of each group of first image sub-data is taken, and the variance of each group is determined from that mean, giving $\sigma_{x1}^2$, $\sigma_{x2}^2$ and $\sigma_{x3}^2$ for the first, second and third groups respectively; the first optimal variance of the first image data is then

$$\sigma_x^2 = \left( \frac{1}{\sigma_{x1}^2} + \frac{1}{\sigma_{x2}^2} + \frac{1}{\sigma_{x3}^2} \right)^{-1}.$$

Similarly, the second image data may be divided into three groups and its second optimal variance $\sigma_z^2$ determined in the same manner. The first weight corresponding to the first image data can then be determined as

$$W_1 = \frac{\sigma_z^2}{\sigma_x^2 + \sigma_z^2}$$

where $W_1$ is the first weight, $\sigma_z$ is the standard deviation corresponding to the second optimal variance, and $\sigma_x$ is the standard deviation corresponding to the first optimal variance. Subtracting $W_1$ from 1 gives the second weight. The first image data and the second image data are then weighted and summed according to the first weight and the second weight, obtaining the fused image data.
Step 130, determining road curvature data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle according to the fused image data;
step 140, determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
according to the embodiment of the application, the road curvature data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle can be determined according to the fused image data. And based on the road curvature data, the front wheel steering angle of the unmanned ground vehicle can be determined by a predictive control algorithm.
Specifically, after road curvature data ahead of the unmanned ground vehicle is obtained, a dynamic state equation of the vehicle can be established from the real-time driving information and structural information of the unmanned ground vehicle. In the standard lateral-error (bicycle-model) form this reads

$$\dot{x} = A x + B \delta + D \dot{\psi}_{des}$$

with state vector $x = [e_1, \dot{e}_1, e_2, \dot{e}_2]^T$ comprising the controlled distance error, the controlled speed error, the controlled angle error and the controlled angular rate error, and

$$A = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & -\dfrac{\sigma_1}{m v_x} & \dfrac{\sigma_1}{m} & \dfrac{\sigma_2}{m v_x} \\ 0 & 0 & 0 & 1 \\ 0 & \dfrac{\sigma_2}{I_z v_x} & -\dfrac{\sigma_2}{I_z} & -\dfrac{2(l_f^2 C_{\alpha f} + l_r^2 C_{\alpha r})}{I_z v_x} \end{bmatrix}, \quad B = \begin{bmatrix} 0 \\ \dfrac{2 C_{\alpha f}}{m} \\ 0 \\ \dfrac{2 l_f C_{\alpha f}}{I_z} \end{bmatrix}, \quad D = \begin{bmatrix} 0 \\ \dfrac{\sigma_2}{m v_x} - v_x \\ 0 \\ -\dfrac{2(l_f^2 C_{\alpha f} + l_r^2 C_{\alpha r})}{I_z v_x} \end{bmatrix}$$

where $m$ is the mass of the unmanned ground vehicle in kg; $v_x$ is its speed in m/s; $I_z$ is its yaw moment of inertia in kg·m²; $C_{\alpha f}$ is the cornering stiffness of its front wheels in N/rad; $C_{\alpha r}$ is the cornering stiffness of its rear wheels in N/rad; $l_f$ is the distance from the center of gravity to the front axle in m; $l_r$ is the distance from the center of gravity to the rear axle in m; and $\sigma_1 = 2(C_{\alpha f} + C_{\alpha r})$, $\sigma_2 = -2(l_f C_{\alpha f} - l_r C_{\alpha r})$.
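As a sketch, the state matrices of this bicycle-model form can be assembled numerically. The function name and all parameter values below are illustrative assumptions, not values from the application.

```python
import numpy as np

def lateral_error_dynamics(m, vx, Iz, Caf, Car, lf, lr):
    """Build the continuous-time lateral-error (bicycle-model) matrices.

    State x = [e1, e1_dot, e2, e2_dot]: lateral distance error and its
    rate, heading angle error and its rate. sigma1 and sigma2 follow the
    shorthand used in the description:
      sigma1 = 2*(Caf + Car),  sigma2 = -2*(lf*Caf - lr*Car).
    """
    s1 = 2.0 * (Caf + Car)
    s2 = -2.0 * (lf * Caf - lr * Car)
    s3 = 2.0 * (lf**2 * Caf + lr**2 * Car)
    A = np.array([[0.0, 1.0, 0.0, 0.0],
                  [0.0, -s1 / (m * vx), s1 / m, s2 / (m * vx)],
                  [0.0, 0.0, 0.0, 1.0],
                  [0.0, s2 / (Iz * vx), -s2 / Iz, -s3 / (Iz * vx)]])
    B = np.array([[0.0], [2.0 * Caf / m], [0.0], [2.0 * lf * Caf / Iz]])
    # Disturbance channel driven by the desired yaw rate (vx * road curvature).
    D = np.array([[0.0], [s2 / (m * vx) - vx], [0.0], [-s3 / (Iz * vx)]])
    return A, B, D

# Illustrative parameters: a 1500 kg vehicle travelling at 5 m/s.
A, B, D = lateral_error_dynamics(m=1500.0, vx=5.0, Iz=2500.0,
                                 Caf=55000.0, Car=55000.0, lf=1.2, lr=1.4)
```

The two unit entries in A encode that $e_1$ and $e_2$ integrate their own rates; the steering input enters only the two rate equations through B.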
Applying optimal control theory, an optimal input steering angle $\delta$ must be found; the evaluation function is

$$J = \sum_{k} \left[ x(k)^T Q\, x(k) + \delta(k)^T R\, \delta(k) \right]$$

where $k$ is the discrete sampling time point, and $Q$ and $R$ are weight matrices.
Discretizing the dynamic state equation yields a discrete state equation; the path tracking problem is converted into an augmented linear quadratic problem, and the curvature data of the road ahead of the unmanned ground vehicle is incorporated into the state vector of the system through the augmented curvature vector

$$C_R(k) = [c_R(k), c_R(k+1), \ldots, c_R(k+N)]^T.$$

The evaluation function is then minimized subject to the steering constraint

$$\delta_{min} \le \delta \le \delta_{max}.$$
According to predictive control theory, the optimal control input steering angle is obtained as

$$\delta^*(k) = -K_b\, x(k) - K_f\, C_R(k)$$

where $\delta^*(k)$ denotes the optimal steering angle, the feedback gain is

$$K_b = (R + B^T P B)^{-1} B^T P A$$

and the $i$-th component of the preview (feedforward) gain is

$$K_{f,i} = (R + B^T P B)^{-1} B^T \xi^{\,i-1} P D.$$

The matrices $P$ and $\xi$ can be obtained from the discrete algebraic Riccati equation

$$P = A^T P A - A^T P B (R + B^T P B)^{-1} B^T P A + Q$$

and

$$\xi = A^T (I + P B R^{-1} B^T)^{-1}.$$
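As a sketch of the numerical side, P can be obtained by iterating the discrete Riccati recursion and the feedback and preview gains formed from it. The Euler discretization, parameter values and weight matrices below are illustrative assumptions, not values from the application.

```python
import numpy as np

def preview_lqr_gains(A, B, D, Q, R, dt=0.01, n_preview=10, iters=5000):
    """Euler-discretize the dynamics, iterate the discrete Riccati
    recursion toward convergence, then form the feedback gain K_b and
    the preview gains K_f,i as described in the text."""
    n = A.shape[0]
    Ad = np.eye(n) + dt * A          # Euler discretization
    Bd = dt * B
    Dd = dt * D
    P = Q.copy()
    for _ in range(iters):           # Riccati fixed-point iteration
        S = R + Bd.T @ P @ Bd
        Pn = Q + Ad.T @ P @ Ad - Ad.T @ P @ Bd @ np.linalg.solve(S, Bd.T @ P @ Ad)
        if np.max(np.abs(Pn - P)) < 1e-9 * max(1.0, np.max(np.abs(Pn))):
            P = Pn
            break
        P = Pn
    S = R + Bd.T @ P @ Bd
    Kb = np.linalg.solve(S, Bd.T @ P @ Ad)                 # feedback gain
    # xi = A^T (I + P B R^-1 B^T)^-1, here in its discretized form
    xi = Ad.T @ np.linalg.inv(np.eye(n) + P @ Bd @ np.linalg.solve(R, Bd.T))
    Kf = [np.linalg.solve(S, Bd.T @ np.linalg.matrix_power(xi, i) @ P @ Dd)
          for i in range(n_preview)]                       # preview gains
    return Kb, Kf, P

# Illustrative bicycle-model parameters (not from the application).
m, vx, Iz = 1500.0, 5.0, 2500.0
Caf, Car, lf, lr = 55000.0, 55000.0, 1.2, 1.4
s1 = 2.0 * (Caf + Car)
s2 = -2.0 * (lf * Caf - lr * Car)
s3 = 2.0 * (lf**2 * Caf + lr**2 * Car)
A = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, -s1 / (m * vx), s1 / m, s2 / (m * vx)],
              [0.0, 0.0, 0.0, 1.0],
              [0.0, s2 / (Iz * vx), -s2 / Iz, -s3 / (Iz * vx)]])
B = np.array([[0.0], [2.0 * Caf / m], [0.0], [2.0 * lf * Caf / Iz]])
D = np.array([[0.0], [s2 / (m * vx) - vx], [0.0], [-s3 / (Iz * vx)]])
Q = np.diag([1.0, 0.1, 1.0, 0.1])
R = np.array([[1.0]])
Kb, Kf, P = preview_lqr_gains(A, B, D, Q, R)
```

By LQR theory the converged feedback gain stabilizes the discretized model, which can be checked from the spectral radius of the closed-loop matrix Ad - Bd Kb.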
Step 150, controlling the unmanned ground vehicle to travel according to the front wheel steering angle.
Through the above steps, a suitable front-wheel steering angle can be calculated and the unmanned ground vehicle controlled accordingly, so that it can adjust automatically, quickly and accurately in a complex road environment, ensuring stable running and improving the efficiency and stability of the deinsectization work.
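Before the computed angle is applied, the constraint δ_min ≤ δ ≤ δ_max from the evaluation step can be enforced by a simple saturation; the ±0.52 rad (about 30°) defaults here are illustrative assumptions, not values from the patent:

```python
def clamp_steering(delta, delta_min=-0.52, delta_max=0.52):
    """Saturate the front-wheel steering angle (radians) to the
    admissible range delta_min <= delta <= delta_max."""
    return max(delta_min, min(delta_max, delta))
```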
The following describes in detail a control device of an intelligent deinsectization system according to an embodiment of the present application with reference to the accompanying drawings.
Referring to fig. 3, in an embodiment of the present application, the intelligent deinsectization system to which the control device applies is the same as that described above and is not described again herein; the control device includes:
the acquisition module 101 is configured to acquire first image data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through a first camera, and acquire second image data of the first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through a second camera;
the fusion module 102 is configured to fuse the first image data and the second image data to obtain fused image data when the first distance is greater than the first threshold and less than the second threshold;
a curvature determining module 103, configured to determine road curvature data of a first distance from the unmanned ground vehicle in a traveling direction of the unmanned ground vehicle according to the fused image data;
the steering angle prediction module 104 is configured to determine a front steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
an output module 105 for controlling the unmanned ground vehicle to travel according to the front wheel steering angle.
It can be understood that the content in the above method embodiment is applicable to the embodiment of the present device, and the specific functions implemented by the embodiment of the present device are the same as those of the embodiment of the above method, and the achieved beneficial effects are the same as those of the embodiment of the above method.
Referring to fig. 4, an embodiment of the present application further provides a computer device, including:
at least one processor 201;
at least one memory 202 for storing at least one program;
the at least one program, when executed by the at least one processor 201, causes the at least one processor 201 to implement the control method of the intelligent deinsectization system described above.
Similarly, the content in the above method embodiment is applicable to the embodiment of the present computer device, and the functions specifically implemented by the embodiment of the present computer device are the same as those of the embodiment of the above method, and the achieved beneficial effects are the same as those achieved by the embodiment of the above method.
The present embodiment also provides a computer-readable storage medium in which a program executable by the processor 201 is stored; the program, when executed by the processor 201, is configured to perform the above-described control method of the intelligent deinsectization system.
Similarly, the content in the above method embodiment is applicable to the present computer-readable storage medium embodiment; the functions specifically implemented by this embodiment are the same as those of the above method embodiment, and the beneficial effects achieved are likewise the same.
In some alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of this application are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed, and in which sub-operations described as part of a larger operation are performed independently.
Furthermore, while the present application is described in the context of functional modules, it should be appreciated that, unless otherwise indicated, one or more of the functions and/or features may be integrated in a single physical device and/or software module or one or more of the functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary to an understanding of the present application. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be apparent to those skilled in the art from consideration of their attributes, functions and internal relationships. Thus, those of ordinary skill in the art will be able to implement the present application as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative and are not intended to be limiting upon the scope of the application, which is to be defined by the appended claims and their full scope of equivalents.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied, essentially or in the part contributing to the prior art or in part, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program code.
Logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber device, and a portable Compact Disc Read-Only Memory (CD-ROM). Additionally, the computer-readable medium may even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, the steps may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, Programmable Gate Arrays (PGAs), Field-Programmable Gate Arrays (FPGAs), and the like.
In the foregoing description of the present specification, descriptions of the terms "one embodiment/example", "another embodiment/example", "certain embodiments/examples", and the like, are intended to mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
While the preferred embodiments of the present application have been described in detail, the present application is not limited to the embodiments, and various equivalent modifications and substitutions can be made by those skilled in the art without departing from the spirit of the present application, and these equivalent modifications and substitutions are intended to be included in the scope of the present application as defined in the appended claims.

Claims (9)

1. The control method of the intelligent deinsectization system is characterized in that the intelligent deinsectization system comprises an unmanned ground vehicle, a first camera, an deinsectization device and an unmanned aerial vehicle, wherein the first camera and the deinsectization device are arranged on the unmanned ground vehicle, and the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle; a second camera is mounted on the unmanned aerial vehicle;
the control method comprises the following steps:
acquiring first image data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the first camera, and acquiring second image data of the first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the second camera;
when the first distance is larger than a first threshold value and smaller than a second threshold value, fusing the first image data and the second image data to obtain fused image data;
determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the fused image data;
determining a front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
controlling the unmanned ground vehicle to travel according to the front wheel steering angle;
the determining the front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance comprises:
acquiring the driving information of the unmanned ground vehicle; the driving information comprises the speed of the unmanned ground vehicle, the mass of the unmanned ground vehicle, the yaw moment of inertia of the unmanned ground vehicle and the cornering stiffness of the front wheels of the unmanned ground vehicle;
establishing a dynamic state equation of the unmanned ground vehicle according to the driving information;
and determining the front wheel steering angle of the unmanned ground vehicle according to the dynamic state equation and a predictive control algorithm.
2. The method of claim 1, wherein fusing the first image data and the second image data to obtain fused image data comprises:
grouping the first image data to obtain a plurality of groups of first image sub-data, and determining a first variance of each group of first image sub-data;
grouping the second image data to obtain a plurality of groups of second image sub-data, and determining a second variance of each group of second image sub-data;
determining a first optimal variance of the first image data according to the first variance, and determining a second optimal variance of the second image data according to the second variance;
determining a first weight corresponding to the first image data and a second weight corresponding to the second image data according to the first optimal variance and the second optimal variance;
and fusing the first image data and the second image data according to the first weight and the second weight to obtain the fused image data.
3. The method of claim 2, wherein the first image data is grouped to obtain three sets of first image sub-data;
determining a first optimal variance of the first image data according to the first variance, which is specifically:
determining the first optimal variance of the first image data by a formula whose inputs are the first variance corresponding to the first group of first image sub-data, the first variance corresponding to the second group of first image sub-data, and the first variance corresponding to the third group of first image sub-data.
4. The method of claim 1, further comprising the step of:
and when the first distance is smaller than a first threshold value, determining road curvature data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the first image data.
5. The method of claim 1, further comprising the step of:
and when the first distance is greater than a second threshold value, determining road curvature data of the first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the second image data.
6. The method according to any one of claims 1-5, further comprising the step of:
and performing Kalman filtering on the first image data and the second image data.
7. The control device of the intelligent deinsectization system is characterized by comprising an unmanned ground vehicle, a first camera, an deinsectization device and an unmanned aerial vehicle, wherein the first camera and the deinsectization device are arranged on the unmanned ground vehicle, and the unmanned aerial vehicle tracks and flies above the unmanned ground vehicle; a second camera is mounted on the unmanned aerial vehicle;
the control device includes:
the acquisition module is used for acquiring first image data of a first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the first camera, and acquiring second image data of the first distance from the unmanned ground vehicle in the traveling direction of the unmanned ground vehicle through the second camera;
the fusion module is used for fusing the first image data and the second image data to obtain fused image data when the first distance is larger than a first threshold value and smaller than a second threshold value;
the curvature determining module is used for determining road curvature data of a first distance from the unmanned ground vehicle in the advancing direction of the unmanned ground vehicle according to the fused image data;
the steering angle prediction module is used for determining the front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance;
the output module is used for controlling the unmanned ground vehicle to travel according to the front wheel steering angle;
the determining the front wheel steering angle of the unmanned ground vehicle based on a predictive control algorithm according to the road curvature data and the first distance comprises:
acquiring the driving information of the unmanned ground vehicle; the driving information comprises the speed of the unmanned ground vehicle, the mass of the unmanned ground vehicle, the yaw moment of inertia of the unmanned ground vehicle and the cornering stiffness of the front wheels of the unmanned ground vehicle;
establishing a dynamic state equation of the unmanned ground vehicle according to the driving information;
and determining the front wheel steering angle of the unmanned ground vehicle according to the dynamic state equation and a predictive control algorithm.
8. A computer device, comprising:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the method of any of claims 1-6.
9. A computer readable storage medium having stored therein instructions executable by a processor, characterized by: the processor-executable instructions, when executed by a processor, are for implementing the method of any of claims 1-6.
CN202110086758.1A 2021-01-22 2021-01-22 Control method, device, equipment and storage medium of intelligent deinsectization system Active CN112925310B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110086758.1A CN112925310B (en) 2021-01-22 2021-01-22 Control method, device, equipment and storage medium of intelligent deinsectization system

Publications (2)

Publication Number Publication Date
CN112925310A CN112925310A (en) 2021-06-08
CN112925310B true CN112925310B (en) 2023-08-08

Family

ID=76164594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110086758.1A Active CN112925310B (en) 2021-01-22 2021-01-22 Control method, device, equipment and storage medium of intelligent deinsectization system

Country Status (1)

Country Link
CN (1) CN112925310B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342024A (en) * 2021-06-24 2021-09-03 湘潭大学 Fixed-point cruise control method of four-rotor aircraft based on predictive control

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104186451A (en) * 2014-08-19 2014-12-10 西北农林科技大学 Insect-killing weeding pesticide spraying robot based on machine vision
CN104714547A (en) * 2013-12-12 2015-06-17 赫克斯冈技术中心 Autonomous gardening vehicle with camera
CN107807632A (en) * 2016-09-08 2018-03-16 福特全球技术公司 Condition of road surface is perceived from the sensing data of fusion
CN108765284A (en) * 2018-05-04 2018-11-06 哈尔滨理工大学 A kind of autonomous driving vehicle vehicle-mounted unmanned aerial vehicle image processing method and device
CN110398969A (en) * 2019-08-01 2019-11-01 北京主线科技有限公司 Automatic driving vehicle adaptive prediction time domain rotating direction control method and device
CN110798848A (en) * 2019-09-27 2020-02-14 国家电网有限公司 Wireless sensor data fusion method and device, readable storage medium and terminal
CN111665845A (en) * 2020-06-24 2020-09-15 北京百度网讯科技有限公司 Method, device, equipment and storage medium for planning path
CN111742344A (en) * 2019-06-28 2020-10-02 深圳市大疆创新科技有限公司 Image semantic segmentation method, movable platform and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Navigation method for service robots based on multi-sensor information fusion; Zhang Shengxiang et al.; Microcontroller & Embedded System Applications; 2018-03-01 (No. 3); pp. 4-9 *

Also Published As

Publication number Publication date
CN112925310A (en) 2021-06-08

Similar Documents

Publication Publication Date Title
AU2021200158B2 (en) Semantic segmentation to identify and treat plants in a field and verify the plant treatments
US11763400B2 (en) Updating execution of tasks of an agricultural prescription
Chen et al. Identification of fruit tree pests with deep learning on embedded drone to achieve accurate pesticide spraying
US20190150357A1 (en) Monitoring and control implement for crop improvement
US20220274207A1 (en) Treating a target via a modular precision delivery system
US20220117214A1 (en) Real time control of precision fluid delivery
US20220142142A1 (en) Agricultural device and method for dispensing a liquid
WO2023069842A1 (en) Precision detection and control of vegetation with real time pose estimation
CN104764533A (en) Intelligent agricultural system based on unmanned aerial vehicle image collecting and thermal infrared imager
US20220117215A1 (en) Autonomous detection and treatment of agricultural objects via precision treatment delivery system
US11935289B2 (en) Agricultural analysis robotic systems and methods thereof
CN112925310B (en) Control method, device, equipment and storage medium of intelligent deinsectization system
CN110715665A (en) Field crop phenotype monitoring robot and navigation method thereof
AU2020405272A1 (en) Precision treatment of agricultural objects on a moving platform
US20230217857A1 (en) Predictive map generation and control
Chatzisavvas et al. Autonomous Unmanned Ground Vehicle in Precision Agriculture–The VELOS project
CN116048115A (en) Control method for unmanned aerial vehicle, group cooperation system and processor
US20230040430A1 (en) Detecting untraversable soil for farming machine
TWI670689B (en) Smart farmland patrolling system and method thereof
CN112766178B (en) Disease and pest positioning method, device, equipment and medium based on intelligent deinsectization system
Conejero et al. Collaborative Harvest Robot
TW202244655A (en) Farmland monitoring system
US20240074428A1 (en) System and method for adjustable targeting in field treatment
US11971725B2 (en) System and method for performing spraying operations with an agricultural applicator
US20230039092A1 (en) Preventing damage by farming machine

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant