CN112859923B - Unmanned aerial vehicle vision formation flight control system - Google Patents

Unmanned aerial vehicle vision formation flight control system

Info

Publication number
CN112859923B
CN112859923B CN202110098294.6A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
formation
control
visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110098294.6A
Other languages
Chinese (zh)
Other versions
CN112859923A (en)
Inventor
刘贞报
邹旭
陈露露
赵闻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202110098294.6A
Publication of CN112859923A
Application granted
Publication of CN112859923B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a visual formation flight control system for unmanned aerial vehicles, belonging to the field of unmanned aerial vehicle control. In the disclosed system, the image processing system is based on an adaptive processing method: by detecting the marker object it calculates the real-time three-dimensional position of each unmanned aerial vehicle and deduces the real-time moving speed, and the system continues to operate normally when detection of the marker object fails. The formation flight intelligent control system takes the position, speed and attitude information of the aircraft obtained by the image processing algorithm, combines a kinematic model of the formation system with a separated-saturation nonlinear control method, calculates the control quantities required by each unmanned aerial vehicle in real time and transmits them to the actuating mechanism, and the actuating mechanism drives the unmanned aerial vehicle formation system to avoid collisions on the premise that the formation shape does not change. The invention realizes collision avoidance while keeping the formation shape unchanged and improves the safety of formation flight.

Description

Unmanned aerial vehicle vision formation flight control system
Technical Field
The invention belongs to the field of unmanned aerial vehicle control, and particularly relates to an unmanned aerial vehicle visual formation flight control system.
Background
Limited by the load capacity and endurance of a single unmanned aerial vehicle, completing complex flight tasks is a great challenge for a single aircraft, especially a micro unmanned aerial vehicle; this is why formation flight of unmanned aerial vehicles receives more and more attention. Generally, in order to avoid unnecessary power consumption and extend flight endurance, the unmanned aerial vehicle has strict requirements on its payload. The standard inertial navigation unit carried by the unmanned aerial vehicle consists of a gyroscope and an accelerometer: the gyroscope estimates the real-time triaxial angular velocity of the unmanned aerial vehicle, the accelerometer measures the real-time triaxial acceleration, and, together with the position information acquired by GPS, the unmanned aerial vehicle can obtain in real time all the state information necessary for control. However, this state information is reliable only when the unmanned aerial vehicle is outdoors and away from urban areas. Accurate and reliable state information is very important for unmanned aerial vehicle formation flight control.
Disclosure of Invention
The invention aims to overcome the defect that state information is accurate and reliable only when the unmanned aerial vehicle is located outdoors and far away from urban areas, and provides a visual formation flight control system for unmanned aerial vehicles.
In order to achieve the above purpose, the invention adopts the following technical scheme:
an unmanned aerial vehicle visual formation flight control system comprises a hardware system and a software system;
the hardware system comprises an unmanned aerial vehicle system, a ground station system and a monocular imaging system;
the unmanned aerial vehicle system consists of n unmanned aerial vehicles, and each unmanned aerial vehicle is provided with a flight control circuit board, an inertial sensor, an air pressure sensor, a battery voltage measuring module and a communication module;
the ground station system consists of a mobile computer, a flight control lever, a radio modem and a video receiving system and is used for sending instructions to the unmanned aerial vehicle and monitoring airplane state information in real time;
the monocular imaging system is used for shooting images and sending each frame of shot images to the software system;
the software system comprises an image processing system and a formation flight intelligent control system;
the image processing system is used for calculating the real-time three-dimensional position of each unmanned aerial vehicle by detecting the marker object and for deducing the real-time moving speed; when detection of the marker fails, the translation speed is estimated in real time based on the optical flow data of the carried optical flow sensor in combination with an optical flow calculation formula;
the formation flight intelligent control system is used for calculating the control quantity required by the unmanned aerial vehicle in real time by combining a kinematics model and a separation saturation nonlinear control method of the formation system according to the position information, the speed and the attitude information of the aircraft, and sending the control quantity to an actuating mechanism of the unmanned aerial vehicle.
Further, the work flow of the image processing system is as follows:
processing external parameters of the camera on the basis of each frame of visual image by adopting an adaptive processing algorithm;
detecting the landmark object, wherein the landmark object is a quadrilateral object, virtual circles with different radiuses are constructed by taking four vertexes of the quadrilateral object as the center of a circle, each virtual circle is detected, and the landmark object is classified according to the size of the virtual circle, so that the geographic position of the landmark object is identified;
checking the detection result of the landmark object based on whether the difference of the slopes of opposite sides of the quadrilateral is within a preset range, a detection that passes the check being valid and the geographic direction of the landmark object being thereby determined;
calculating an adaptive matrix by using the geographical position coordinates and directions of the four initial circle centers, and calculating the camera parameters based on the adaptive matrix and the camera matrix, so as to calculate the position of the unmanned aerial vehicle relative to the landmark object; the camera matrix is determined by the installation position of the camera;
and estimating the motion speed of the unmanned aerial vehicle based on the vision system.
Further, the calculation process of the adaptive processing algorithm is as follows:
s[u, v, 1]^T = K[r1 r2 r3 t][X, Y, Z, 1]^T

where [u, v, 1]^T is the position of the reference marker in the image; s is a scale factor; K is the camera parameter matrix; [r1 r2 r3] is the rotation parameter; t is the translation parameter; [X, Y, Z, 1]^T is the position of the actual reference marker.
Further, the process of checking the detection result of the landmark object is as follows:
m = (y_f - y_i) / (x_f - x_i)

where i and f are the initial and final coordinates, respectively;

the slope difference of opposite sides is within a preset range, namely:

|m_up - m_lo| < ε; |m_le - m_ri| < ε.
further, the process of estimating the unmanned aerial vehicle movement speed based on the vision system is as follows:
estimating the speed of the camera, and the process is as follows:
[camera-velocity estimation formulas omitted; they appear only as images in the original document]

where the estimated quantity is the velocity vector of the unmanned aerial vehicle's centroid and Z is the height;

the speed of the unmanned aerial vehicle is then deduced from the estimated camera speed according to a preset scale factor.
Further, when the detection of the marker fails, the image processing system is configured to estimate the positions of the centers of the current four virtual circles based on the optical flow measurement.
Further, the formula is as follows:
[circle-center prediction formulas omitted; they appear only as images in the original document]

where the propagated quantity is the position of the circle center at time k, and Δ_T is the time step at which the algorithm runs.
Further, the work flow of the formation flight intelligent control system is as follows:
[formation kinematics formula omitted; it appears only as an image in the original document]

where L is the Laplacian matrix;
using a forced consensus algorithm:
[forced-consensus formula omitted; it appears only as an image in the original document]

where N_i is the set of unmanned aerial vehicles that transmit their own information to unmanned aerial vehicle i;
the following variables were then converted:
[variable-conversion formulas omitted; they appear only as images in the original document]

where x_i, y_i, z_i, ψ_i and x_j, y_j, z_j, ψ_j are respectively the three-dimensional positions and headings of the ith and jth unmanned aerial vehicles to be cooperatively controlled;
the method comprises the following steps of taking the approaching position of an unmanned aerial vehicle as a reference position, wherein the control rate of each unmanned aerial vehicle is as follows:
[control-law formulas for u_θi, u_φi, F_Ti and u_ψi omitted; they appear only as images in the original document]

where θ_i is the real-time pitch angle of the ith unmanned aerial vehicle, φ_i is the real-time roll angle of the ith unmanned aerial vehicle, ψ_i is the real-time yaw angle of the ith unmanned aerial vehicle, σ_i (i = 1,2,3,4) and a_i (i = 1,2,3,4) are control parameters, u_θi is the pitch control quantity, u_φi is the roll control quantity, F_Ti is the throttle control quantity, and u_ψi is the yaw control quantity;
and calculating the control quantities required by the unmanned aerial vehicle in real time based on this control law and transmitting them to the actuating mechanism, the actuating mechanism driving the unmanned aerial vehicle to avoid collisions on the premise that the formation does not change.
Compared with the prior art, the invention has the following beneficial effects:
according to the visual formation flight control system for the unmanned aerial vehicles, the image processing system is based on an adaptive processing method, the real-time three-dimensional position of each unmanned aerial vehicle can be calculated through detecting the marker object, the real-time moving speed is deduced, and the system can normally operate under the condition that the detection of the marker object fails; the formation flight intelligent control system calculates the control quantity required by the unmanned aerial vehicle in real time and transmits the control quantity to the actuating mechanism according to the position information, the speed and the attitude information of the aircraft obtained by the image processing algorithm and by combining a kinematics model and a separation saturation nonlinear control method of the formation system, and the actuating mechanism drives the unmanned aerial vehicle formation system to perform collision avoidance on the premise that the formation shape is not changed. The invention combines an airborne visual system, a traditional inertial navigation unit and an advanced control method, can accurately estimate the state information of each unmanned aerial vehicle in the visual formation flight control system of the unmanned aerial vehicles in the complex and much-interference environment of indoor and urban centers, obviously improves the anti-interference capability of the formation flight control system, and simultaneously reduces the manufacturing and operating cost of the system; the formation flight control system comprises a large number of sensors, a high-efficiency complete processing system and a closed-loop feedback mechanism are formed, and the cooperative high-efficiency control of the unmanned aerial vehicle cluster can be realized; the advanced control method in the formation flight control system can realize collision avoidance on the premise of keeping the formation shape unchanged, and improves the safety of formation flight.
Drawings
FIG. 1 is a block diagram of a visual formation flight control system for unmanned aerial vehicles;
FIG. 2 is a block diagram of a vision system;
FIG. 3 is a position stabilization scheme based on a vision system;
fig. 4 is a detection image of a landmark object by a vision system.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The invention provides an unmanned aerial vehicle formation flight control system which combines an airborne visual system, a traditional inertial navigation unit and an advanced control method: the unmanned aerial vehicle acquires state information through the airborne visual system, detects the landmark object and estimates its relative position, calculates its attitude information through the traditional inertial navigation unit, and finally realizes collision avoidance while keeping the formation through an intelligent control method.
Different from existing visual formation flight systems, the visual formation flight system provided by the invention is not only airborne and lightweight, but can also estimate its flight speed and position when detection of the reference object fails, and can be combined with other intelligent control methods to improve reliability and adaptability, laying a solid foundation for subsequent formation control.
The invention is described in further detail below with reference to the accompanying drawings:
referring to fig. 1, fig. 1 is a block diagram of a visual formation flight control system of an unmanned aerial vehicle according to the present invention, and the visual formation flight control system of the unmanned aerial vehicle includes a hardware system and a software system; the hardware system comprises an unmanned aerial vehicle system, a ground station system and a monocular imaging system; an unmanned aerial vehicle system in the hardware system consists of n unmanned aerial vehicles with the weight of only 800g, and each unmanned aerial vehicle is provided with a flight control circuit board, an inertial sensor, an air pressure sensor, a battery voltage measuring module and a communication module; the ground station system in the hardware system consists of a mobile computer, a flight control lever, a radio modem and a video receiving system, and an operator can send instructions to the unmanned aerial vehicle and monitor the state information of the airplane in real time by using the ground station; referring to fig. 2, a monocular imaging system in the hardware system is composed of a camera with 640 × 480 pixels, a wireless PLL transmitter, a 200mW voltage regulator and four groups of antennas, and the monocular imaging system is mounted on the unmanned aerial vehicle.
The software system comprises an image processing system and a formation flight intelligent control system; the image processing system is based on an adaptive processing algorithm and is used for calculating the real-time three-dimensional position of each unmanned aerial vehicle by detecting the marker object and deducing the real-time moving speed; most importantly, the image processing system also operates normally when detection of the marker object fails.
The image processing system operates as follows:
the method comprises the following steps: processing the external parameters of the camera on the basis of each frame of visual image by adopting an adaptive processing algorithm, wherein the adaptive processing algorithm comprises the following steps:
s[u, v, 1]^T = K[r1 r2 r3 t][X, Y, Z, 1]^T

where [u, v, 1]^T is the position of the reference marker in the image; s is a scale factor (initially unknown); K is the camera parameter matrix (initially unknown); [r1 r2 r3] is the rotation parameter (fixed after the camera is mounted); t is the translation parameter (fixed after the camera is mounted); [X, Y, Z, 1]^T is the position of the actual reference marker; in general, Z = 0;
referring to fig. 3, fig. 3 shows a coordinate system of the unmanned aerial vehicle and the ground reference object set by the present invention, where the nose of the unmanned aerial vehicle is forward in the Xh direction and upward in the zh direction, and the yh direction is determined by the right-hand criterion. The coordinate system of the ground reference object is xlp direction toward the north and ylp direction toward the west;
step two: detecting the landmark object, and identifying the geographic orientation of the landmark object;
referring to fig. 4, assuming that the landmark object is a quadrilateral object, virtual circles are constructed by taking four vertexes of the quadrilateral object as the center of a circle, the virtual circles are different in size, and the computer detects each virtual circle and classifies the virtual circles according to the size of the virtual circle, so as to identify the geographic position of the landmark object;
step three: inspecting the detection result of the marking object
In step two the hypothetical landmark object is detected, but erroneous detection results need to be discarded, so the detection result must be checked. Ideally, the four circles are arranged to form a rectangle: the line connecting the two upper corners and the line connecting the two lower corners satisfy a parallelism condition, and the line connecting the two left corners and the line connecting the two right corners satisfy the corresponding constraint. The parallelism test is based on the slope equation:
m = (y_f - y_i) / (x_f - x_i)
where i, f are the initial and final coordinates, respectively. Thus, the slope of the upper line should be approximately equal to the slope of the lower line, while the slope of the left line should be approximately equal to the slope of the right line, i.e.:
|m_up - m_lo| < ε; |m_le - m_ri| < ε
only the detection of a landmark object satisfying this condition is effective.
Step four: at each moment, when the unmanned aerial vehicle is in a hovering state, the adaptive matrix H = sK[r1 r2 t] is calculated using the prior information of the positions of the four circle centers, and the camera parameters are calculated from the estimated transformation matrix and the camera matrix, so that the position (x, y, z) of the unmanned aerial vehicle relative to the marker object can be calculated.
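The planar pose recovery of step four can be sketched with the textbook decomposition of H = sK[r1 r2 t] for points on the Z = 0 plane. The NumPy sketch below is a generic implementation of that relation, not the patent's exact procedure; the function name and argument layout are assumptions.

```python
import numpy as np

def pose_from_marker(img_pts, world_pts, K):
    """Estimate the UAV position relative to the landmark from the four circle centers.

    img_pts:   (4, 2) pixel coordinates of the circle centers
    world_pts: (4, 2) ground-plane coordinates of the same points (Z = 0)
    K:         (3, 3) camera parameter matrix
    """
    # Direct linear transform for the planar homography s*[u, v, 1]^T = H*[X, Y, 1]^T
    A = []
    for (u, v), (X, Y) in zip(img_pts, world_pts):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)

    # H ~ s*K*[r1 r2 t]  =>  [r1 r2 t] = (1/s) * inv(K) * H
    M = np.linalg.inv(K) @ H
    if M[2, 2] < 0:                       # keep the marker in front of the camera
        M = -M
    s = 0.5 * (np.linalg.norm(M[:, 0]) + np.linalg.norm(M[:, 1]))
    r1, r2, t = M[:, 0] / s, M[:, 1] / s, M[:, 2] / s
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    uav_position = -R.T @ t               # UAV position (x, y, z) in the marker frame
    return R, uav_position
```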
Step five: estimation of unmanned aerial vehicle movement speed based on visual system
Because the camera system is rigidly mounted on the unmanned aerial vehicle, it has the same velocity as the unmanned aerial vehicle, so estimating the speed of the camera amounts to estimating the speed of the unmanned aerial vehicle. The camera speed is estimated as follows:
[camera-velocity estimation formulas omitted; they appear only as images in the original document]

where the estimated quantity is the velocity vector of the unmanned aerial vehicle's centroid and Z is the height; according to this formula, when the drone is flying at a constant altitude, the speed of the drone can be deduced from the estimated camera speed according to a predetermined scale factor.
Step six: when the detection of the marker fails, the center positions of the current four virtual circles are estimated by using optical flow measurement, and the formula is as follows:
[circle-center prediction formulas omitted; they appear only as images in the original document]

where the propagated quantity is the position of the circle center at time k, and Δ_T is the time step at which the algorithm runs;
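A minimal sketch of this fallback, assuming the last known circle centers are simply propagated with the measured image-plane optical flow over one time step Δ_T; the exact update rule in the patent is given only as an image.

```python
import numpy as np

def predict_circle_centers(centers_k, flow_px_per_s, delta_t):
    """Predict the four circle centers at step k+1 when marker detection fails.

    centers_k:      (4, 2) pixel positions of the circle centers at time k
    flow_px_per_s:  (2,) measured optical flow of the ground region, pixels/s
    delta_t:        algorithm time step Delta_T
    """
    return np.asarray(centers_k, dtype=float) + np.asarray(flow_px_per_s, dtype=float) * delta_t
```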
the formation flight intelligent control system is used for realizing the consistency of multiple unmanned aerial vehicles by combining a kinematics model of a formation system and utilizing a forced consistency algorithm according to the position information, the speed and the attitude information of the airplane obtained by an image processing system, and the specific process is as follows:
[formation kinematics formula omitted; it appears only as an image in the original document]

where L is the Laplacian matrix.
Using a forced consensus algorithm:
[forced-consensus formula omitted; it appears only as an image in the original document]

The consistency of the multiple drones is thereby realized, where N_i is the set of unmanned aerial vehicles that transmit their own information to unmanned aerial vehicle i;
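To make the graph quantities concrete, the sketch below builds the Laplacian matrix L from the neighbour sets N_i and applies one textbook consensus update that drives the unmanned aerial vehicles toward their formation offsets. It is a generic consensus step, not the patent's forced-consensus law, whose equations appear only as images. For a three-UAV chain 0-1-2 one would call graph_laplacian({0: [1], 1: [0, 2], 2: [1]}, 3).

```python
import numpy as np

def graph_laplacian(neighbor_sets, n):
    """L = D - A, where A[i, j] = 1 if UAV j transmits its state to UAV i (j in N_i)."""
    A = np.zeros((n, n))
    for i, N_i in neighbor_sets.items():
        for j in N_i:
            A[i, j] = 1.0
    return np.diag(A.sum(axis=1)) - A

def consensus_step(positions, formation_offsets, L, gain, dt):
    """One generic consensus update on the formation errors (illustrative only).

    positions:          (n, 3) current UAV positions
    formation_offsets:  (n, 3) desired offsets defining the formation shape
    """
    error = positions - formation_offsets        # deviation of each UAV from its slot
    return positions - gain * (L @ error) * dt   # neighbours pull the errors together
```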
the following variables were then converted:
[variable-conversion formulas omitted; they appear only as images in the original document]

where x_i, y_i, z_i, ψ_i and x_j, y_j, z_j, ψ_j are respectively the three-dimensional positions and headings of the ith and jth unmanned aerial vehicles to be cooperatively controlled.
The approach position of the unmanned aerial vehicle is taken as the reference position, each unmanned aerial vehicle is stabilized by the separated-saturation nonlinear control method provided by the invention, and the control law is as follows:
[control-law formulas for u_θi, u_φi, F_Ti and u_ψi omitted; they appear only as images in the original document]

where θ_i is the real-time pitch angle of the ith unmanned aerial vehicle, φ_i is the real-time roll angle of the ith unmanned aerial vehicle, ψ_i is the real-time yaw angle of the ith unmanned aerial vehicle, σ_i (i = 1,2,3,4) and a_i (i = 1,2,3,4) are control parameters, u_θi is the pitch control quantity, u_φi is the roll control quantity, F_Ti is the throttle control quantity, and u_ψi is the yaw control quantity;
Under the action of this control law, the control quantities required by each unmanned aerial vehicle are calculated in real time and transmitted to the actuating mechanism, and the actuating mechanism drives the unmanned aerial vehicle formation system to perform collision avoidance on the premise that the formation shape does not change.
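The separated-saturation control quantities themselves are given only as images; the following generic nested-saturation sketch for one axis illustrates the kind of bounded law such controllers use. The gains a_i, bounds σ_i and the nesting structure are illustrative assumptions, not the patent's expressions.

```python
import numpy as np

def sat(x, limit):
    """Symmetric saturation, the basic building block of separated-saturation control."""
    return float(np.clip(x, -limit, limit))

def separated_saturation_axis(pos_err, vel_err, a, sigma):
    """Generic nested-saturation control for one axis (not the patent's exact u_theta/u_phi/F_T/u_psi).

    pos_err, vel_err: position and velocity errors along the axis
    a, sigma:         sequences of gains a_i and saturation bounds sigma_i
    """
    inner = sat(a[0] * pos_err + a[1] * vel_err, sigma[0])   # inner, tightly bounded term
    return -sat(a[2] * vel_err + inner, sigma[1])            # outer bounded control quantity
```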
The ground station sends a fixed-point (position-hold) mode instruction to the unmanned aerial vehicle formation system; the ground station receives and stores all received signals and analyzes the flight experiments and results; a take-off instruction is then sent to the unmanned aerial vehicle formation system by the ground station, and all unmanned aerial vehicles in the formation system take off with one key and fly to the same set height.
Before the formation is assembled, the unmanned aerial vehicles are accurately placed on the landmark object, which is used to confirm the position reference of the unmanned aerial vehicle formation system; the landmark object used for reference can be an easily recognizable flight platform, or an object with a distinct shape and a prominent color.
When the monocular imaging system operates, each frame of image of the reference object is scanned through the adaptive processing algorithm, so that the real-time three-dimensional position of the airplane is obtained.
And the unmanned aerial vehicle formation system estimates the translation speed in real time by combining an optical flow calculation formula on the basis of the acquired real-time three-dimensional position.
Through the traditional inertial navigation unit on each unmanned aerial vehicle, the other measured state quantities, such as attitude and azimuth information, are fed back to the intelligent controller in real time.
Each unmanned aerial vehicle of the unmanned aerial vehicle formation system transmits the position information and the speed information of the unmanned aerial vehicle to the ground station through the radio modem, and then the ground station transmits the position information and the speed information to other unmanned aerial vehicles, so that the information sharing of each unmanned aerial vehicle of the unmanned aerial vehicle formation system is realized.
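The content of the shared messages is not spelled out beyond position and speed; a hypothetical packet structure for the state relayed through the radio modem might look like this (all field names are assumptions, not part of the patent).

```python
from dataclasses import dataclass

@dataclass
class UavStatePacket:
    """Hypothetical state message relayed between UAVs via the ground station."""
    uav_id: int        # index i of the sending UAV
    x: float           # vision-derived position, metres
    y: float
    z: float
    vx: float          # estimated translational velocity, m/s
    vy: float
    vz: float
    yaw: float         # heading from the inertial navigation unit, radians
```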
The unmanned aerial vehicle formation system realizes three-dimensional position cooperative control among the unmanned aerial vehicles by combining an intelligent control algorithm provided by the invention according to the position information shared by each unmanned aerial vehicle; the effect generated by the three-dimensional position cooperative control is that the unmanned aerial vehicle formation system can realize collision avoidance on the premise of keeping the formation shape unchanged; the three-dimensional position cooperative control algorithm can be extended to n unmanned aerial vehicles.
The above-mentioned contents are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereby, and any modification made on the basis of the technical idea of the present invention falls within the protection scope of the claims of the present invention.

Claims (6)

1. An unmanned aerial vehicle visual formation flight control system is characterized by comprising a hardware system and a software system;
the hardware system comprises an unmanned aerial vehicle system, a ground station system and a monocular imaging system;
the unmanned aerial vehicle system consists of n unmanned aerial vehicles, and each unmanned aerial vehicle is provided with a flight control circuit board, an inertial sensor, an air pressure sensor, a battery voltage measuring module and a communication module;
the ground station system consists of a mobile computer, a flight control lever, a radio modem and a video receiving system and is used for sending instructions to the unmanned aerial vehicle and monitoring airplane state information in real time;
the monocular imaging system is used for shooting images and sending each frame of shot images to the software system;
the software system comprises an image processing system and a formation flight intelligent control system;
the image processing system is used for calculating the real-time three-dimensional position of each unmanned aerial vehicle by detecting the marker object and for deducing the real-time moving speed; when detection of the marker fails, the translation speed is estimated in real time based on the optical flow data of the carried optical flow sensor in combination with an optical flow calculation formula;
the formation flight intelligent control system is used for calculating the control quantity required by the unmanned aerial vehicle in real time according to the position information, the speed and the attitude information of the airplane by combining a kinematics model and a separation saturation nonlinear control method of the formation system and sending the control quantity to an actuating mechanism of the unmanned aerial vehicle;
the work flow of the image processing system is as follows:
processing external parameters of the camera on the basis of each frame of visual image by adopting an adaptive processing algorithm;
detecting the landmark object, wherein the landmark object is a quadrilateral object, virtual circles with different radiuses are constructed by taking four vertexes of the quadrilateral object as the center of a circle, each virtual circle is detected, and the landmark object is classified according to the size of the virtual circle, so that the geographic position of the landmark object is identified;
checking the detection result of the landmark object based on whether the difference of the slopes of opposite sides of the quadrilateral is within a preset range, a detection that passes the check being valid and the geographic direction of the landmark object being thereby determined;
calculating an adaptive matrix by using the geographical position coordinates and directions of the four initial circle centers, and calculating the camera parameters based on the adaptive matrix and the camera matrix, so as to calculate the position of the unmanned aerial vehicle relative to the landmark object; the camera matrix is determined by the installation position of the camera;
estimating the motion speed of the unmanned aerial vehicle based on a visual system;
the process of estimating the motion speed of the unmanned aerial vehicle based on the visual system comprises the following steps:
estimating the speed of the camera, and the process is as follows:
[camera-velocity estimation formulas omitted; they appear only as images in the original document]

where the estimated quantity is the velocity vector of the unmanned aerial vehicle's centroid and Z is the height;

the speed of the unmanned aerial vehicle is deduced from the estimated camera speed according to a preset scale factor s.
2. The unmanned aerial vehicle visual formation flight control system of claim 1, wherein the adaptive processing algorithm is calculated by:
s[u, v, 1]^T = K[r1 r2 r3 t][X, Y, Z, 1]^T

where [u, v, 1]^T is the position of the reference marker in the image; s is a scale factor; K is the camera parameter matrix; [r1 r2 r3] is the rotation parameter; t is the translation parameter; [X, Y, Z, 1]^T is the position of the actual reference marker.
3. The visual formation flight control system for unmanned aerial vehicles according to claim 1, wherein the process of checking the detection result of the landmark object is as follows:
m = (y_f - y_i) / (x_f - x_i)

where i and f are the initial and final coordinates, respectively;

the slope difference of opposite sides is within a preset range, namely:

|m_up - m_lo| < ε; |m_le - m_ri| < ε.
4. the unmanned aerial vehicle visual formation flight control system of claim 1, wherein when landmark object detection fails, the image processing system is configured to estimate the positions of the centers of the current four virtual circles based on optical flow measurements.
5. The visual formation flight control system of unmanned aerial vehicles of claim 4, wherein the formula is as follows:
[circle-center prediction formulas omitted; they appear only as images in the original document]

where the propagated quantity is the position of the circle center at time k, and Δ_T is the time step at which the algorithm runs.
6. The unmanned aerial vehicle visual formation flight control system of claim 1, wherein the work flow of the formation flight intelligent control system is as follows:
[formation kinematics formula omitted; it appears only as an image in the original document]

where L is the Laplacian matrix;
using a forced consensus algorithm:
[forced-consensus formula omitted; it appears only as an image in the original document]

where N_i is the set of unmanned aerial vehicles that transmit their own information to unmanned aerial vehicle i;
the following variables were then converted:
[variable-conversion formulas omitted; they appear only as images in the original document]

where x_i, y_i, z_i, ψ_i and x_j, y_j, z_j, ψ_j are respectively the three-dimensional positions and headings of the ith and jth unmanned aerial vehicles to be cooperatively controlled;
the method comprises the following steps of taking the approaching position of an unmanned aerial vehicle as a reference position, wherein the control rate of each unmanned aerial vehicle is as follows:
[control-law formulas for u_θi, u_φi, F_Ti and u_ψi omitted; they appear only as images in the original document]

where θ_i is the real-time pitch angle of the ith unmanned aerial vehicle, φ_i is the real-time roll angle of the ith unmanned aerial vehicle, ψ_i is the real-time yaw angle of the ith unmanned aerial vehicle, σ_i (i = 1,2,3,4) and a_i (i = 1,2,3,4) are control parameters, u_θi is the pitch control quantity, u_φi is the roll control quantity, F_Ti is the throttle control quantity, and u_ψi is the yaw control quantity;
and calculating the control quantities required by the unmanned aerial vehicle in real time based on this control law and transmitting them to the actuating mechanism, the actuating mechanism driving the unmanned aerial vehicle to avoid collisions on the premise that the formation does not change.
CN202110098294.6A 2021-01-25 2021-01-25 Unmanned aerial vehicle vision formation flight control system Active CN112859923B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110098294.6A CN112859923B (en) 2021-01-25 2021-01-25 Unmanned aerial vehicle vision formation flight control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110098294.6A CN112859923B (en) 2021-01-25 2021-01-25 Unmanned aerial vehicle vision formation flight control system

Publications (2)

Publication Number Publication Date
CN112859923A CN112859923A (en) 2021-05-28
CN112859923B true CN112859923B (en) 2022-02-18

Family

ID=76008659

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110098294.6A Active CN112859923B (en) 2021-01-25 2021-01-25 Unmanned aerial vehicle vision formation flight control system

Country Status (1)

Country Link
CN (1) CN112859923B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114265406B (en) * 2021-12-21 2024-04-12 南京理工大学 Intelligent vehicle formation control system based on machine vision and control method thereof
CN115014279B (en) * 2022-08-09 2022-10-28 湖南科天健光电技术有限公司 Observation aircraft, observation system, calibration method and method for measuring target to be measured

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106647814B (en) * 2016-12-01 2019-08-13 华中科技大学 A kind of unmanned plane vision auxiliary positioning and flight control system and method based on the identification of two dimensional code terrestrial reference
CN107144281B (en) * 2017-06-30 2023-09-12 一飞智控(天津)科技有限公司 Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision
CN108196582A (en) * 2018-02-12 2018-06-22 深圳技术大学(筹) A kind of indoor Visual Navigation unmanned plane cluster flight control system and method
CN108459618A (en) * 2018-03-15 2018-08-28 河南大学 A kind of flight control system and method that unmanned plane automatically launches mobile platform
CN108508916B (en) * 2018-04-02 2021-05-07 南方科技大学 Control method, device and equipment for unmanned aerial vehicle formation and storage medium
CN109445432A (en) * 2018-10-31 2019-03-08 中国科学技术大学 Unmanned plane and ground mobile robot formation localization method based on image
CN109813311B (en) * 2019-03-18 2020-09-15 南京航空航天大学 Unmanned aerial vehicle formation collaborative navigation method
CN111176308A (en) * 2019-12-27 2020-05-19 西安羚控电子科技有限公司 Small-size many rotor unmanned aerial vehicle cluster control system of closed environment

Also Published As

Publication number Publication date
CN112859923A (en) 2021-05-28

Similar Documents

Publication Publication Date Title
CN109911188B (en) Bridge detection unmanned aerial vehicle system in non-satellite navigation and positioning environment
CN109911231B (en) Unmanned aerial vehicle autonomous carrier landing method and system based on GPS and image recognition hybrid navigation
CN106774436B (en) Control system and method for stably tracking target of rotor unmanned aerial vehicle based on vision
US20190273909A1 (en) Methods and systems for selective sensor fusion
CN105652891B (en) A kind of rotor wing unmanned aerial vehicle movement Target self-determination tracks of device and its control method
US20180321041A1 (en) Methods and systems for determining a state of an unmanned aerial vehicle
Martínez et al. On-board and ground visual pose estimation techniques for UAV control
CN109901580A (en) A kind of unmanned plane cooperates with unmanned ground robot follows diameter obstacle avoidance system and its method
CN102190081B (en) Vision-based fixed point robust control method for airship
CN105759829A (en) Laser radar-based mini-sized unmanned plane control method and system
CN109683629B (en) Unmanned aerial vehicle electric power overhead line system based on combination navigation and computer vision
US20150149000A1 (en) Unkown
CN108062108A (en) A kind of intelligent multi-rotor unmanned aerial vehicle and its implementation based on airborne computer
CN108248845A (en) A kind of rotor flying mechanical arm system and algorithm based on dynamic center of gravity compensation
CN105182992A (en) Unmanned aerial vehicle control method and device
CN112859923B (en) Unmanned aerial vehicle vision formation flight control system
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN104808674A (en) Multi-rotor aircraft control system, terminal and airborne flight control system
CN107831776A (en) Unmanned plane based on nine axle inertial sensors independently makes a return voyage method
CN110333735B (en) System and method for realizing unmanned aerial vehicle water and land secondary positioning
WO2020033099A1 (en) Landing site localization for dynamic control of an aircraft toward a landing site
US20210318122A1 (en) Positioning apparatus capable of measuring position of moving body using image capturing apparatus
US20210101747A1 (en) Positioning apparatus capable of measuring position of moving body using image capturing apparatus
CN113156998A (en) Unmanned aerial vehicle flight control system and control method
CN110456822A (en) A kind of small and medium size unmanned aerial vehicles double redundancy independently measures flight control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant