CN110297499B - Formation tracking surrounding and unknown speed estimation method based on following hierarchical structure - Google Patents


Publication number
CN110297499B
Authority
CN
China
Prior art keywords
moving body
layer
neighbor
vector
angle
Prior art date
Legal status
Active
Application number
CN201910525033.0A
Other languages
Chinese (zh)
Other versions
CN110297499A (en)
Inventor
陈杨杨
蒋国庆
卫平
Current Assignee
Southeast University
Original Assignee
Southeast University
Priority date
Filing date
Publication date
Application filed by Southeast University
Priority to CN201910525033.0A
Publication of CN110297499A
Application granted
Publication of CN110297499B
Legal status: Active

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104 - Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircraft, e.g. formation flying

Abstract

The invention discloses a formation tracking surrounding and unknown speed estimation method based on a following hierarchical structure, which comprises the following steps: a) dividing the moving bodies into m layers of following groups according to the initial perception, and determining the upper-layer and lower-layer neighbors of each moving body as well as its same-layer forward and backward neighbors; b) measuring the pointing vector of the moving body and the pointing vectors of its neighbors, and calculating the angle errors between the included angles formed by the moving body's pointing vector and its neighbors' pointing vectors and the corresponding expected values; c) obtaining the distance between the moving body and its upper-layer neighbor, and calculating the distance error between this distance and its expected value; d) designing a speed estimation law for the upper-layer neighbor according to the error values, and further designing the velocity control laws of the moving body projected onto its pointing vector and onto the perpendicular vector; e) combining the two projected components to solve the speed control input of the moving body and complete its motion control. The method effectively solves the problems of the limited sensing radius of the moving bodies and of unmeasurable speeds, is simple and reliable, and can be used in fields such as unmanned-swarm detection and combat.

Description

Formation tracking surrounding and unknown speed estimation method based on following hierarchical structure
Technical Field
The invention belongs to the technical field of three-dimensional formation surrounding tracking control, and particularly relates to a formation tracking surrounding and unknown speed estimation method based on a following hierarchical structure.
Background
At present, more and more researchers are beginning to use distributed multi-moving-body formation tracking surrounding technology to complete complex tasks in complex scenarios, for example with unmanned aerial vehicle swarms and robot crowds. The U.S. ocean administration, together with multiple countries and research institutions, has successively carried out the Argo program and multi-underwater-robot cooperative ocean monitoring projects, and a cover article on unmanned swarms and preparation for future air warfare published in the April 2019 issue of the U.S. Air Force Magazine likewise emphasized the importance of formation tracking surrounding control technology. Developing this technology is therefore undoubtedly of great importance for the detection and combat applications of unmanned swarms.
Most existing formation tracking surrounding control methods require that every moving body in the group can detect the target and that the speeds of its neighbors can be measured, for example the invention patent ZL 201610069641.1, "Geometric design method for multilayer encircling formation enclosure" (Chen Yangyang, Wang Kaixuan, Zhang Ya); the invention patent 201710303014.4, "Method for tracking an unknown target by distributed spherical formation surrounding based on relative positions" (Chen Yangyang, Wei Ping); and the related work of Zheng, R., Lin, Z., Fu, M., Sun, D., "Distributed control for uniform circumnavigation of ring-coupled unicycles", Automatica, 2015, 53: 23-29. In practice, however, because the sensing radius of the sensors is limited, only part of the moving bodies in the group can detect the motion of the target; meanwhile, it is difficult for image sensors and ranging sensors to directly measure the velocity information of the target and of the neighbors, which imposes many limitations on the practical application of the existing methods.
Disclosure of Invention
Aiming at the existing problems, the invention provides a formation tracking surrounding and unknown speed estimation method based on a following hierarchical structure, which comprises the following steps: a) dividing the moving bodies into m layers of following groups according to the initial perception, and determining the upper-layer and lower-layer neighbors of each moving body as well as its same-layer forward and backward neighbors; b) measuring the pointing vector of the moving body and the pointing vectors of its neighbors, and calculating the angle errors between the included angles formed by the moving body's pointing vector and its neighbors' pointing vectors and the corresponding expected values; c) obtaining the distance between the moving body and its upper-layer neighbor, and calculating the distance error between this distance and its expected value; d) designing a speed estimation law for the upper-layer neighbor according to the error values, and further designing the velocity control laws of the moving body projected onto its pointing vector and onto the perpendicular vector; e) combining the two projected components to solve the speed control input of the moving body and complete its motion control. The method effectively solves the problems of the limited sensing radius of the moving bodies and of unmeasurable speeds, is simple and reliable, and can be used in fields such as unmanned-swarm detection and combat.
In order to achieve this purpose, the invention adopts the following technical scheme: the formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure comprises the following steps:
S1, dividing the moving bodies into m layers of following groups according to the initial perception, and determining the upper-layer and lower-layer neighbors of each moving body as well as its same-layer forward and backward neighbors;
S2, measuring, through the image equipment, the pointing vector of the moving body itself and the pointing vectors of its neighbors, and calculating the angle errors between the included angles formed by the moving body's pointing vector and its neighbors' pointing vectors and the corresponding expected values, wherein the neighbors comprise the upper-layer neighbor, the lower-layer neighbor, the same-layer forward neighbor and the same-layer backward neighbor of the moving body;
S3, obtaining the distance between the moving body and its upper-layer neighbor, and calculating the distance error between this distance and its expected value;
S4, designing a speed estimation law for the upper-layer neighbor according to the angle errors obtained in step S2 and the distance error obtained in step S3, and further designing the velocity control laws of the moving body projected onto its own pointing vector and onto the vector perpendicular to it;
S5, combining the velocity control components of the moving body projected onto its pointing vector and onto the perpendicular vector obtained in step S4 to solve the speed control input of the moving body and complete its motion control.
As a refinement of the present invention, the step S1 further includes:
S11, the group of moving bodies that can initially sense the target is set as the first-layer following group, the group of moving bodies that can sense the first-layer following group is called the second-layer following group, and so on, giving m following groups in total, where m ≥ 1, and the target is regarded as a virtual 0-th-layer following group;
S12, determining the same-layer forward and backward neighbors that share the same upper-layer neighbor with each moving body, based on the initial perception and the nearest-distance principle; a moving body need not have same-layer forward and backward neighbors, and if a moving body can perceive only one same-layer moving body sharing the same upper-layer neighbor, that moving body serves both as its same-layer forward neighbor and as its same-layer backward neighbor;
S13, determining the upper-layer and lower-layer neighbors of each moving body, based on the initial perception and the nearest-distance principle; every moving body must have an upper-layer neighbor but need not have a lower-layer neighbor.
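The layering rule of step S11 can be made concrete with a short sketch. The following Python fragment is only an illustration: the function name, the data layout and the Euclidean sensing test are assumptions, not part of the patent. It assigns each moving body to the lowest layer from which it can initially sense the layer above, starting from the target as the virtual layer 0.

```python
import math

def build_following_layers(target_pos, body_positions, sensing_radius):
    """Partition the moving bodies into following groups (step S11):
    layer 1 senses the target, layer k senses some body of layer k-1."""
    unassigned = set(range(len(body_positions)))
    layers = []                       # layers[k-1] = indices of the k-th layer group
    previous = [target_pos]           # virtual layer 0: the target itself

    while unassigned:
        current = {i for i in unassigned
                   if any(math.dist(body_positions[i], p) <= sensing_radius
                          for p in previous)}
        if not current:               # remaining bodies sense nothing above them
            break
        layers.append(current)
        unassigned -= current
        previous = [body_positions[i] for i in current]
    return layers
```

For instance, build_following_layers((0.0, 0.0), [(1.0, 0.0), (2.5, 0.0)], 1.6) returns [{0}, {1}]: the first body senses the target and forms layer 1, and the second body only senses the first and therefore forms layer 2.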
As a refinement of the present invention, the step S2 further includes:
S21, measuring, through the image equipment, the pointing vector of the moving body itself, the pointing vector of its upper-layer neighbor, the pointing vector of its lower-layer neighbor, the pointing vector of its same-layer forward neighbor and the pointing vector of its same-layer backward neighbor, wherein the moving body considered is the i-th moving body in the k-th layer following group, k ∈ {1, …, m}; its upper-layer neighbor belongs to the (k-1)-th layer and is the target itself when k-1 = 0; its lower-layer neighbor belongs to the (k+1)-th layer; and all the pointing vectors are expressed by their direction angles in the airborne coordinate frame;
S22, calculating the upper-layer included angle between the moving body's own pointing vector and the pointing vector of its upper-layer neighbor, with the help of the vector perpendicular to the moving body's pointing vector, and calculating the upper-layer angle error between this included angle and its corresponding expected value;
S23, calculating the lower-layer included angle between the moving body's own pointing vector and the pointing vector of its lower-layer neighbor, and calculating the lower-layer angle error between this included angle and its corresponding expected value;
S24, calculating the forward included angle between the moving body's own pointing vector and the pointing vector of its same-layer forward neighbor, and calculating the forward angle error between this included angle and its corresponding expected value;
S25, calculating the backward included angle between the moving body's own pointing vector and the pointing vector of its same-layer backward neighbor, and calculating the backward angle error between this included angle and its corresponding expected value.
As another improvement of the present invention, in the step S3 the distance between the moving body and its upper-layer neighbor is obtained by a distance-measuring sensor, and the distance error between this measured distance and its expected value is calculated.
As another improvement of the present invention, the step S4 further includes:
S41, designing the speed estimation law of the upper-layer neighbor according to the lower-layer angle error, the upper-layer angle error, the forward angle error and the backward angle error obtained in step S2, the distance error obtained in step S3, and the measured velocity direction of the upper-layer neighbor, wherein the control gains kl, l = 1, 2, 3, are constants greater than 0, and b = 1 indicates that the moving body has a lower-layer neighbor, otherwise b = 0;
S42, designing the velocity control law of the moving body projected onto its own pointing vector according to the distance error, the velocity direction of the upper-layer neighbor and the speed estimation law, so that the distance error is reduced until it meets the design requirement;
S43, designing the velocity control law of the moving body projected onto the perpendicular vector according to the upper-layer angle error and the forward angle error, the velocity direction of the upper-layer neighbor and the speed estimate, so that the upper-layer angle error and the forward angle error are reduced until they meet the design requirement.
As another improvement of the present invention, in the step S42 the velocity control law of the moving body projected onto its own pointing vector is designed with a control gain k4 that is a constant greater than 0.
As another improvement of the present invention, in the step S43 the velocity control law of the moving body projected onto the perpendicular vector is designed with control gains kl, l = 5, 6, 7, which are constants greater than 0, where b = 1 indicates that the moving body has a same-layer neighbor, otherwise b = 0.
As a further improvement of the present invention, the step S5 further includes:
S51, solving the speed control input of the moving body by combining the velocity control components projected onto its pointing vector and onto the perpendicular vector obtained in step S4;
S52, the upper computer sends the speed control input of the moving body obtained in step S51 to the lower computer, and the motion control is completed through a servo system.
Compared with the prior art, the invention discloses a formation tracking surrounding and unknown speed estimation method based on a following hierarchical structure which, under a consistent airborne coordinate system, effectively solves the problems of the limited perception radius of the moving bodies and of unmeasurable speeds; the method is simple and reliable and can be used in fields such as unmanned-swarm detection and combat. Compared with existing methods, it does not require every moving body to perceive the motion of the target, which greatly reduces the performance requirements on the airborne sensors, allows the moving bodies to be made smaller to a certain extent, and gives a certain robustness against measurement loss (such as visual occlusion) during motion; meanwhile, the speeds of the neighbors do not need to be measured, so no speed-measuring sensor has to be carried on board, which reduces the power consumption and improves the endurance of the moving body. In addition, owing to these two characteristics, the cost of the whole system can be reduced to a certain extent.
Drawings
FIG. 1 is a topological diagram corresponding to the hierarchical following structure of step S1 of the present invention;
FIG. 2 is a schematic diagram of the expected circular formation tracking surrounding of the present invention;
FIG. 3 is a schematic representation of the upper-layer, lower-layer, forward and backward included angles of the present invention;
FIG. 4 is a flow chart of the formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure of the present invention.
The above figures include: the target; the 1st, 2nd and 3rd moving bodies in the first-layer following group; the 1st to 5th moving bodies in the first-layer following group; the 1st, 2nd and 3rd moving bodies in the (k-1)-th layer following group; the 1st to 7th moving bodies in the k-th layer following group; the 1st, 2nd and 3rd moving bodies in the (k+1)-th layer following group; the same-layer forward and backward neighbors of the corresponding moving bodies; the expected distance from a first-layer moving body to the target, which equals the desired circle radius; the expected distance from a moving body to its upper-layer neighbor; the central angle corresponding, on the circle, to a chord of this expected distance; the expected relative central angle between a moving body and its same-layer forward neighbor; the pointing vector of the moving body, the pointing vector of its upper-layer neighbor, the pointing vector of its lower-layer neighbor, and the pointing vectors of its same-layer forward and backward neighbors; and the corresponding upper-layer, lower-layer, forward and backward included angles of the moving body.
Detailed Description
The invention will be explained in more detail below with reference to the drawings and examples.
Example 1
The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure comprises the following steps:
S1, dividing the moving bodies into m layers of following groups according to the initial perception, and determining the upper-layer and lower-layer neighbors of each moving body as well as its same-layer forward and backward neighbors. In the present invention, the moving bodies realize the formation tracking surrounding and the estimation of the speed of the upper-layer neighbor from the measurement data of the airborne sensors (image and ranging sensors); because the sensing radius of the sensors is limited, not all moving bodies can sense the target, as shown in fig. 1, and each moving body needs to determine its neighbors, i.e. the hierarchical structure, from the initial sensing data. The step S1 therefore specifically includes the following steps:
S11, the group of moving bodies that can initially sense the target is called the first-layer following group Ω1, the group of moving bodies that can sense the first-layer following group is called the second-layer following group Ω2, and so on, giving the m ≥ 1 following groups {Ω1, Ω2, …, Ωm}; the target is regarded as a virtual 0-th-layer following group.
S12, determining, according to the initial perception and the nearest-distance principle, the same-layer forward and backward neighbors that share the same upper-layer neighbor with each moving body. A moving body may or may not have same-layer forward and backward neighbors. If a moving body can perceive only one same-layer moving body that has the same upper-layer neighbor, then that moving body serves both as its same-layer forward neighbor and as its same-layer backward neighbor. For example, the 2nd, 4th, 6th and 7th moving bodies in the k-th layer following group of fig. 1 can all sense the 2nd moving body in the (k-1)-th layer following group; if one of them can sense another of them, it selects as its same-layer forward neighbor the one nearest to it in the counterclockwise direction and as its same-layer backward neighbor the one nearest to it in the clockwise direction. The 1st, 3rd and 5th moving bodies in the k-th layer following group cannot sense any moving body in the k-th layer following group because of the limited sensing radius, and therefore they have no same-layer neighbors.
S13, determining the upper-layer and lower-layer neighbors of each moving body according to the initial perception and the nearest-distance principle; every moving body must have an upper-layer neighbor but need not have a lower-layer neighbor. For example, the 3rd moving body in the k-th layer following group of fig. 1 can sense several moving bodies in the (k-1)-th layer following group and selects, on the counterclockwise nearest-distance principle, one of them as its upper-layer neighbor; at the same time it can sense several moving bodies in the (k+1)-th layer following group and selects, on the clockwise nearest-distance principle, one of them as its lower-layer neighbor. The 1st moving body in the k-th layer following group of fig. 1 has only an upper-layer neighbor and no lower-layer neighbor.
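As one concrete reading of the counterclockwise/clockwise nearest-distance rule used in S12 (and, analogously, in S13), the sketch below picks the same-layer forward and backward neighbors among the sensed candidates that share the same upper-layer neighbor. Measuring "counterclockwise" and "clockwise" by the angular position of the bodies about the target is an illustrative assumption, as are the function and variable names.

```python
import math

def same_layer_neighbors(i, candidates, positions, target_pos):
    """Sketch of the S12 rule: forward neighbor = sensed same-layer body nearest
    in the counterclockwise direction around the target, backward neighbor =
    nearest in the clockwise direction.  With a single candidate, that body
    serves as both neighbors, as described above."""
    if not candidates:
        return None, None             # e.g. bodies 1, 3 and 5 of layer k in fig. 1

    def bearing(j):
        return math.atan2(positions[j][1] - target_pos[1],
                          positions[j][0] - target_pos[0])

    def ccw_gap(j):                   # counterclockwise angular gap from body i to body j
        return (bearing(j) - bearing(i)) % (2.0 * math.pi)

    forward = min(candidates, key=ccw_gap)
    backward = min(candidates, key=lambda j: (-ccw_gap(j)) % (2.0 * math.pi))
    return forward, backward
```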
S2, in order to design the speed estimation algorithm for the upper-layer neighbor and the formation tracking surrounding algorithm, the angle errors and the distance error are calculated from the measured data. The pointing vector of the moving body itself and the pointing vectors of its neighbors are measured through the image equipment, and the angle errors between the included angles formed by the moving body's pointing vector and its neighbors' pointing vectors and the corresponding expected values are calculated, the neighbors comprising the upper-layer neighbor, the lower-layer neighbor, the same-layer forward neighbor and the same-layer backward neighbor of the moving body. As shown in fig. 2 and fig. 3, step S2 calculates these angle errors from the pointing vector of the moving body and the pointing vectors of its neighbors obtained from the images, and is implemented in the following specific steps:
S21, measuring, through the image equipment, the pointing vector of the moving body itself, the pointing vector of its upper-layer neighbor, the pointing vector of its lower-layer neighbor, the pointing vector of its same-layer forward neighbor and the pointing vector of its same-layer backward neighbor, wherein the moving body considered is the i-th moving body in the k-th layer following group, k ∈ {1, …, m}; its upper-layer neighbor is the target when k-1 = 0; and all the pointing vectors are expressed by their direction angles in the airborne coordinate frame;
S22, from the pointing vector of the moving body and the pointing vector of its upper-layer neighbor obtained from the images, calculating the upper-layer included angle with the help of the vector perpendicular to the moving body's pointing vector, and then calculating the upper-layer angle error between this included angle and its corresponding expected value;
S23, from the pointing vector of the moving body and the pointing vector of its lower-layer neighbor obtained from the images, calculating the lower-layer included angle, and then calculating the lower-layer angle error between this included angle and its corresponding expected value;
S24, from the pointing vector of the moving body and the pointing vector of its same-layer forward neighbor obtained from the images, calculating the forward included angle, and then calculating the forward angle error between this included angle and its corresponding expected value;
S25, from the pointing vector of the moving body and the pointing vector of its same-layer backward neighbor obtained from the images, calculating the backward included angle, and then calculating the backward angle error between this included angle and its corresponding expected value.
S3, in order to realize the formation tracking surrounding of the target, the distance between each moving body in every layer of the following groups and its upper-layer neighbor is obtained from the ranging measurement, and the distance error between this distance and its expected value is calculated.
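To make the error quantities of steps S21-S25 and S3 concrete, the following sketch computes the included angle between two pointing vectors given by their direction angles in the airborne frame, the corresponding angle error, and the distance error. The signed-angle convention and the helper names are assumptions for illustration, not the patent's exact formulas.

```python
import math

def included_angle(own_dir, neighbor_dir):
    """Included angle between the moving body's pointing vector and a neighbor's
    pointing vector, both given as direction angles (rad) in the airborne
    coordinate frame, wrapped to (-pi, pi]."""
    d = neighbor_dir - own_dir
    return math.atan2(math.sin(d), math.cos(d))

def angle_error(own_dir, neighbor_dir, expected_angle):
    """Angle error between the measured included angle and its expected value
    (used for the upper-layer, lower-layer, forward and backward angles)."""
    return included_angle(own_dir, neighbor_dir) - expected_angle

def distance_error(measured_distance, expected_distance):
    """Distance error between the ranged distance to the upper-layer neighbor
    and its expected value (step S3)."""
    return measured_distance - expected_distance
```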
S4, designing the speed estimation law of the upper-layer neighbor according to the angle errors obtained in step S2 and the distance error obtained in step S3, and further designing the velocity control laws of the moving body projected onto its own pointing vector and onto the perpendicular vector.
Since the speed of the upper-layer neighbor is unknown, this unknown speed needs to be estimated first, and the velocity control laws of the moving body projected onto its pointing vector and onto the perpendicular vector are then designed. In this step, the speed estimation law is designed from the upper-layer, lower-layer, forward and backward angle errors, the distance error to the upper-layer neighbor and the velocity direction of the upper-layer neighbor obtained in steps S2 and S3, and the velocity control laws of the moving body projected onto its pointing vector and onto the perpendicular vector are designed on this basis. The specific steps are implemented as follows:
S41, designing the speed estimation law of the upper-layer neighbor from the lower-layer angle error, the upper-layer angle error, the forward angle error and the backward angle error obtained in step S2, the distance error obtained in step S3, and the measured velocity direction of the upper-layer neighbor, wherein the control gains kl, l = 1, 2, 3, are constants greater than 0, and b = 1 indicates that the moving body has a lower-layer neighbor, otherwise b = 0.
S42, designing the velocity control law of the moving body projected onto its own pointing vector from the distance error obtained in step S3, the velocity direction of the upper-layer neighbor and the speed estimate, wherein the control gain k4 is a constant greater than 0.
S43, designing the velocity control law of the moving body projected onto the perpendicular vector from the upper-layer angle error and the forward angle error obtained in step S2, the velocity direction of the upper-layer neighbor and the speed estimate, wherein the control gains kl, l = 5, 6, 7, are constants greater than 0, and b = 1 indicates that the moving body has a same-layer neighbor, otherwise b = 0.
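The specific estimation and control laws of S41-S43 are defined by the patent's own equations. Purely as a hedged illustration of the data flow they describe, a simple proportional structure driven by the measured errors, the velocity direction of the upper-layer neighbor and the current speed estimate might look like the sketch below. All functional forms, gains and names here are assumptions chosen for illustration, not the patent's actual laws.

```python
import math

def update_speed_estimate(v_hat, e_dist, e_upper, e_lower, e_fwd, e_bwd,
                          b_lower=0, k=(1.0, 1.0, 1.0), dt=0.01):
    """Hedged S41-style update of the estimated speed of the upper-layer
    neighbor, driven by the distance error and the four angle errors; the
    linear combination below is an illustrative assumption only."""
    v_hat_dot = (k[0] * e_dist
                 + k[1] * (e_upper + e_fwd + e_bwd)
                 + k[2] * b_lower * e_lower)
    return v_hat + dt * v_hat_dot

def projected_velocity_controls(e_dist, e_upper, e_fwd, v_dir_rel, v_hat,
                                k4=1.0, k567=(1.0, 1.0, 1.0), b_same=0):
    """Hedged S42/S43-style components: one along the moving body's pointing
    vector (regulating the distance error) and one along the perpendicular
    vector (regulating the upper-layer and forward angle errors), each
    compensated with the estimated neighbor velocity projected through the
    relative direction angle v_dir_rel.  Proportional forms assumed."""
    u_parallel = -k4 * e_dist + v_hat * math.cos(v_dir_rel)
    u_perp = (-k567[0] * e_upper
              - k567[1] * b_same * e_fwd
              + k567[2] * v_hat * math.sin(v_dir_rel))
    return u_parallel, u_perp
```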
S5, solving the speed control input of the moving body jointly from the velocity control components of the moving body projected onto its pointing vector and onto the perpendicular vector obtained in step S4, and completing the motion control of the moving body, which specifically comprises:
S51, solving the speed control input of the moving body by combining the velocity control component projected onto its pointing vector and the velocity control component projected onto the perpendicular vector obtained in step S4;
S52, the upper computer sends the speed control input of the moving body to the lower computer, and the motion control of the moving body is completed through the servo system.
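Step S51 combines the two projected components into the planar speed control input. The sketch below assumes, as an illustration, that the two components are simply the coordinates of the input in the orthonormal frame formed by the moving body's pointing vector and the vector perpendicular to it; the exact combination used by the invention is the one defined by its equations.

```python
import math

def speed_control_input(u_parallel, u_perp, own_dir):
    """Compose the inertial-frame speed control input u_i = (ux, uy) from the
    component along the pointing vector and the component along the
    perpendicular vector; own_dir is the moving body's direction angle."""
    px, py = math.cos(own_dir), math.sin(own_dir)   # pointing vector
    qx, qy = -py, px                                # perpendicular vector
    return (u_parallel * px + u_perp * qx,
            u_parallel * py + u_perp * qy)
```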
The method is particularly suitable for situations in which, under a consistent airborne coordinate system, the perception radius of the moving bodies is limited and the speeds are not measurable. Consider a formation tracking surrounding control system consisting of n moving bodies in a two-dimensional space. The motion of each moving body can be regarded as that of a Newtonian particle,
dz_i/dt = u_i, i = 1, …, n,
where z_i = [x_i, y_i]^T is the position of the i-th moving body in the inertial coordinate system and u_i is its speed control input. The motion of the tracked target served by the present invention can be expressed in the form
dz_0/dt = v_0 f_v0(t),
where z_0 = [x_0, y_0]^T is the position of the target in the inertial frame, f_v0(t) is a first-order continuously differentiable function of time t representing the direction of motion of the target, and v_0 is an unknown constant representing the unknown speed of the target to be estimated.
In the invention, each moving body obtains the distance to its upper-layer neighbor through its own ranging sensor and measures, through its image equipment, its own pointing vector, the pointing vector of its upper-layer neighbor, the pointing vector of its lower-layer neighbor, the pointing vector of its same-layer forward neighbor and the pointing vector of its same-layer backward neighbor, where the moving body considered is the i-th moving body in the k-th layer following group, k ∈ {1, …, m}, its upper-layer neighbor is the target when k-1 = 0, and all the pointing vectors are expressed by their direction angles in the airborne coordinate frame. In addition, it is only assumed that a moving body can measure the velocity direction of its upper-layer neighbor, while the corresponding speed is unknown; the velocity of the upper-layer neighbor can therefore be expressed, for the moving body, as the product of this unknown speed and the measured velocity direction.
In order that every moving body in the plane surrounds the target on a circle of the desired radius, each moving body in the following groups must make the distance error between its distance to its upper-layer neighbor and the corresponding expected value meet the design requirement; for the layer-1 following group this distance is the distance between the i-th moving body and the target, and its expected value equals the desired circle radius. In order to realize the formation surrounding, the moving body must also make the angle error between the corresponding upper-layer included angle and its expected value (the central angle corresponding, on the circle, to a chord whose length is the expected distance to the upper-layer neighbor) meet the design requirement, and at the same time make the angle error between the corresponding forward included angle and its expected value (the expected relative central angle) meet the design requirement.
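As a worked illustration of the chord/central-angle correspondence just mentioned, elementary circle geometry gives, for a chord of length d* on a circle of radius R, the central angle

theta* = 2 arcsin(d* / (2R)),

so that, for example, d* = R corresponds to theta* = 60 degrees. Taking R as the desired circle radius of the first-layer group is an illustrative reading of how the expected angle values can be obtained; the invention itself only states the correspondence.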
Therefore, the design idea of the invention is as follows: the moving bodies are divided into m layers of following groups by the initial perception, and the upper-layer and lower-layer neighbors and the same-layer forward and backward neighbors of each moving body are determined; the estimation law of the upper-layer neighbor's speed is designed according to the measurement information; then the velocity control laws of the moving body projected onto its pointing vector and onto the perpendicular vector are designed respectively according to the speed estimate and the measurement information; finally the speed control input is calculated exactly from the velocity control laws in the two directions. This effectively solves the problems of the limited sensing radius of the moving bodies and of unmeasurable speeds; the method is simple and reliable and can be used in fields such as unmanned-swarm detection and combat.
The foregoing shows and describes the general principles, essential features and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited by the foregoing examples, which merely illustrate its principles; various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications are also intended to fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.

Claims (7)

1. The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure is characterized by comprising the following steps:
S1, dividing the moving bodies into m layers of following groups according to the initial perception, and determining the upper-layer and lower-layer neighbors of each moving body as well as its same-layer forward and backward neighbors, which specifically comprises:
S11, the group of moving bodies that can initially sense the target is set as the first-layer following group, the group of moving bodies that can sense the first-layer following group is called the second-layer following group, and so on, giving m following groups in total, where m ≥ 1, and the target is regarded as a virtual 0-th-layer following group;
S12, determining the same-layer forward and backward neighbors that share the same upper-layer neighbor with each moving body, based on the initial perception and the nearest-distance principle; a moving body need not have same-layer forward and backward neighbors, and if a moving body can perceive only one same-layer moving body sharing the same upper-layer neighbor, that moving body serves both as its same-layer forward neighbor and as its same-layer backward neighbor;
S13, determining the upper-layer and lower-layer neighbors of each moving body, based on the initial perception and the nearest-distance principle, wherein every moving body must have an upper-layer neighbor but need not have a lower-layer neighbor;
S2, measuring, through the image equipment, the pointing vector of the moving body itself and the pointing vectors of its neighbors, and calculating the angle errors between the included angles formed by the moving body's pointing vector and its neighbors' pointing vectors and the corresponding expected values, wherein the neighbors comprise the upper-layer neighbor, the lower-layer neighbor, the same-layer forward neighbor and the same-layer backward neighbor of the moving body;
S3, obtaining the distance between the moving body and its upper-layer neighbor, and calculating the distance error between this distance and its expected value;
S4, designing a speed estimation law for the upper-layer neighbor according to the angle errors obtained in step S2 and the distance error obtained in step S3, and further designing the velocity control laws of the moving body projected onto its own pointing vector and onto the vector perpendicular to it;
S5, combining the velocity control components of the moving body projected onto its pointing vector and onto the perpendicular vector obtained in step S4 to solve the speed control input of the moving body and complete its motion control.
2. The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure as claimed in claim 1, characterized in that the step S2 further includes:
S21, measuring, through the image equipment, the pointing vector of the moving body itself, the pointing vector of its upper-layer neighbor, the pointing vector of its lower-layer neighbor, the pointing vector of its same-layer forward neighbor and the pointing vector of its same-layer backward neighbor, wherein the moving body considered is the i-th moving body in the k-th layer following group, k ∈ {1, …, m}; its upper-layer neighbor is the target when k-1 = 0; and all the pointing vectors are expressed by their direction angles in the airborne coordinate frame;
S22, calculating the upper-layer included angle between the moving body's own pointing vector and the pointing vector of its upper-layer neighbor, with the help of the vector perpendicular to the moving body's pointing vector, and calculating the upper-layer angle error between this included angle and its corresponding expected value;
S23, calculating the lower-layer included angle between the moving body's own pointing vector and the pointing vector of its lower-layer neighbor, and calculating the lower-layer angle error between this included angle and its corresponding expected value;
S24, calculating the forward included angle between the moving body's own pointing vector and the pointing vector of its same-layer forward neighbor, and calculating the forward angle error between this included angle and its corresponding expected value;
S25, calculating the backward included angle between the moving body's own pointing vector and the pointing vector of its same-layer backward neighbor, and calculating the backward angle error between this included angle and its corresponding expected value.
3. The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure as claimed in claim 2, characterized in that in the step S3 the distance between the moving body and its upper-layer neighbor is obtained by a distance-measuring sensor, and the distance error between this measured distance and its expected value is calculated.
4. The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure as claimed in claim 3, characterized in that the step S4 further includes:
S41, designing the speed estimation law of the upper-layer neighbor according to the lower-layer angle error, the upper-layer angle error, the forward angle error and the backward angle error obtained in step S2, the distance error obtained in step S3, and the measured velocity direction of the upper-layer neighbor, wherein the control gains kl, l = 1, 2, 3, are constants greater than 0, and b = 1 indicates that the moving body has a lower-layer neighbor, otherwise b = 0;
S42, designing the velocity control law of the moving body projected onto its own pointing vector according to the distance error, the velocity direction of the upper-layer neighbor and the speed estimation law, so that the distance error is reduced until it meets the design requirement;
S43, designing the velocity control law of the moving body projected onto the perpendicular vector according to the upper-layer angle error and the forward angle error, the velocity direction of the upper-layer neighbor and the speed estimate, so that the upper-layer angle error and the forward angle error are reduced until they meet the design requirement.
5. The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure as claimed in claim 4, characterized in that in the step S42 the velocity control law of the moving body projected onto its own pointing vector is designed with a control gain k4 that is a constant greater than 0.
6. The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure as claimed in claim 4 or 5, characterized in that in the step S43 the velocity control law of the moving body projected onto the perpendicular vector is designed with control gains kl, l = 5, 6, 7, which are constants greater than 0, where b = 1 indicates that the moving body has a same-layer neighbor, otherwise b = 0.
7. The formation tracking surrounding and unknown speed estimation method based on the following hierarchical structure as claimed in claim 6, characterized in that the step S5 further includes:
S51, solving the speed control input of the moving body by combining the velocity control components projected onto its pointing vector and onto the perpendicular vector obtained in step S4;
S52, the upper computer sends the speed control input of the moving body obtained in step S51 to the lower computer, and the motion control is completed through a servo system.
CN201910525033.0A 2019-06-18 2019-06-18 Formation tracking surrounding and unknown speed estimation method based on following hierarchical structure Active CN110297499B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910525033.0A CN110297499B (en) 2019-06-18 2019-06-18 Formation tracking surrounding and unknown speed estimation method based on following hierarchical structure

Publications (2)

Publication Number Publication Date
CN110297499A CN110297499A (en) 2019-10-01
CN110297499B true CN110297499B (en) 2022-04-08



Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10586464B2 (en) * 2015-07-29 2020-03-10 Warren F. LeBlanc Unmanned aerial vehicles

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101373386A (en) * 2008-09-03 2009-02-25 东南大学 Multi-movement body cooperation path tracing control method based on projection
CN101650569A (en) * 2009-08-31 2010-02-17 东南大学 Trailing formation control method of multiple movement bodies in three-dimensional space
US10168674B1 (en) * 2013-04-22 2019-01-01 National Technology & Engineering Solutions Of Sandia, Llc System and method for operator control of heterogeneous unmanned system teams
CN105629966A (en) * 2016-02-01 2016-06-01 东南大学 Geometric design method for multilayer encircling formation enclosure
CN106773689A (en) * 2016-12-16 2017-05-31 西北工业大学 AUV formation cooperative control methods based on layered distribution type Model Predictive Control
CN107065877A (en) * 2017-05-03 2017-08-18 东南大学 Distribution formation based on relative position is spherical to surround the method for following the trail of unknown object

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Jiang Guoqing, et al., "Formation circling control of detect-pursuing structure with an uncertain dynamic target", International Conference on Control and Automation, 2019-07-31, pp. 1230-1234 *
Yang Ni, "Research on differential game problems in multi-agent cooperation" (in Chinese), Wanfang dissertation database, 2018-12-18, full text *
Chen Shiming, et al., "Hierarchical formation control algorithm for large-scale robot swarms" (in Chinese), Journal of Huazhong University of Science and Technology (Natural Science Edition), 2014-10-31, vol. 42, no. 10, pp. 52-57 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant