CN110825108A - Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace - Google Patents

Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace

Info

Publication number
CN110825108A
CN110825108A (application number CN201911092138.8A; granted publication CN110825108B)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
speed
collision
velocity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911092138.8A
Other languages
Chinese (zh)
Other versions
CN110825108B (en)
Inventor
唐文兵
黎瑶
陈博琛
丁佐华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sci Tech University ZSTU
Original Assignee
Zhejiang Sci Tech University ZSTU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sci Tech University ZSTU filed Critical Zhejiang Sci Tech University ZSTU
Priority to CN201911092138.8A priority Critical patent/CN110825108B/en
Publication of CN110825108A publication Critical patent/CN110825108A/en
Application granted granted Critical
Publication of CN110825108B publication Critical patent/CN110825108B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Abstract

The invention discloses a cooperative anti-collision method for multiple tracking unmanned aerial vehicles in the same airspace. For each unmanned aerial vehicle, the method first uses OpenCV to detect the moving target, then uses a PID controller to calculate the speed of the unmanned aerial vehicle from the position of the target on the image obtained by the onboard camera. Collision detection is then carried out with a velocity obstacle model, and finally a target-oriented speed optimization method is applied. Addressing the problem of multi-drone cooperative collision avoidance when unmanned aerial vehicle paths cross in a complex airspace, the invention converts the cooperative anti-collision problem of multiple unmanned aerial vehicles into a planning problem in velocity space using the velocity obstacle model and, in view of real-time requirements, proposes a target-oriented speed optimization method.

Description

Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace
Technical Field
The invention relates to the field of multi-unmanned aerial vehicle collaborative motion planning under constraint conditions, in particular to a collaborative anti-collision method for multiple tracking unmanned aerial vehicles in the same airspace.
Background
With the maturity of low-altitude Unmanned Aerial Vehicle (UAV) hovering, cruising and pan-tilt technologies, as well as the rise of Computer Vision and Deep Learning, target tracking based on an unmanned aerial vehicle platform has become a research hotspot at home and abroad, and has been used for tracking criminal vehicles in urban anti-terrorism, tracking ground maneuvering targets in air combat, and the like. A tracking unmanned aerial vehicle is a special-purpose UAV that acquires image information with sensors such as an airborne pan-tilt, obtains the position of the target on the image through a target detection process (or predicts the target position by state estimation and multi-sensor information fusion), and calculates a control command (usually a velocity vector) through a tracking algorithm to control the motion of the unmanned aerial vehicle so that the target always remains near the central area of the field of view of the airborne camera. However, the above process is driven only by the movement of the target and does not take into account the environment in which the drone is located (i.e. the threat posed by other aircraft). With the popularity of civilian drones and the increasing complexity of the tasks performed, the routes of tracking drones may cross. Therefore, a collision detection and resolution method between unmanned aerial vehicles needs to be designed to guarantee flight safety.
The problem of cooperative collision avoidance for multiple unmanned aerial vehicles involves the high-dimensional joint space produced by the superposition of the degrees of freedom of multiple drones, an optimization problem, and the static and dynamic constraints of the drones. The control architecture of a multi-drone anti-collision system may be centralized or distributed. Centralized control can obtain efficient, globally optimized planning results, but it is mainly suitable for static environments and has difficulty coping with environmental changes; in a distributed system, each unmanned aerial vehicle plans its own action according to its own environmental information, which has the advantage of adapting to environmental changes but the disadvantages of not obtaining a globally optimal solution and possibly causing a Deadlock problem. At present, multi-drone anti-collision methods are mostly extended from research results on single-robot motion planning, such as Sampling-based Methods, Neural Network-based Methods, Fuzzy Logic, geometric models (such as Velocity Obstacles and Artificial Potential Field methods), mathematical optimization or planning methods, Swarm Intelligence or Soft Computing, and Reinforcement Learning.
Among the above methods, the velocity obstacle is a Local Path Planning model (i.e. an obstacle avoidance model) commonly used in multi-robot systems, but it assumes that obstacles are stationary or move at constant speed, and the model requires solving a complex Nonlinear Optimization Problem in the Relative Velocity Space, so real-time performance is difficult to guarantee. In addition, for tracking unmanned aerial vehicles, speed optimization must consider not only potential collisions between the drones but also the motion trend of the tracked target, while at the same time ensuring real-time response.
Disclosure of Invention
In order to solve the problems, improve the use efficiency of an airspace, ensure the safety and smoothness of the flight of the unmanned aerial vehicle and realize the continuous tracking of a target, the invention provides a cooperative anti-collision method for multiple tracking unmanned aerial vehicles in the same airspace.
The invention is realized by the following technical scheme: a cooperative anti-collision method for multiple tracking unmanned aerial vehicles in the same airspace specifically comprises the following steps:
step 1, firstly, acquiring image information from a real-time video stream issued by an unmanned aerial vehicle sensor;
step 2, using a tracker in OpenCV to locate the center coordinate (x_o, y_o) of the target on the image;
Step 3, scaling the center coordinate (x_o, y_o) from step 2 to obtain normalized offsets Δx′ and Δy′, and calculating the velocity vector v of the drone for tracking the target using a PID algorithm;
Step 4, using a velocity obstacle model to perform collision detection on v, namely checking whether the tail end of v lies inside the combined velocity obstacle region; when the tail end of v is in the combined velocity obstacle region, adjusting the candidate velocity by a target-oriented speed optimization method to eliminate the collision;
and 5, repeatedly executing the processes of the steps 1-4 by the unmanned aerial vehicle until the tracking task is completed.
Further, the method for tracking the unmanned aerial vehicle specifically comprises the following steps:
step 1: collecting image information with equipment such as the airborne pan-tilt (gimbal), and returning the video data to the control end;
step 2: selecting the target to be tracked by the unmanned aerial vehicle from the video data, and extracting features such as HOG and HSV;
step 3: using the HOG and HSV features extracted in step 2, locating the center coordinate (x_o, y_o) of the target on the image with an OpenCV tracker, i.e. the center of the smallest enclosing circle or of the smallest enclosing rectangle;
step 4: calculating the offsets Δx and Δy of the center coordinate (x_o, y_o) from the center (x_c, y_c) of the drone's field of view in the horizontal and vertical directions:
Δx = x_o − x_c, Δy = y_o − y_c
step 5: setting a dead zone of width w and height h whose center coincides with the center of the drone's field of view; when |x_o − x_c| ≤ w/2, the heading of the drone does not need to be adjusted; when |y_o − y_c| ≤ h/2, the speed of the drone is 0, i.e. the drone does not need to advance or retreat; otherwise, going to step 6;
step 6: letting the resolution of the drone's view be x_r × y_r, first normalizing the offsets by scaling Δx and Δy to [−1, 1], i.e. Δx′ = Δx/(x_r/2), Δy′ = Δy/(y_r/2); then the speed of the drone at the next moment is:
v = v_max · Δy′
α = α_max · Δx′
where v_max is the maximum speed of the drone and α_max (α_max ≤ π/2) is the maximum rotation angle of the drone rotating only around the Z axis;
when Δy′ < 0, the drone should move backward, otherwise forward; similarly, when Δx′ < 0, the drone rotates left, otherwise right;
at this point a new velocity vector v = (v, α) is generated, whose direction is α relative to the current straight-ahead direction and whose magnitude is v;
step 7: repeating steps 3-6 until the tracking task is completed.
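As an illustrative numerical check of steps 4-6 (the values below are examples chosen for this description, not figures from the original filing): with a view resolution of x_r × y_r = 1280 × 720, view center (x_c, y_c) = (640, 360), target center (x_o, y_o) = (400, 200) and a dead zone of w = 60, h = 40, the offsets are Δx = −240 and Δy = −160, both outside the dead zone; normalizing gives Δx′ = −240/640 = −0.375 and Δy′ = −160/360 ≈ −0.444; with v_max = 2 m/s and α_max = π/2, the commands are v = 2 × (−0.444) ≈ −0.89 m/s (move backward) and α = (π/2) × (−0.375) ≈ −0.59 rad (rotate left).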
Further, step 4 comprises the following substeps:
When there are multiple tracking drones in the airspace, a collision check on v is also needed.
Step 1: screen the intruders using a collision cone model and select those that threaten the own drone; the specific process is as follows:
for each drone u_i within the sensing range of u_0, define the relative velocity v_{0|i} = v_0 − v_i and define the ray:
λ(p, v) = {p + vt | t > 0}
where the state information of a drone is represented by S = (p, v), p = (p_x, p_y) denotes the position of the drone and v denotes its velocity; within the sensing range of u_0 there are drones u_1, …, u_n also executing tasks; the n drones u_1, …, u_n are defined as the "intruders" of u_0, i.e. ||p_0 − p_i||_2 ≤ ρ_SR, where ρ_SR is the radius of the perception range (early-warning range) of the drone; λ(p, v) denotes a ray with origin p in the direction of v;
according to the radius of u_0, u_i is "inflated", i.e. r_i′ = r_i + r_0, giving the position obstacle PO_{0|i}; the collision cone CC_{0|i} is then defined as:
CC_{0|i} = {v_{0|i} | λ(p_0, v_{0|i}) ∩ PO_{0|i} ≠ ∅}
any relative velocity v_{0|i} lying in CC_{0|i} will cause u_0 and u_i to collide; CC_{0|i} is used to screen out the drones threatening u_0, denoted u_1, …, u_m (m ≤ n).
Step 2, define the velocity obstacle model:
VO_{0|i} = CC_{0|i} ⊕ v_i
where ⊕ denotes the Minkowski vector sum operation, i.e.:
A ⊕ B = {a + b | a ∈ A, b ∈ B}
VO_{0|i} is the set of velocities that will cause u_0 and u_i to collide at some future moment;
to avoid multiple threats, the individual VOs need to be combined:
VO = ∪_{i=1}^{m} VO_{0|i}
where ∪ denotes the union of the m VO geometric regions; thus, when the tail end of v_0 lies inside VO, a velocity outside VO must be selected to avoid the potential collision.
Step 3, speed optimization: keeping the direction of the velocity vector v unchanged, the velocity v(t+1) of the drone at the next moment is optimized under the constraint of VO, specifically as follows:
assuming that the boundary curve of VO is Ω and the extension of the velocity v is the ray λ(p, v), then:
P = λ(p, v) ∩ Ω
where P denotes the set of intersections of the candidate velocity with the VO boundary curve; among all intersections, the intersection P_i closest to p is selected as the end E_v of the velocity at the next moment, namely:
E_v = argmin_{P_i ∈ P} ||P_i − p||_2
where P_i denotes the intersection of the ray from p with the i-th VO boundary curve.
The optimized velocity of the drone is:
v′ = (v′, α) = (||E_v − p||_2, α).
compared with the prior art, the invention has the following beneficial effects:
(1) the method has a complete geometric basis, is commonly used for the problems of obstacle avoidance and collision avoidance of a multi-robot system, and is suitable for the application scene of the method;
(2) when a plurality of intruders exist, the collision cone model is used for screening the intruders, so that the problem scale is reduced;
(3) compared with the classical velocity obstacle model, and considering the real-time requirement and the directivity of the tracking task (i.e. the tracking drone needs to keep its motion trend basically consistent with that of the tracked target), a target-oriented speed optimization method is proposed, i.e. a simple method for solving the optimization problem is provided.
The method is characterized in that it quickly and efficiently resolves the collisions that may occur between multiple tracking unmanned aerial vehicles in flight.
Drawings
FIG. 1 is a flow chart embodying the present invention;
FIG. 2 is a diagram of an implementation of the present invention;
FIG. 3 is a diagram showing a relationship between an actual position and an ideal position of a target on a visual field picture of an airborne tripod head of an unmanned aerial vehicle;
FIG. 4 is a schematic diagram of the screening of "intruding" drones based on velocity barriers in the present invention;
FIG. 5 is a schematic diagram of a target-oriented speed optimization method introduced in the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. Fig. 1 shows a specific flowchart of a cooperative anti-collision method for multiple tracked drones in the same airspace, which specifically includes:
step 1, firstly, acquiring image information from a real-time video stream issued by an unmanned aerial vehicle sensor;
step 2, using a tracker in OpenCV to locate the center coordinate (x_o, y_o) of the target on the image;
Step 3, scaling the center coordinate (x_o, y_o) from step 2 to obtain normalized offsets Δx′ and Δy′, and calculating the velocity vector v of the drone for tracking the target using a PID algorithm;
Step 4, using a velocity obstacle model to perform collision detection on v, namely checking whether the tail end of v lies inside the combined velocity obstacle region; when the tail end of v is in the combined velocity obstacle region, adjusting the candidate velocity by a target-oriented speed optimization method to eliminate the collision;
and 5, repeatedly executing the processes of the steps 1-4 by the unmanned aerial vehicle until the tracking task is completed.
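The following Python sketch shows how steps 1-4 can be composed into one control iteration; it is a minimal structural outline, and the callable parameters are hypothetical stand-ins for the detection, PID, velocity-obstacle and optimization modules detailed below, not functions named in the original disclosure.

```python
from typing import Callable, Tuple

Velocity = Tuple[float, float]  # (speed v, heading change alpha), as used in this description

def tracking_step(
    grab_frame: Callable[[], object],                     # step 1: image from the video stream
    locate_target: Callable[[object], Tuple[int, int]],   # step 2: OpenCV tracker -> (x_o, y_o)
    pid_velocity: Callable[[Tuple[int, int]], Velocity],  # step 3: offsets -> velocity vector
    in_combined_vo: Callable[[Velocity], bool],           # step 4: endpoint inside combined VO?
    optimize_velocity: Callable[[Velocity], Velocity],    # step 4: target-oriented optimization
) -> Velocity:
    """One iteration of steps 1-4; step 5 repeats this until the task is finished."""
    frame = grab_frame()
    centre = locate_target(frame)
    v = pid_velocity(centre)
    if in_combined_vo(v):
        v = optimize_velocity(v)
    return v
```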
Fig. 2 shows the architecture for implementing the method of the invention on a real drone platform. The bottom layer of the architecture is the unmanned aerial vehicle hardware platform, which comprises a battery, motors, sensors, an antenna and an onboard processor, with a Linux operating system installed on the onboard processor. In order to realize global information sharing between drones, a Robot Operating System (ROS) is deployed on top of the Linux operating system as a meta-operating system. In ROS, a Master Node is responsible for message communication between nodes. Therefore, each unmanned aerial vehicle can act as a node in one ROS system: all unmanned aerial vehicles register their information with the master node, and each unmanned aerial vehicle broadcasts its own position, velocity and other information through a Topic. Likewise, other drones can subscribe to that topic to obtain the information of that drone. This enables asynchronous global communication between drones. On top of the ROS layer, the cooperative anti-collision system for tracking unmanned aerial vehicles comprises two main functional modules: target tracking and cooperative collision avoidance. The target tracking module mainly comprises two parts, target detection based on OpenCV and speed control based on PID; the cooperative collision avoidance module mainly comprises collision detection based on velocity obstacles and target-oriented speed optimization. A detailed description of these two modules is given below:
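As a concrete illustration of the ROS-based information sharing described above, the following rospy sketch publishes a drone's own pose and velocity and subscribes to a neighbour's velocity topic. The topic names (/uav_0/pose, /uav_1/velocity), the 20 Hz rate and the choice of geometry_msgs messages are illustrative assumptions, not details given in the original text.

```python
# Hedged sketch: each drone broadcasts its state on its own topics and subscribes
# to the topics of neighbouring drones, giving asynchronous global communication.
import rospy
from geometry_msgs.msg import PoseStamped, TwistStamped

def neighbour_velocity_cb(msg):
    # Cache the neighbour's velocity here for the velocity-obstacle check (collision detection).
    rospy.loginfo("neighbour vx=%.2f vy=%.2f", msg.twist.linear.x, msg.twist.linear.y)

if __name__ == "__main__":
    rospy.init_node("uav_0_state_broadcaster")
    pose_pub = rospy.Publisher("/uav_0/pose", PoseStamped, queue_size=10)
    vel_pub = rospy.Publisher("/uav_0/velocity", TwistStamped, queue_size=10)
    rospy.Subscriber("/uav_1/velocity", TwistStamped, neighbour_velocity_cb)

    rate = rospy.Rate(20)  # broadcast own state at 20 Hz (illustrative)
    while not rospy.is_shutdown():
        pose, vel = PoseStamped(), TwistStamped()  # filled from the onboard state estimator in practice
        pose.header.stamp = vel.header.stamp = rospy.Time.now()
        pose_pub.publish(pose)
        vel_pub.publish(vel)
        rate.sleep()
```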
the specific implementation steps of the target tracking process are as follows:
step 1: collecting image information by using equipment such as an airborne holder and the like, and returning video data to a control end;
step 2: selecting a target to be tracked by the unmanned aerial vehicle from the video data, and extracting features such as HOG, HSV and the like;
and 3, step 3: locating the central coordinates (x) of the target on the image by using the features of HOG, HSV, etc. extracted in step 2 through the 'tracker' of OpenCV, such as MIL, KCF, TLD, etco,yo) I.e. the center of the smallest enclosing circle or the center of the smallest enclosing rectangle, such as the center of the red circle in fig. 3;
and 4, step 4: calculating the center coordinate (x)o,yo) With unmanned aerial vehicle field of vision central point (x)c,yc) The absolute offsets Δ x and Δ y in both the horizontal and vertical directions:
Figure BDA0002267129160000061
and the delta x and the delta y are used for judging whether the unmanned aerial vehicle needs to carry out speed adjustment at the current moment or not and calculating a control instruction by using a PID algorithm.
And 5, step 5: in order to reduce the "jitter" problem of the drone when the target center coordinates are near the ideal value, a dead zone range is set, with width w and height h, as shown by the small rectangle in fig. 3. The center of the dead zone range is superposed with the center of the visual field of the unmanned aerial vehicle; when | xo-xcWhen the | is less than or equal to w/2, the speed direction of the unmanned aerial vehicle does not need to be adjusted, namely, the unmanned aerial vehicle does not need to be rotated left or right; when yo-ycWhen | < h/2, the speed of the unmanned aerial vehicle is 0, namely, the unmanned aerial vehicle does not need to advance or retreat, otherwise, the step 6 is implemented;
and 6, step 6: arriving at this step to illustrate that the position of the target on the image has already gone out of the dead zone range, the present invention calculates the drone speed using a PID controller, only the P control is given here as an example. Let the resolution of the view picture of the unmanned aerial vehicle be xr×yrThe offsets are first normalized, scaling Δ x and Δ y to [ -1, 1]I.e. deltax′=Δx/(xr/2),Δy′=Δy/(yr/2), then the speed of the drone at the next moment is:
v=vmaxΔy
α=αmaxΔx
vmaxmaximum speed of drone, αmaxmaxPi/2) is the maximum rotation angle of the unmanned aerial vehicle rotating around the Z axis only,
wherein when isy' < 0, indicating that it should be reversed, otherwise it should be advanced; similarly, when ΔxWhen the' is less than 0, the unmanned aerial vehicle rotates leftwards, otherwise, the unmanned aerial vehicle rotates rightwards;
at this time, a new velocity vector v ═ (v, α) is generated, the direction of which is relative to the current straight ahead α, and the velocity magnitude is v;
and 7, step 7: and repeating the steps 3-6 until the tracking task is completed.
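The sketch below (Python with OpenCV) illustrates steps 3-6 under simplifying assumptions: it uses cv2.TrackerKCF_create, which requires an opencv-contrib build, a webcam as a stand-in for the drone video stream, and example values for v_max, α_max and the dead zone; a full implementation would replace the pure P terms with the PID controller described above and send the commands to the flight controller.

```python
# Hedged illustration of steps 3-6: OpenCV tracker -> offsets -> dead zone -> P control.
import math
import cv2

V_MAX = 2.0               # m/s, maximum forward speed (illustrative)
ALPHA_MAX = math.pi / 2   # rad, maximum yaw step (illustrative)
DEAD_W, DEAD_H = 60, 40   # dead-zone width and height in pixels (illustrative)

def p_control(frame_w, frame_h, x_o, y_o):
    """Steps 4-6: compute offsets, apply the dead zone, return (v, alpha) by P control."""
    x_c, y_c = frame_w / 2, frame_h / 2
    dx, dy = x_o - x_c, y_o - y_c                 # step 4: offsets from the view centre
    v, alpha = 0.0, 0.0
    if abs(dx) > DEAD_W / 2:                      # step 5: outside the horizontal dead zone
        alpha = ALPHA_MAX * (dx / (frame_w / 2))  # step 6: normalised offset -> yaw command
    if abs(dy) > DEAD_H / 2:                      # outside the vertical dead zone
        v = V_MAX * (dy / (frame_h / 2))          # negative -> move backward, positive -> advance
    return v, alpha

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                     # stand-in for the drone's video stream
    ok, frame = cap.read()
    bbox = cv2.selectROI("select target", frame)  # step 2: operator selects the target
    tracker = cv2.TrackerKCF_create()             # KCF tracker (opencv-contrib)
    tracker.init(frame, bbox)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, box = tracker.update(frame)        # step 3: locate the target on the new frame
        if found:
            bx, by, bw, bh = box
            x_o, y_o = bx + bw / 2, by + bh / 2   # centre of the smallest enclosing rectangle
            v, alpha = p_control(frame.shape[1], frame.shape[0], x_o, y_o)
            print(f"v={v:+.2f} m/s  alpha={alpha:+.2f} rad")
```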
However, the speed generated by the above process cannot guarantee that no collision occurs between the drones; for this reason, a cooperative collision avoidance process is given. The anti-collision process provided by the invention is based on the velocity obstacle model, which is commonly used for obstacle avoidance and collision avoidance in multi-robot systems and is suitable for the application scenario of the invention. The specific implementation steps of the cooperative anti-collision process are as follows:
Step 1: screening the intruders using a collision cone model and selecting those that threaten the own drone; the specific process is as follows:
for each drone u_i within the sensing range of u_0, define the relative velocity v_{0|i} = v_0 − v_i and define the ray:
λ(p, v) = {p + vt | t > 0}
where the state information of a drone is represented by S = (p, v), p = (p_x, p_y) denotes the position of the drone and v denotes its velocity; within the sensing range of u_0 there are drones u_1, …, u_n also executing tasks; the n drones u_1, …, u_n are defined as the "intruders" of u_0, i.e. ||p_0 − p_i||_2 ≤ ρ_SR, where ρ_SR is the radius of the perception range (early-warning range) of the drone; λ(p, v) denotes a ray with origin p in the direction of v;
according to the radius of u_0, u_i is "inflated", i.e. r_i′ = r_i + r_0, giving the position obstacle PO_{0|i}; the collision cone CC_{0|i} is then defined as:
CC_{0|i} = {v_{0|i} | λ(p_0, v_{0|i}) ∩ PO_{0|i} ≠ ∅}
any relative velocity v_{0|i} lying in CC_{0|i} will cause u_0 and u_i to collide, as shown by the shaded portion on the right of fig. 4; CC_{0|i} is used to screen out the drones threatening u_0, denoted u_1, …, u_m (m ≤ n).
Step 2, combining potential collisions: in order to perform collision detection directly with the absolute velocity v_0, the velocity obstacle model is defined:
VO_{0|i} = CC_{0|i} ⊕ v_i
where ⊕ denotes the Minkowski vector sum operation, i.e.:
A ⊕ B = {a + b | a ∈ A, b ∈ B}
VO_{0|i} is the set of velocities that will cause u_0 and u_i to collide at some future moment;
to avoid multiple threats, the individual VOs need to be combined; in fig. 5, for example, the combined VO consists of two velocity obstacles:
VO = ∪_{i=1}^{m} VO_{0|i}
where ∪ denotes the union of the m VO geometric regions; thus, when the tail end of v_0 lies inside VO, a velocity outside VO must be selected to avoid the potential collision.
Step 3, collision handling (speed optimization). In the classical velocity obstacle model, the speed of the robot is adjusted by solving an optimization problem of the form
v′ = argmin_{u ∉ VO} ||u − v_0||_2
i.e. selecting, among the velocities outside VO, the one closest to the current velocity. However, this process is time-consuming; therefore, a target-oriented speed optimization method is proposed (a code sketch of steps 1-3 is given after this procedure): keeping the direction of the velocity vector v unchanged as far as possible, the velocity v(t+1) of the drone at the next moment is optimized under the constraint of VO, specifically as follows:
assuming that the boundary curve of VO is Ω and the extension of the velocity v is the ray λ(p, v), then:
P = λ(p, v) ∩ Ω
where P denotes the set of intersections of the candidate velocity with the VO boundary curve; among all intersections, the intersection P_i closest to p is selected as the end E_v of the velocity at the next moment, namely:
E_v = argmin_{P_i ∈ P} ||P_i − p||_2
where P_i denotes the intersection of the ray from p with the i-th VO boundary curve.
The optimized velocity of the drone is:
v′ = (v′, α) = (||E_v − p||_2, α).
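The following Python sketch illustrates the three collision-avoidance steps for circular drones in a 2D plane. It is a simplified illustration, not the patented procedure itself: velocities are handled as Cartesian vectors rather than the (v, α) pair used above, the cones are evaluated analytically for disc-shaped obstacles, and the nearest boundary intersection of step 3 is located by bisection along the direction of v under the assumption that the zero velocity lies outside the combined VO; all numerical values in the example are illustrative.

```python
# Hedged sketch of steps 1-3: collision-cone screening, combined-VO check and
# target-oriented speed optimization, for circular drones in 2D.
import math
from dataclasses import dataclass
from typing import List, Tuple

Vec = Tuple[float, float]

@dataclass
class Drone:
    p: Vec    # position (p_x, p_y)
    v: Vec    # current velocity
    r: float  # body radius

def _sub(a: Vec, b: Vec) -> Vec:
    return (a[0] - b[0], a[1] - b[1])

def _norm(a: Vec) -> float:
    return math.hypot(a[0], a[1])

def in_collision_cone(own: Drone, other: Drone, v_rel: Vec) -> bool:
    """True if the relative velocity v_rel lies inside CC_{0|i} (inflated disc obstacle)."""
    d = _sub(other.p, own.p)
    dist = _norm(d)
    if dist <= own.r + other.r:                # already overlapping
        return True
    if _norm(v_rel) == 0.0:
        return False
    phi = math.asin((own.r + other.r) / dist)  # half-angle of the collision cone
    cos_a = (v_rel[0] * d[0] + v_rel[1] * d[1]) / (_norm(v_rel) * dist)
    return math.acos(max(-1.0, min(1.0, cos_a))) <= phi

def screen_threats(own: Drone, neighbours: List[Drone], rho_sr: float) -> List[Drone]:
    """Step 1: keep intruders inside the warning radius whose current relative
    velocity already points into the collision cone."""
    intruders = [u for u in neighbours if _norm(_sub(own.p, u.p)) <= rho_sr]
    return [u for u in intruders if in_collision_cone(own, u, _sub(own.v, u.v))]

def in_combined_vo(own: Drone, threats: List[Drone], v: Vec) -> bool:
    """Step 2: v is unsafe if (v - v_i) falls inside CC_{0|i} for any screened threat."""
    return any(in_collision_cone(own, t, _sub(v, t.v)) for t in threats)

def target_oriented_speed(own: Drone, threats: List[Drone], v: Vec, iters: int = 40) -> Vec:
    """Step 3: keep the direction of v and shrink its magnitude until its endpoint
    reaches the nearest point of the combined VO boundary (bisection search)."""
    if not in_combined_vo(own, threats, v):
        return v
    lo, hi = 0.0, 1.0                          # scale factors: lo outside VO, hi inside
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if in_combined_vo(own, threats, (v[0] * mid, v[1] * mid)):
            hi = mid
        else:
            lo = mid
    return (v[0] * lo, v[1] * lo)

if __name__ == "__main__":
    u0 = Drone(p=(0.0, 0.0), v=(1.5, 0.0), r=0.5)
    u1 = Drone(p=(5.0, -5.0), v=(0.0, 1.5), r=0.5)  # intruder crossing from below
    threats = screen_threats(u0, [u1], rho_sr=20.0)
    print("threats:", len(threats))
    print("adjusted velocity:", target_oriented_speed(u0, threats, u0.v))
```

With these example values the drone keeps its heading and reduces its speed from 1.5 m/s to roughly 1.1 m/s, which places the endpoint of v just outside the combined velocity obstacle.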
the present invention is not limited to the above-described embodiments, and those skilled in the art can implement the present invention in other various embodiments based on the disclosure of the present invention. Therefore, the design of the invention is within the scope of protection, with simple changes or modifications, based on the design structure and thought of the invention.

Claims (3)

1. A cooperative anti-collision method for multiple tracking unmanned aerial vehicles in the same airspace is characterized by comprising the following steps:
step 1, firstly, acquiring image information from a real-time video stream issued by an unmanned aerial vehicle sensor;
step 2, using a tracker in OpenCV to locate the center coordinate (x_o, y_o) of the target on the image;
Step 3, scaling the center coordinate (x_o, y_o) from step 2 to obtain normalized offsets Δx′ and Δy′, and calculating the velocity vector v of the drone for tracking the target using a PID algorithm;
Step 4, using a velocity obstacle model to perform collision detection on v, namely checking whether the tail end of v lies inside the combined velocity obstacle region; when the tail end of v is in the combined velocity obstacle region, adjusting the candidate velocity by a target-oriented speed optimization method to eliminate the collision;
and 5, repeatedly executing the processes of the steps 1-4 by the unmanned aerial vehicle until the tracking task is completed.
2. The cooperative anti-collision method according to claim 1, wherein the method for tracking the drone specifically comprises:
step 1: collecting image information with equipment such as the airborne pan-tilt (gimbal), and returning the video data to the control end;
step 2: selecting the target to be tracked by the unmanned aerial vehicle from the video data, and extracting features such as HOG and HSV;
step 3: using the HOG and HSV features extracted in step 2, locating the center coordinate (x_o, y_o) of the target on the image with an OpenCV tracker, i.e. the center of the smallest enclosing circle or of the smallest enclosing rectangle;
step 4: calculating the offsets Δx and Δy of the center coordinate (x_o, y_o) from the center (x_c, y_c) of the drone's field of view in the horizontal and vertical directions:
Δx = x_o − x_c, Δy = y_o − y_c
step 5: setting a dead zone of width w and height h whose center coincides with the center of the drone's field of view; when |x_o − x_c| ≤ w/2, the heading of the drone does not need to be adjusted; when |y_o − y_c| ≤ h/2, the speed of the drone is 0, i.e. the drone does not need to advance or retreat; otherwise, going to step 6;
step 6: letting the resolution of the drone's view be x_r × y_r, first normalizing the offsets by scaling Δx and Δy to [−1, 1], i.e. Δx′ = Δx/(x_r/2), Δy′ = Δy/(y_r/2); then the speed of the drone at the next moment is:
v = v_max · Δy′
α = α_max · Δx′
where v_max is the maximum speed of the drone and α_max (α_max ≤ π/2) is the maximum rotation angle of the drone rotating only around the Z axis;
when Δy′ < 0, the drone should move backward, otherwise forward; similarly, when Δx′ < 0, the drone rotates left, otherwise right;
at this point a new velocity vector v = (v, α) is generated, whose direction is α relative to the current straight-ahead direction and whose magnitude is v;
step 7: repeating steps 3-6 until the tracking task is completed.
3. The cooperative anti-collision method according to claim 1, wherein step 4 comprises the following sub-steps:
when there are multiple tracking drones in the airspace, a collision check on v is also needed;
Step 1: screening the intruders using a collision cone model and selecting those that threaten the own drone; the specific process is as follows:
for each drone u_i within the sensing range of u_0, define the relative velocity v_{0|i} = v_0 − v_i and define the ray:
λ(p, v) = {p + vt | t > 0}
where the state information of a drone is represented by S = (p, v), p = (p_x, p_y) denotes the position of the drone and v denotes its velocity; within the sensing range of u_0 there are drones u_1, …, u_n also executing tasks; the n drones u_1, …, u_n are defined as the "intruders" of u_0, i.e. ||p_0 − p_i||_2 ≤ ρ_SR, where ρ_SR is the radius of the perception range (early-warning range) of the drone; λ(p, v) denotes a ray with origin p in the direction of v;
according to the radius of u_0, u_i is "inflated", i.e. r_i′ = r_i + r_0, giving the position obstacle PO_{0|i}; the collision cone CC_{0|i} is then defined as:
CC_{0|i} = {v_{0|i} | λ(p_0, v_{0|i}) ∩ PO_{0|i} ≠ ∅}
any relative velocity v_{0|i} lying in CC_{0|i} will cause u_0 and u_i to collide; CC_{0|i} is used to screen out the drones threatening u_0, denoted u_1, …, u_m (m ≤ n);
Step 2, defining the velocity obstacle model:
VO_{0|i} = CC_{0|i} ⊕ v_i
where ⊕ denotes the Minkowski vector sum operation, i.e.:
A ⊕ B = {a + b | a ∈ A, b ∈ B}
VO_{0|i} is the set of velocities that will cause u_0 and u_i to collide at some future moment; to avoid multiple threats, the individual VOs need to be combined:
VO = ∪_{i=1}^{m} VO_{0|i}
where ∪ denotes the union of the m VO geometric regions, so that when the tail end of v_0 lies inside VO, a velocity outside VO is selected to avoid the potential collision;
Step 3, speed optimization: keeping the direction of the velocity vector v unchanged, optimizing the velocity v(t+1) of the drone at the next moment under the constraint of VO, specifically:
assuming that the boundary curve of VO is Ω and the extension of the velocity v is the ray λ(p, v), then:
P = λ(p, v) ∩ Ω
where P denotes the set of intersections of the candidate velocity with the VO boundary curve; among all intersections, the intersection P_i closest to p is selected as the end E_v of the velocity at the next moment, namely:
E_v = argmin_{P_i ∈ P} ||P_i − p||_2
where P_i denotes the intersection of the ray from p with the i-th VO boundary curve;
the optimized velocity of the drone is:
v′ = (v′, α) = (||E_v − p||_2, α).
CN201911092138.8A 2019-11-11 2019-11-11 Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace Active CN110825108B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911092138.8A CN110825108B (en) 2019-11-11 2019-11-11 Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911092138.8A CN110825108B (en) 2019-11-11 2019-11-11 Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace

Publications (2)

Publication Number Publication Date
CN110825108A true CN110825108A (en) 2020-02-21
CN110825108B CN110825108B (en) 2023-03-14

Family

ID=69553838

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911092138.8A Active CN110825108B (en) 2019-11-11 2019-11-11 Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace

Country Status (1)

Country Link
CN (1) CN110825108B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111982127A (en) * 2020-08-31 2020-11-24 华通科技有限公司 Lightweight-3D obstacle avoidance method
CN112270250A (en) * 2020-10-26 2021-01-26 浙江理工大学 Target tracking method for tracking ground moving target by unmanned aerial vehicle
CN112506194A (en) * 2020-12-03 2021-03-16 中山大学 Distributed safety learning control method for mobile robot cluster
CN112925342A (en) * 2021-01-20 2021-06-08 北京工商大学 Unmanned aerial vehicle dynamic obstacle avoidance method based on improved mutual velocity obstacle method
CN113885562A (en) * 2021-10-08 2022-01-04 北京理工大学 Multi-unmanned aerial vehicle cooperative collision avoidance method under perception constraint based on speed obstacle
CN114355958A (en) * 2021-09-09 2022-04-15 南京航空航天大学 Interactive task deployment method of multi-unmanned-aerial-vehicle intelligent cooperative system
CN117241133A (en) * 2023-11-13 2023-12-15 武汉益模科技股份有限公司 Visual work reporting method and system for multi-task simultaneous operation based on non-fixed position

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120265380A1 (en) * 2011-04-13 2012-10-18 California Institute Of Technology Target Trailing with Safe Navigation with colregs for Maritime Autonomous Surface Vehicles
CN104501816A (en) * 2015-01-08 2015-04-08 中国航空无线电电子研究所 Multi-unmanned aerial vehicle coordination and collision avoidance guide planning method
CN105022270A (en) * 2015-03-20 2015-11-04 武汉理工大学 Automatic ship collision avoidance method based on velocity vector coordinate system
CN108958289A (en) * 2018-07-28 2018-12-07 天津大学 Cluster unmanned plane collision prevention method based on relative velocity obstacle

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120265380A1 (en) * 2011-04-13 2012-10-18 California Institute Of Technology Target Trailing with Safe Navigation with colregs for Maritime Autonomous Surface Vehicles
CN104501816A (en) * 2015-01-08 2015-04-08 中国航空无线电电子研究所 Multi-unmanned aerial vehicle coordination and collision avoidance guide planning method
CN105022270A (en) * 2015-03-20 2015-11-04 武汉理工大学 Automatic ship collision avoidance method based on velocity vector coordinate system
CN108958289A (en) * 2018-07-28 2018-12-07 天津大学 Cluster unmanned plane collision prevention method based on relative velocity obstacle

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FIORINIP: "motion planning in dynamic environments using velocity obstacles", 《THEINTERNATIONALJOURNALOFROBOTICSRESEARCH》 *
YAMIN HUANG: "Generalized velocity obstacle algorithm for preventing ship collisions at sea", 《OCEAN ENGINEERING》 *
杨秀霞 等: "一种三维空间UAV自主避障算法研究", 《计算机与数字工程》 *
熊勇 等: "基于速度障碍的多船自动避碰控制方法", 《中国航海》 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111982127A (en) * 2020-08-31 2020-11-24 华通科技有限公司 Lightweight-3D obstacle avoidance method
CN112270250A (en) * 2020-10-26 2021-01-26 浙江理工大学 Target tracking method for tracking ground moving target by unmanned aerial vehicle
CN112270250B (en) * 2020-10-26 2024-04-09 浙江理工大学 Target tracking method for tracking ground moving target by unmanned aerial vehicle
CN112506194A (en) * 2020-12-03 2021-03-16 中山大学 Distributed safety learning control method for mobile robot cluster
CN112506194B (en) * 2020-12-03 2022-03-29 中山大学 Distributed safety learning control method for mobile robot cluster
CN112925342B (en) * 2021-01-20 2022-07-01 北京工商大学 Unmanned aerial vehicle dynamic obstacle avoidance method based on improved mutual velocity obstacle method
CN112925342A (en) * 2021-01-20 2021-06-08 北京工商大学 Unmanned aerial vehicle dynamic obstacle avoidance method based on improved mutual velocity obstacle method
CN114355958A (en) * 2021-09-09 2022-04-15 南京航空航天大学 Interactive task deployment method of multi-unmanned-aerial-vehicle intelligent cooperative system
CN114355958B (en) * 2021-09-09 2022-06-21 南京航空航天大学 Interactive task deployment method of multi-unmanned-aerial-vehicle intelligent cooperative system
CN113885562B (en) * 2021-10-08 2023-01-10 北京理工大学 Multi-unmanned aerial vehicle cooperative collision avoidance method under perception constraint based on speed obstacle
CN113885562A (en) * 2021-10-08 2022-01-04 北京理工大学 Multi-unmanned aerial vehicle cooperative collision avoidance method under perception constraint based on speed obstacle
CN117241133A (en) * 2023-11-13 2023-12-15 武汉益模科技股份有限公司 Visual work reporting method and system for multi-task simultaneous operation based on non-fixed position
CN117241133B (en) * 2023-11-13 2024-02-06 武汉益模科技股份有限公司 Visual work reporting method and system for multi-task simultaneous operation based on non-fixed position

Also Published As

Publication number Publication date
CN110825108B (en) 2023-03-14

Similar Documents

Publication Publication Date Title
CN110825108B (en) Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace
CN110632941B (en) Trajectory generation method for target tracking of unmanned aerial vehicle in complex environment
Ryan et al. An overview of emerging results in cooperative UAV control
Roelofsen et al. Reciprocal collision avoidance for quadrotors using on-board visual detection
Eresen et al. Autonomous quadrotor flight with vision-based obstacle avoidance in virtual environment
Ma'Sum et al. Simulation of intelligent unmanned aerial vehicle (UAV) for military surveillance
Lin et al. A robust real-time embedded vision system on an unmanned rotorcraft for ground target following
Kumar et al. Recent developments on target tracking problems: A review
Rafi et al. Autonomous target following by unmanned aerial vehicles
Park et al. Stereo vision based obstacle collision avoidance for a quadrotor using ellipsoidal bounding box and hierarchical clustering
Xu et al. Vision-based autonomous landing of unmanned aerial vehicle on a motional unmanned surface vessel
Chen et al. Real-time identification and avoidance of simultaneous static and dynamic obstacles on point cloud for UAVs navigation
Kim Control laws to avoid collision with three dimensional obstacles using sensors
Pritzl et al. Cooperative navigation and guidance of a micro-scale aerial vehicle by an accompanying UAV using 3D LiDAR relative localization
Lombaerts et al. Adaptive multi-sensor fusion based object tracking for autonomous urban air mobility operations
Leong et al. Vision-based sense and avoid with monocular vision and real-time object detection for uavs
Bodi et al. Reinforcement learning based UAV formation control in GPS-denied environment
Nagrare et al. Decentralized path planning approach for crowd surveillance using drones
Shinde et al. Multi-view geometry and deep learning based drone detection and localization
Choi et al. Multi-robot avoidance control based on omni-directional visual SLAM with a fisheye lens camera
Barisic et al. Brain over Brawn: Using a Stereo Camera to Detect, Track, and Intercept a Faster UAV by Reconstructing the Intruder's Trajectory
Lwowski et al. A reactive bearing angle only obstacle avoidance technique for unmanned ground vehicles
Wang et al. Low-cost camera based sense and avoid in unmanned aerial vehicles: Sensing and control methods
Marlow et al. Local terrain mapping for obstacle avoidance using monocular vision
US11865978B2 (en) Object tracking system including stereo camera assembly and methods of use

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant