CN115661204B - Collaborative searching and tracking positioning method for a moving target by an unmanned aerial vehicle cluster


Info

Publication number
CN115661204B
Authority: CN (China)
Legal status: Active
Application number: CN202211358543.1A
Other languages: Chinese (zh)
Other versions: CN115661204A
Inventors: 聂一鸣, 孔凡杰, 葛超, 连政, 徐孝煜, 黄昊
Assignee: National Defense Technology Innovation Institute PLA Academy of Military Science
Application filed by National Defense Technology Innovation Institute PLA Academy of Military Science
Priority to CN202211358543.1A
Published as CN115661204A; application granted and published as CN115661204B


Abstract

The invention provides a collaborative searching and tracking positioning method for multiple moving targets by an unmanned aerial vehicle cluster, and belongs to the technical field of unmanned aerial vehicle coordination and tracking control. The invention pre-plans the cruising path of each unmanned aerial vehicle; each unmanned aerial vehicle performs multi-target searching and detection with a monocular camera; each unmanned aerial vehicle adopts a multi-stage tracking strategy to intelligently track a target; during tracking, the unmanned aerial vehicle measures, calculates, optimizes, and publishes the target position for sharing; after tracking of a single target is completed, the unmanned aerial vehicle autonomously plans a return path and continues searching for the remaining targets; and throughout the task, each unmanned aerial vehicle autonomously performs real-time obstacle avoidance with respect to static obstacles and other unmanned aerial vehicles. The invention solves the problem of repeated tracking of the same target by multiple unmanned aerial vehicles and the problem of stable, continuous tracking when a target is lost for a short time, and effectively improves the overall efficiency of collaborative searching and tracking positioning of multiple moving targets.

Description

Collaborative searching and tracking positioning method for moving target by unmanned aerial vehicle cluster
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle coordination and tracking control, and particularly relates to a collaborative searching and tracking positioning method for a moving target by an unmanned aerial vehicle cluster.
Background
An unmanned aerial vehicle is a reusable aircraft that performs tasks by carrying sensors or other payloads. Owing to its light weight, low cost, strong autonomy, and freedom from risk of casualties, the unmanned aerial vehicle is widely used in disaster relief and rescue, aerial photography, real-time reconnaissance, remote strike, and geological investigation. Performing target searching, detection, tracking, and positioning tasks over a given area with unmanned aerial vehicles has long been an important means of acquiring information in both military and civil fields. In recent years, with the rapid development of technologies such as artificial intelligence, sensors, signal processing, image recognition, and mobile communication, unmanned aerial vehicle cluster cooperation has become an important means of completing collaborative target searching and tracking positioning tasks.
In complex scenes, a single unmanned aerial vehicle often faces a large search range and many tracking objects, and therefore cannot quickly complete target searching, tracking, and positioning tasks. An unmanned aerial vehicle cluster can achieve higher efficiency through mutual cooperation, and has the following advantages over a single unmanned aerial vehicle when executing collaborative searching and tracking positioning tasks: (1) the task area is expanded, and a larger region can be searched through cooperation; (2) robustness is increased, avoiding task failure caused by a single unmanned aerial vehicle fault; (3) task execution efficiency is improved, since the unmanned aerial vehicles in the cluster execute tasks in parallel, reducing search time. However, when an unmanned aerial vehicle cluster executes a collaborative search task, improving the rationality of task allocation, sharing information globally, guaranteeing the safety of the cluster, and avoiding mutual interference among multiple unmanned aerial vehicles remain key problems to be solved urgently. The collaborative search, tracking, and positioning task of an unmanned aerial vehicle cluster can be divided into three parts: (1) task area coverage search: planning the search path of each unmanned aerial vehicle to realize intelligent search of the task area by the cluster; (2) real-time target detection: each unmanned aerial vehicle detects targets in the search area in real time with its onboard sensors; (3) target tracking and localization: each unmanned aerial vehicle tracks a target intelligently and accurately calculates its position.
Unmanned aerial vehicle cluster collaborative searching and tracking positioning must solve the following problems: (1) selecting a task planning method suited to various complex scenes; (2) planning suitable cruising paths for collaborative searching so that the search is fast and search efficiency is not low; (3) designing a stable tracking method so that obstacles can be avoided without hindering the tracking process.
Task planning methods for unmanned aerial vehicle clusters fall mainly into two types, centralized and distributed. In the centralized type, a central control node plans tasks for the whole cluster, which is efficient but not robust; in the distributed type there is no central control node, and each unmanned aerial vehicle can search and detect independently while exchanging local information and making decisions to complete the search task, which gives stronger robustness and expandability but places higher demands on the algorithms. Existing work on the unmanned aerial vehicle cluster collaborative search problem, at home and abroad, can be divided into static planning methods and dynamic planning methods. Static planning mainly comprises area-partitioning and path-planning methods such as the Voronoi diagram, the Graham scan, and Dubins path planning. These methods adopt a divide-and-conquer idea: the search area is partitioned and the optimal search path of the unmanned aerial vehicle within each sub-area is determined, which solves the problem of low cooperative efficiency of the cluster but copes poorly with emergencies such as an unmanned aerial vehicle failing or being unable to take off. Dynamic planning methods mainly construct maps such as pheromone maps and target probability maps, and then make real-time dynamic decisions with optimization methods such as distributed predictive control and rolling-horizon optimization. Cooperative target tracking by an unmanned aerial vehicle cluster is mainly divided into two modes: standoff tracking and persistent tracking. Standoff tracking requires the unmanned aerial vehicle to keep a specific distance from the target and to maintain a suitable observation angle, so the unmanned aerial vehicle usually circles around the target; mature methods include optimal guidance parameter search algorithms and roll-angle and speed control laws. A persistent tracking task usually takes the form of tracking the target in a particular formation: a suitable formation tracking method is designed, and formation tracking control of the target is realized through cooperation among the unmanned aerial vehicles.
Disclosure of Invention
Therefore, aiming at the low efficiency of multi-unmanned-aerial-vehicle collaborative search and target tracking positioning in complex urban environments, the invention provides a multi-unmanned-aerial-vehicle collaborative rapid searching and tracking positioning method with adaptive strategy switching. Through a real-time high-precision target detection algorithm, a dynamic obstacle avoidance strategy, and a fast coordinate measuring and calculating method, the method effectively solves the technical problems of repeated tracking of a single target, collision avoidance among multiple unmanned aerial vehicles under multi-obstacle conditions, and short-term occlusion of targets, and realizes rapid search, stable tracking, and accurate positioning of known multiple moving targets in a task area by the unmanned aerial vehicle cluster.
In order to achieve the above object, the present invention provides the following technical solutions:
A collaborative searching and tracking positioning method for multiple moving targets by an unmanned aerial vehicle cluster, as shown in fig. 1, comprises the following steps:
step S1, gridded environment information and an unmanned aerial vehicle cruising model are constructed, as shown in FIG. 6;
step S2, performing target detection on a search area by using an airborne monocular camera based on a deep learning method;
s21, training a target detection algorithm network model;
firstly, N task area images containing targets are collected through the unmanned aerial vehicle monocular camera, and the targets expected to be tracked in the images are marked by manual labeling to generate corresponding multi-category labels;
the images and the labeled target label data are taken as a training set to train the deep target detection model of the target detection algorithm, and training is completed when one of the following conditions is met:
condition 1: the accuracy of the target detection algorithm model no longer improves;
condition 2: the number of training iterations reaches the preset maximum $E_{max}$;
Secondly, M task area images containing targets are collected and input into the fully trained target detection algorithm model to generate new target labels; the detection results are manually checked and corrected to finally obtain the target label set $L_O$ of the region to be searched, where M is much greater than N;
the M images and the label set $L_O$ are input into the target detection algorithm network model for training, and training ends when one of the following conditions is met:
condition 1: the accuracy of the target detection algorithm model no longer improves;
condition 2: the number of training iterations reaches the preset maximum $E'_{max}$;
S22, acquiring an image of a region to be searched in real time;
After taking off, the unmanned aerial vehicles conduct a cruising search along the paths obtained in step S1, collecting RGB image data of the search area in real time through their onboard monocular cameras; the image data collected by the different unmanned aerial vehicles at the same time $t$ are integrated into an image set $P_t = \{p_1, p_2, \ldots, p_n\}$, where $p_i$ is the search area image acquired by the $i$-th unmanned aerial vehicle at time $t$;
s23, performing target detection on the monocular camera image by using a target detection algorithm model;
Using the multi-classification target detection algorithm model pre-trained in S21, image detection is performed in real time on the image set $P_t$ obtained in step S22 to obtain the target information set $U_O$ of each image, which includes the length and width $(H, L)$ of each target bounding box, the center point coordinates $(x_o, y_o)$, the confidence $CONF_O$, and the class number $cls$;
step S3, after the unmanned aerial vehicle detects the target, the unmanned aerial vehicle autonomously selects different tracking strategies to intelligently track the target;
s31, when the unmanned aerial vehicle detects the target, judging whether the newly found target is tracked by other unmanned aerial vehicles, and eliminating the tracked target
The unmanned aerial vehicle flies along the cruising path, and a target detection algorithm is utilized to detect a target of a real-time image of the monocular camera in the cruising process; meanwhile, the unmanned aerial vehicle acquires and stores the current tracking and releasing target positions of other unmanned aerial vehicles in real time, carries out multi-frame clustering according to the acquired target position information released by other unmanned aerial vehicles, and analyzes to obtain the tracking and discovering conditions of the current targets;
S32, the unmanned aerial vehicle traverses $U'_{O,n}$, selects the nearest target according to the distance between each detected target and the unmanned aerial vehicle, and decides whether to enter the long-distance or short-distance tracking program:
the size of the target detection algorithm bounding box is taken as the basis for judging whether a target is far or near, where $h_i$ and $l_i$ are respectively the height and width of the $i$-th target bounding box currently detected in $U'_{O,n}$;
if $h_i < T_h$ or $l_i < T_l$ or $y_{min,i} < T_y$, step S33 is entered; otherwise step S34 is entered;
where the values of $T_h$, $T_l$, $T_y$ are set according to the imaging resolution of the monocular camera used, and $y_{min,i}$ is the y value of the upper-left pixel of the $i$-th bounding box in $U'_{O,n}$;
S33, when the unmanned aerial vehicle judges that the target is far away from it, the remote tracking program is entered;
during remote tracking, the gimbal angle is kept at $T_{g1}$, and the unmanned aerial vehicle adjusts its forward speed and steering speed according to the position of the tracked target bounding box in the monocular camera image;
first, the relative distance $(u_x, u_y)$ of the target bounding box from the center point of the monocular camera image is calculated:
$u_x = x_o - u_{cx}$
$u_y = y_o - u_{cy}$
where $(u_{cx}, u_{cy})$ is the center point of the monocular camera image; the forward speed $v_x$ of the unmanned aerial vehicle is set piecewise according to the pixel distance of $y_o$ from the image center point:
$v_x = k_{fi} \times y_o + b_{fi}$
where $k_{fi}$, $b_{fi}$ are the speed adjustment parameters of the different segments during remote tracking, and $i$ denotes the segment of pixel distance from the image center point in which $y_o$ currently lies;
the rotational speed $w_z$ of the unmanned aerial vehicle is set from the horizontal offset $u_x$ with the adjustment parameter $k_f$, where, from the viewing angle of the unmanned aerial vehicle, a left turn is positive and a right turn is negative;
S34, when the unmanned aerial vehicle judges that the target is close to it, close-range tracking is entered;
during close-range tracking, the gimbal angle is adjusted to $T_{g2}$, where $T_{g2}$ is greater than $T_{g1}$, and the unmanned aerial vehicle adjusts its forward speed and steering speed according to the position of the tracked target bounding box in the monocular camera image;
the relative distance $(u_x, u_y)$ of the target bounding box from the center point of the monocular camera image is calculated:
$u_x = x_o - u_{cx}$
$u_y = y_o - u_{cy}$
where $(u_{cx}, u_{cy})$ is the center point of the monocular camera image; the forward speed $v_x$ of the unmanned aerial vehicle is set piecewise according to the ratio $u_{tc}$ of $y_o$ to $u_{cy}$, where forward is positive and backward is negative:
$v_x = k_{ni} \times y_o + b_{ni}$
where $k_{ni}$, $b_{ni}$ are the speed adjustment parameters of the different segments during close-range tracking, and $i$ denotes the segment of pixel distance from the image center point in which $y_o$ currently lies;
the rotational speed $w_z$ of the unmanned aerial vehicle is set from the horizontal offset $u_x$ with the close-range adjustment parameter $k_n$, where, from the viewing angle of the unmanned aerial vehicle, a left turn is positive and a right turn is negative;
S35, during tracking, after the target is temporarily lost due to occlusion, the unmanned aerial vehicle predicts the target motion trajectory based on the motion state before the target was lost and tracks along it;
step S4, while tracking, the position of the target is measured and calculated, and the target position is shared in real time
According to the perspective transformation principle, the position of the target in the camera image is mapped into the unmanned aerial vehicle body coordinate system to obtain its position relative to the unmanned aerial vehicle; the target position is then converted into the world coordinate system through the world coordinates of the unmanned aerial vehicle, giving the coordinates of the target in the world coordinate system; the unmanned aerial vehicle publishes the category and coordinates of the target to the global network for sharing;
S41, eliminating erroneous target positions;
the target detection network has a good detection effect and can continuously detect a designated target; when a sporadic detection result appears, that is, the target is not detected across consecutive frames, the detection is judged to be a false detection and the tracking program does not respond to it;
the current unmanned aerial vehicle records the image coordinates of the detection box of each category of target over the most recent n frames; if the number of frames $n_d$ in which a certain category of target is detected is less than n, the target is not processed, and the sporadic detection result is discarded as a possible false detection;
S42, automatically correcting target class errors in target detection by using the information of preceding and following frames;
during close-range tracking, the class of the tracked target is locked as $cls_{tr}$; if, within a time range of $t_s$ seconds, the target of class $C_{tr}$ disappears from detection while another target class $C_e$ is detected, the coordinates $(x_t, y_t)$ of class $C_{tr}$ at the last moment are compared with the coordinates $(x_e, y_e)$ of the other class currently detected; if the distance $d_{err} < d_{tr}$, the target class $C_e$ is corrected to $C_{tr}$;
where $d_{err} = \sqrt{(x_t - x_e)^2 + (y_t - y_e)^2}$ and $d_{tr}$ is the threshold below which two coordinates in the image coordinate system are judged to be close;
S43, smoothing the target detection results by using multi-frame sequential information;
when the target is detected in consecutive frames, multi-frame smoothing is used to exclude detection results with larger errors; the target detection positions of the multiple frames are fused and the average of the target coordinates is calculated;
after false detections are eliminated, the average coordinates $(x_m, y_m)$ are:
$x_m = \frac{1}{n}\sum_{i=1}^{n} x_{o,i}, \qquad y_m = \frac{1}{n}\sum_{i=1}^{n} y_{o,i}$
where $x_{o,i}$, $y_{o,i}$ are respectively the x-axis and y-axis coordinates of the target detection box in the $i$-th frame;
s44, calculating the position of the target in the world coordinate system and publishing the position to the global network
Under the camera image coordinate system and the unmanned plane body coordinate system, n and n >4 are found, the corresponding coordinates above are brought into the following perspective transformation formula, and the homography matrix Trans of the perspective transformation is calculated:
[x u y u 1]=[x m y m 1]·Trans
Wherein [ x ] u y u 1]Representing the coordinates of a point under the unmanned aerial vehicle body coordinate system, [ x ] m y m 1]Representing coordinates of points under a camera image coordinate system;
obtaining coordinates of points corresponding to the unmanned aerial vehicle body coordinate system under the image coordinate system through perspective transformation homography matrix Trans, and obtaining the relative position of the target and the unmanned aerial vehicle;
and then, converting the relative positions of the target and the unmanned aerial vehicle into absolute positions in a world coordinate system through a transformation matrix and the pose of the unmanned aerial vehicle:
[x w y w 1 1]=[x u y u 1 1]·Tr w_uav
wherein,yaw angle, x of unmanned aerial vehicle uav And y uav The coordinates of the unmanned aerial vehicle in a world coordinate system;
the class $cls$ of the tracked target and its coordinates $x_w, y_w$ in the world coordinate system are published to the global network;
s5, planning a path returning to the cruising point by using a dynamic path planning algorithm;
after the unmanned aerial vehicle finishes tracking a target, a collision-free path is planned with dynamic path planning rules, taking the current position as the starting point and the cruising path point closest to the unmanned aerial vehicle as the end point;
after reaching the end point, returning to the step S2, and continuing searching and detecting until the searching, tracking and positioning tasks of all the moving targets are completed;
and S6, throughout the search task, the unmanned aerial vehicles in steps S1-S5 perform obstacle avoidance with respect to static obstacles and other unmanned aerial vehicles.
Compared with the prior art, the invention has the advantages and positive effects that:
1. the invention comprehensively considers and optimizes the cooperative tracking mode of the unmanned aerial vehicle cluster: different targets of the same class can be identified autonomously and different unmanned aerial vehicles are automatically allocated to track them, solving the problem of repeated tracking of a single target;
2. the invention provides static and dynamic obstacle avoidance algorithms, solving the problem of collision avoidance for multiple unmanned aerial vehicles under multi-obstacle conditions and preventing collisions from affecting the overall task;
3. the invention fits the target motion trajectory from the target's prior motion state, solving the problem of short-term occlusion loss of the target and improving tracking continuity and stability;
4. the invention splits the tracking of a moving target into long-distance and short-distance tracking modes, adaptively switching camera fields of view and tracking strategies between the modes, effectively improving the efficiency of tracking a moving target.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a flowchart of a method for collaborative searching and tracking positioning of a moving object by a cluster of unmanned aerial vehicles according to an embodiment of the present invention;
FIG. 2 is a block diagram of a tracking strategy for a drone to multiple moving targets according to one embodiment of the invention;
fig. 3 is a flowchart of a remote tracking strategy of a moving object by a drone according to one embodiment of the present invention;
FIG. 4 is a flowchart of a close-range tracking strategy of a drone to a moving target, according to one embodiment of the present invention;
FIG. 5 is a block diagram of a process for locating and publishing a moving object by a drone according to one embodiment of the present invention;
FIG. 6 is a schematic diagram of a simulation test environment rasterized modeling in accordance with an embodiment of the invention;
fig. 7 is a schematic diagram illustrating dynamic obstacle avoidance among unmanned aerial vehicle clusters according to an embodiment of the present invention;
FIG. 8 is a schematic view of an obstacle avoidance of a static obstacle by a drone according to one embodiment of the present invention;
fig. 9 is a schematic diagram of a speed change of a drone tracking process according to one embodiment of the present invention.
Detailed Description
In order to make the technical solution and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings:
The embodiment of the invention relates to a collaborative searching and tracking positioning method for a moving target by an unmanned aerial vehicle cluster, comprising the following steps.
A collaborative searching and tracking positioning method of unmanned aerial vehicle clusters to multiple moving targets comprises the following steps:
step S1, gridded environment information and an unmanned aerial vehicle cruising model are built according to the specific task;
A person skilled in the art can choose how to construct the gridded environment information and the unmanned aerial vehicle cruising model according to specific task requirements; one construction method is provided in this embodiment.
Further, the implementation process of step S1 is as follows:
S11, map information of the search task area is acquired according to the search task requirements, and the environment information of the task area is rasterized;
the search task space boundary range $\{O_x, O_y, O_z\}$ is made explicit, where $O_x, O_y$ are the boundary ranges of the XY-axis plane and $O_z$ is the elevation boundary range;
the XY-axis plane of the search task space is rasterized for modeling, the area of a single grid cell is set to $S^2$, and the number of grid cells is $Q = O_x \times O_y / S^2$;
the grid cells are classified according to actual obstacle information into an obstacle grid set $\Phi_{obs}$ and a trafficable grid set $\Phi_{ava}$;
S12, searching a cruising path in a passable area;
a search condition set $U_{cond} = \{C_1, C_2, C_3\}$ is set, where $C_1: \bigcup_i Uav_i = \Phi_{ava}$, in which $Uav_i$ is the set of grid cells covered by the monocular camera view of the $i$-th unmanned aerial vehicle; $C_1$ means that the total grid covered by the monocular camera views of all the unmanned aerial vehicles equals the trafficable grid set $\Phi_{ava}$;
$C_2: Uav_{i,t} \cap Uav_{j,t} = \emptyset$, where $i \neq j$ and $Uav_{i,t}$ is the set of grid cells covered by the field of view of the $i$-th unmanned aerial vehicle at time $t$; $C_2$ means that no two unmanned aerial vehicles search the same grid cell at the same time $t$;
$C_3: Z_{uav} = H_{cru}, 0 < H_{cru} < O_z$; $C_3$ means that each unmanned aerial vehicle keeps the height $H_{cru}$;
Based on the search condition set $U_{cond}$, the grid areas that each unmanned aerial vehicle needs to search and cover are planned within the trafficable grid set $\Phi_{ava}$, and the centers of the planned areas are connected to form the area search cruising path of each unmanned aerial vehicle;
S13, after the cruising path is determined, the unmanned aerial vehicle takes off and cruises along the cruising path at speed $v$ and height $H_{cru}$.
Step S2, performing target detection on a search area by using an airborne monocular camera based on a deep learning method;
A person skilled in the art can select a suitable target detection algorithm for the target detection in step S2 according to task requirements; target detection algorithms such as YOLOv4 or YOLOv7 achieve a good detection effect, with YOLOv7 performing most prominently, and this embodiment is described taking YOLOv7 as an example.
S21, training a YOLOv7 network model;
firstly, N task area images containing targets are collected through the unmanned aerial vehicle monocular camera, and the targets expected to be tracked in the images are marked by manual labeling to generate corresponding multi-category labels;
the images and the labeled target label data are taken as a training set to train the YOLOv7 deep target detection model, and training is completed when one of the following conditions is met:
condition 1: the accuracy of the YOLOv7 model no longer improves;
condition 2: the number of training iterations reaches the preset maximum $E_{max}$;
Secondly, M task area images containing targets are collected and input into the fully trained YOLOv7 model to generate new target labels; the detection results are manually checked and corrected to finally obtain the target label set $L_O$ of the region to be searched, where M is much greater than N;
the M images and the label set $L_O$ are input into the YOLOv7 network model for training, and training ends when one of the following conditions is met:
condition 1: the accuracy of the YOLOv7 model no longer improves;
condition 2: the number of training iterations reaches the preset maximum $E'_{max}$;
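The two-stage training of S21 is, in effect, a pseudo-labeling loop. The sketch below illustrates its control flow only; `DummyDetector`, `fit_one_epoch`, `predict`, and `manual_review` are hypothetical stand-ins, not the API of any particular YOLO implementation.

```python
import random

class DummyDetector:
    """Stand-in for a YOLO-style detector; only the interface matters here."""
    def fit_one_epoch(self, images, labels):
        return random.random()                      # pretend validation accuracy
    def predict(self, image):
        return [("target", 0.9, (10, 10, 50, 40))]  # (class, conf, box)

def manual_review(pred):
    return pred            # placeholder for the human check-and-correct step

def train_until_plateau(model, images, labels, e_max, patience=3):
    """Train until accuracy no longer improves (condition 1) or the
    iteration count reaches e_max (condition 2)."""
    best, stale = 0.0, 0
    for _ in range(e_max):
        acc = model.fit_one_epoch(images, labels)
        if acc > best:
            best, stale = acc, 0
        else:
            stale += 1
            if stale >= patience:                   # accuracy plateaued
                break
    return model

def two_stage_training(model, n_images, n_labels, m_images, e_max):
    # Stage 1: train on the N manually labeled images.
    model = train_until_plateau(model, n_images, n_labels, e_max)
    # Stage 2: pseudo-label the M >> N new images, correct them manually,
    # and retrain on the enlarged label set L_O.
    l_o = [manual_review(model.predict(img)) for img in m_images]
    return train_until_plateau(model, m_images, l_o, e_max)

model = two_stage_training(DummyDetector(), ["imgN"] * 5, ["lblN"] * 5,
                           ["imgM"] * 50, e_max=20)
```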
S22, acquiring an image of a region to be searched in real time;
After taking off, the unmanned aerial vehicles conduct a cruising search along the paths obtained in step S1, collecting RGB image data of the search area in real time through their onboard monocular cameras; the image data collected by the different unmanned aerial vehicles at the same time $t$ are integrated into an image set $P_t = \{p_1, p_2, \ldots, p_n\}$, where $p_i$ is the search area image acquired by the $i$-th unmanned aerial vehicle at time $t$;
S23, target detection is performed on the monocular camera images using the YOLOv7 model;
using the multi-classification YOLOv7 model pre-trained in S21, image detection is performed in real time on the image set $P_t$ obtained in step S22 to obtain the target information set $U_O$ of each image, containing the length and width $(H, L)$ of each target bounding box, the center point coordinates $(x_o, y_o)$, the confidence $CONF_O$, and the class number $cls$;
Further, step S2 also includes S24, optimizing and filtering the target detection results;
the filtering conditions are set as follows:
condition 1: the detection box confidence satisfies $CONF_O \geq T_C$, where $T_C$ is the lowest confidence threshold constant; a bounding box below it is discarded;
condition 2: the target bounding box area satisfies $H \cdot L \geq T_S$, where $T_S$ is the lowest area threshold constant; a bounding box below it is discarded;
condition 3: the overlapping area of two adjacent bounding boxes satisfies $S_{IOU} \leq T_{IOU}$, where $T_{IOU}$ is the maximum overlapping-area threshold constant; if it is exceeded, the bounding box with the lower confidence is discarded;
condition 4: if the target class number $cls$ is not in the target class set $Cls_{ex}$ to be tracked, i.e. $cls \notin Cls_{ex}$, the bounding box is discarded;
according to these filtering conditions, the target information set $U_O$ is filtered and updated to obtain the final target information set $U'_O$ to be tracked in the task area.
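A minimal sketch of the four filtering conditions of S24, assuming detections arrive as dictionaries with class, confidence, and corner-format boxes; all threshold values are illustrative.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = ((a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter)
    return inter / union if union > 0 else 0.0

def filter_detections(dets, T_C=0.5, T_S=100.0, T_IOU=0.5,
                      cls_ex={"car", "truck"}):
    """Apply the four filter conditions of S24 to a list of detections.
    Each detection is a dict with keys: cls, conf, box=(x1, y1, x2, y2)."""
    # Conditions 1, 2, 4: confidence, minimum area, class-of-interest.
    kept = [d for d in dets
            if d["conf"] >= T_C
            and (d["box"][2]-d["box"][0]) * (d["box"][3]-d["box"][1]) >= T_S
            and d["cls"] in cls_ex]
    # Condition 3: among heavily overlapping boxes keep the more confident.
    kept.sort(key=lambda d: d["conf"], reverse=True)
    final = []
    for d in kept:
        if all(iou(d["box"], f["box"]) <= T_IOU for f in final):
            final.append(d)
    return final

dets = [{"cls": "car",  "conf": 0.9, "box": (0, 0, 40, 40)},
        {"cls": "car",  "conf": 0.6, "box": (2, 2, 42, 42)},   # overlaps above
        {"cls": "tree", "conf": 0.9, "box": (50, 50, 90, 90)}] # not tracked
print(filter_detections(dets))   # -> only the first car survives
```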
Step S3, after the unmanned aerial vehicle detects the target, the unmanned aerial vehicle autonomously selects different tracking strategies to intelligently track the target, as shown in fig. 2;
s31, when the unmanned aerial vehicle detects the target, judging whether the newly found target is tracked by other unmanned aerial vehicles, and eliminating the tracked target
The unmanned aerial vehicle flies along its cruising path, performing target detection on the real-time monocular camera images with YOLOv7 during the cruise; meanwhile, the unmanned aerial vehicle acquires and stores in real time the target positions currently tracked and published by the other unmanned aerial vehicles, performs multi-frame clustering on the acquired target position information published by the others, and analyzes it to obtain the tracking and discovery status of the current targets;
further, the step S31 is specifically implemented as follows:
S311, each unmanned aerial vehicle acquires the target positions currently tracked and published by the other unmanned aerial vehicles and stores the acquired consecutive positions by category in arrays $C_k$, where $k$ is the category label and each $C_k$ has size L; if the interval between acquiring two successive pieces of position information exceeds s seconds, the target of the current category is judged lost and the $C_k$ array is reinitialized to zero;
S312, if the last value of the $C_k$ array at the current time is non-zero and the data in $C_k$ are being updated continuously, k-means clustering is performed on the $C_k$ array of the category; the purpose of the clustering is to determine how many different targets of the category are currently being tracked. The specific method is as follows: the initial number of cluster centroids k is determined by the elbow rule; clustering is performed for the different values k = 2, 3, 4, ..., and the clustering distance statistic $K_{dist}$ is calculated as follows:
for the samples $X_i$ in the $C_k$ array with clustering result $C^{(i)}$, clustering yields the distance statistic $W(in)$ between samples of the same target and the distance statistic $W(out)$ between samples of different targets;
the sample statistic of the whole clustering is:
$K_{dist} = W(in) + W(out)$;
the most appropriate value of k is selected by the elbow rule, thereby determining the number of targets being tracked in the current category; meanwhile, the cluster center position of each cluster is taken as the coordinates of each target, giving the tracking status of the other unmanned aerial vehicles on the different targets of the different categories;
S313, for the nth unmanned aerial vehicle with the target detection set $U'_{O,n}$ obtained in S24: when a target is detected, if $C_k$ is not empty, the target position obtained by remote measurement is compared with the positions of the same-category targets currently tracked by the other unmanned aerial vehicles; if the Euclidean distance $E < T_l$, the target is one already being tracked by another unmanned aerial vehicle, and it is deleted from the set $U'_{O,n}$.
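The clustering and de-duplication of S312-S313 can be illustrated with a small hand-rolled k-means and an elbow-rule selection of k; the initialization, the elbow criterion, and the thresholds are simplified assumptions relative to the patent's $W(in)/W(out)$ statistic.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Tiny Lloyd-style k-means; returns (centers, total point-to-center
    distance), which stands in for the clustering distance statistic."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2),
                        axis=1)
        centers = np.array([X[lab == j].mean(axis=0) if np.any(lab == j)
                            else centers[j] for j in range(k)])
    dist = np.linalg.norm(X - centers[lab], axis=1).sum()
    return centers, dist

def count_targets(positions, k_max=4):
    """Elbow rule: pick the k whose increase gives the biggest distance drop."""
    X = np.asarray(positions, dtype=float)
    dists = [kmeans(X, k)[1] for k in range(1, min(k_max, len(X)) + 1)]
    drops = [dists[i - 1] - dists[i] for i in range(1, len(dists))]
    k = (max(range(len(drops)), key=drops.__getitem__) + 2) if drops else 1
    return k, kmeans(X, k)[0]

def is_already_tracked(new_pos, tracked_centers, T_l=5.0):
    """S313: drop a newly detected target whose Euclidean distance to a
    target already tracked by another drone is below T_l."""
    return any(np.linalg.norm(np.asarray(new_pos) - c) < T_l
               for c in tracked_centers)

# Two real targets reported by other drones over several frames:
shared = [(0, 0), (0.4, 0.2), (10, 10), (10.3, 9.8), (0.1, 0.3)]
k, centers = count_targets(shared)
print(k, "targets;", is_already_tracked((0.2, 0.1), centers))  # 2 targets; True
```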
S32, the unmanned aerial vehicle traverses $U'_{O,n}$, selects the nearest target according to the distance between each detected target and the unmanned aerial vehicle, and decides whether to enter the long-distance or short-distance tracking program:
the size of the YOLOv7 bounding box is taken as the basis for judging whether a target is far or near, where $h_i$ and $l_i$ are respectively the height and width of the $i$-th target bounding box currently detected in $U'_{O,n}$;
if $h_i < T_h$ or $l_i < T_l$ or $y_{min,i} < T_y$, step S33 is entered; otherwise step S34 is entered;
where the values of $T_h$, $T_l$, $T_y$ are set according to the imaging resolution of the monocular camera used, and $y_{min,i}$ is the y value of the upper-left pixel of the $i$-th bounding box in $U'_{O,n}$;
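A small sketch of the far/near decision of S32. Treating the largest bounding box as the nearest target is an assumption made here for illustration, and the thresholds $T_h$, $T_l$, $T_y$ depend on camera resolution.

```python
def choose_tracking_mode(boxes, T_h=60, T_l=60, T_y=80):
    """S32: pick the nearest detected target and decide between remote (S33)
    and close-range (S34) tracking from its bounding-box size. Each box is
    (x_min, y_min, w, h) in pixels; a small box, or one whose top edge sits
    high in the image, indicates a far-away target."""
    # "Nearest" proxy: the largest box (a closer target fills more pixels).
    box = max(boxes, key=lambda b: b[2] * b[3])
    x_min, y_min, w, h = box
    mode = "remote" if (h < T_h or w < T_l or y_min < T_y) else "close"
    return box, mode

print(choose_tracking_mode([(300, 40, 30, 25), (100, 200, 120, 90)]))
# -> ((100, 200, 120, 90), 'close')
```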
S33, when the unmanned aerial vehicle judges that the target is far away from it, the remote tracking program is entered, as shown in fig. 3;
during remote tracking, the gimbal angle is kept at $T_{g1}$, and the unmanned aerial vehicle adjusts its forward speed and steering speed according to the position of the tracked target bounding box in the monocular camera image;
first, the relative distance $(u_x, u_y)$ of the target bounding box from the center point of the monocular camera image is calculated:
$u_x = x_o - u_{cx}$
$u_y = y_o - u_{cy}$
where $(u_{cx}, u_{cy})$ is the center point of the monocular camera image; the forward speed $v_x$ of the unmanned aerial vehicle is set piecewise according to the pixel distance of $y_o$ from the image center point, its variation being shown in fig. 9:
$v_x = k_{fi} \times y_o + b_{fi}$
where $k_{fi}$, $b_{fi}$ are the speed adjustment parameters of the different segments during remote tracking, and $i$ denotes the segment of pixel distance from the image center point in which $y_o$ currently lies;
the rotational speed $w_z$ of the unmanned aerial vehicle is set from the horizontal offset $u_x$ with the adjustment parameter $k_f$, where, from the viewing angle of the unmanned aerial vehicle, a left turn is positive and a right turn is negative;
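The speed laws of S33 can be sketched as below. The piecewise gains are illustrative, and because the patent text does not reproduce the exact formula for $w_z$, the sketch assumes a simple proportional law $w_z = -k_f \cdot u_x$ consistent with the stated sign convention.

```python
def remote_tracking_command(x_o, y_o, img_w=640, img_h=480,
                            segments=((120, 0.010, 1.0),   # (bound, k_fi, b_fi)
                                      (240, 0.006, 2.2),
                                      (10**9, 0.002, 3.2)),
                            k_f=0.004):
    """Compute (v_x, w_z) for remote tracking (S33) from the bounding-box
    center (x_o, y_o). The segment bounds/gains and the proportional yaw law
    are assumptions: the patent states only that v_x is piecewise in y_o and
    that w_z is governed by k_f with a left turn counted positive."""
    u_cx, u_cy = img_w / 2, img_h / 2
    u_x, u_y = x_o - u_cx, y_o - u_cy        # offset from image center
    dist = abs(u_y)                          # pixel distance along y
    for bound, k_fi, b_fi in segments:       # pick the segment containing y_o
        if dist < bound:
            v_x = k_fi * y_o + b_fi          # forward speed, patent's form
            break
    w_z = -k_f * u_x   # target left of center (u_x < 0) -> turn left (w_z > 0)
    return v_x, w_z

print(remote_tracking_command(200, 100))  # target up-left: advance, turn left
```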
S34, when the unmanned aerial vehicle judges that the target is close to it, close-range tracking is entered, as shown in fig. 4;
during close-range tracking, the gimbal angle is adjusted to $T_{g2}$, where $T_{g2}$ is greater than $T_{g1}$, and the unmanned aerial vehicle adjusts its forward speed and steering speed according to the position of the tracked target bounding box in the monocular camera image;
the relative distance $(u_x, u_y)$ of the target bounding box from the center point of the monocular camera image is calculated:
$u_x = x_o - u_{cx}$
$u_y = y_o - u_{cy}$
where $(u_{cx}, u_{cy})$ is the center point of the monocular camera image; the forward speed $v_x$ of the unmanned aerial vehicle is set piecewise according to the ratio $u_{tc}$ of $y_o$ to $u_{cy}$, where forward is positive and backward is negative:
$v_x = k_{ni} \times y_o + b_{ni}$
where $k_{ni}$, $b_{ni}$ are the speed adjustment parameters of the different segments during close-range tracking, and $i$ denotes the segment of pixel distance from the image center point in which $y_o$ currently lies;
the rotational speed $w_z$ of the unmanned aerial vehicle is set from the horizontal offset $u_x$ with the close-range adjustment parameter $k_n$, where, from the viewing angle of the unmanned aerial vehicle, a left turn is positive and a right turn is negative;
S35, during tracking, after the target is temporarily lost due to occlusion, the unmanned aerial vehicle predicts the target motion trajectory based on the motion state before the target was lost and tracks along it;
There are various ways of predicting and tracking, which a person skilled in the art can choose according to the specific situation; the present invention provides one implementation as follows:
further, the specific implementation of step S35 includes the following steps:
S351, a queue q is created, which stores in real time the position of the target currently tracked by the unmanned aerial vehicle; the last datum in queue q is the latest motion position of the currently tracked target, and the data in queue q are continuous in time;
S352, when the currently tracked target is lost because it is occluded from the unmanned aerial vehicle, the data in queue q are automatically acquired, the advancing direction $\theta$ of the target before it disappeared is calculated, and the motion trajectory of the target is fitted:
$\theta = \arctan\!\left(\frac{y_{act,n} - y_{act,1}}{x_{act,n} - x_{act,1}}\right), \qquad v_{act} = \frac{\sqrt{(x_{act,n} - x_{act,1})^2 + (y_{act,n} - y_{act,1})^2}}{t_q}$
where $y_{act,n}$ and $x_{act,n}$ are the y and x values of the last target position in queue q, $y_{act,1}$ and $x_{act,1}$ are the y and x values of the first target position in queue q, $v_{act}$ is the target movement speed, and $t_q$ is the time interval from the acquisition of the first position to the last position in the whole queue q;
S353, the movement trajectory of the occluded target is predicted from the calculated movement direction and speed of the target:
$x_{act,pre} = x_{act,n} + v_{act}\cos(\theta) \times (t_{now} - t_{lose})$
$y_{act,pre} = y_{act,n} + v_{act}\sin(\theta) \times (t_{now} - t_{lose})$
where $x_{act,pre}$, $y_{act,pre}$ are the x and y coordinates of the currently predicted target position, $t_{now}$ is the current time, and $t_{lose}$ is the moment the target was lost; the predicted position is used as the position of the tracked target while it is temporarily lost due to occlusion;
S354, the unmanned aerial vehicle adjusts its tracking speed according to the predicted movement direction and speed of the target and tracks predictively; when the target is no longer occluded, the unmanned aerial vehicle enters S34:
$v_{wx} = \cos(\theta) \times v_{act}$
$v_{wy} = \sin(\theta) \times v_{act}$
where $v_{wx}$, $v_{wy}$ are respectively the speeds of the unmanned aerial vehicle along the x and y axes of the world coordinate system at the current moment.
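A minimal sketch of the occlusion handling of S35, following the linear constant-velocity model of S352-S354; the sample data are illustrative.

```python
import math
from collections import deque

def predict_after_loss(track, t_lose, t_now):
    """S35: fit direction and speed from the queue of recent positions and
    extrapolate the occluded target. `track` holds (t, x, y) samples that
    are continuous in time; the linear-motion model follows the patent."""
    (t1, x1, y1), (tn, xn, yn) = track[0], track[-1]
    theta = math.atan2(yn - y1, xn - x1)              # advancing direction
    t_q = tn - t1                                     # time spanned by queue
    v_act = math.hypot(xn - x1, yn - y1) / t_q        # target speed
    dt = t_now - t_lose
    x_pre = xn + v_act * math.cos(theta) * dt         # predicted position
    y_pre = yn + v_act * math.sin(theta) * dt
    # Drone velocity command while the target is hidden (S354):
    v_wx, v_wy = v_act * math.cos(theta), v_act * math.sin(theta)
    return (x_pre, y_pre), (v_wx, v_wy)

q = deque([(0.0, 0.0, 0.0), (1.0, 1.0, 0.5), (2.0, 2.0, 1.0)], maxlen=50)
print(predict_after_loss(list(q), t_lose=2.0, t_now=3.5))
# target moving at ~1.12 units/s along atan2(1, 2); predicted 1.5 s ahead
```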
Step S4, while tracking, the position of the target is measured and calculated, and the target position is shared in real time
S41, eliminating erroneous target positions;
the current unmanned aerial vehicle records the image coordinates of the detection box of each category of target over the most recent n frames; if the number of frames $n_d$ in which a certain category of target is detected is less than n, the target is not processed, and the sporadic detection result is discarded as a possible false detection;
S42, automatically correcting target class errors in target detection by using the information of preceding and following frames;
during close-range tracking, the class of the tracked target is locked as $cls_{tr}$; if, within a time range of $t_s$ seconds, the target of class $C_{tr}$ disappears from detection while another target class $C_e$ is detected, the coordinates $(x_t, y_t)$ of class $C_{tr}$ at the last moment are compared with the coordinates $(x_e, y_e)$ of the other class currently detected; if the distance $d_{err} < d_{tr}$, the target class $C_e$ is corrected to $C_{tr}$;
where $d_{err} = \sqrt{(x_t - x_e)^2 + (y_t - y_e)^2}$ and $d_{tr}$ is the threshold below which two coordinates in the image coordinate system are judged to be close;
S43, smoothing the target detection results by using multi-frame sequential information;
when the target is detected in consecutive frames, multi-frame smoothing is used to exclude detection results with larger errors; the target detection positions of the multiple frames are fused and the average of the target coordinates is calculated;
after false detections are eliminated, the average coordinates $(x_m, y_m)$ are:
$x_m = \frac{1}{n}\sum_{i=1}^{n} x_{o,i}, \qquad y_m = \frac{1}{n}\sum_{i=1}^{n} y_{o,i}$
where $x_{o,i}$, $y_{o,i}$ are respectively the x-axis and y-axis coordinates of the target detection box in the $i$-th frame;
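S41-S43 can be combined into one small smoothing component, sketched below under assumed thresholds; the class relabeling follows S42 and the averaging follows S43.

```python
import math
from collections import defaultdict, deque

class DetectionSmoother:
    """Minimal sketch of S41-S43: per-class history of box centers over the
    last n frames, rejection of sporadic detections, class-label repair by
    proximity, and multi-frame averaging. Thresholds are illustrative."""
    def __init__(self, n=10, n_min=4, d_tr=30.0):
        self.hist = defaultdict(lambda: deque(maxlen=n))  # cls -> centers
        self.n_min, self.d_tr = n_min, d_tr

    def update(self, cls, center, locked_cls=None):
        # S42: if we are locked on `locked_cls` and a different class shows
        # up near its last known position, relabel the detection.
        if locked_cls and cls != locked_cls and self.hist[locked_cls]:
            lx, ly = self.hist[locked_cls][-1]
            if math.hypot(center[0] - lx, center[1] - ly) < self.d_tr:
                cls = locked_cls
        self.hist[cls].append(center)

    def smoothed(self, cls):
        """S41 + S43: None while detections are still sporadic (n_d < n),
        otherwise the mean of the stored centers."""
        pts = self.hist[cls]
        if len(pts) < self.n_min:         # possible false detection
            return None
        xs, ys = zip(*pts)
        return sum(xs) / len(xs), sum(ys) / len(ys)

sm = DetectionSmoother()
for i in range(5):
    sm.update("car", (100 + i, 50))
sm.update("truck", (104.5, 50.2), locked_cls="car")  # relabeled to "car"
print(sm.smoothed("car"))
```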
s44, calculating the position of the target in the world coordinate system and publishing the position to the global network, as shown in FIG. 5;
n pairs of corresponding points, with n > 4, are found in the camera image coordinate system and the unmanned aerial vehicle body coordinate system; the corresponding coordinates are substituted into the following perspective transformation formula, and the homography matrix Trans of the perspective transformation is calculated:
$[x_u \; y_u \; 1] = [x_m \; y_m \; 1] \cdot Trans$
where $[x_u \; y_u \; 1]$ are the coordinates of a point in the unmanned aerial vehicle body coordinate system and $[x_m \; y_m \; 1]$ are the coordinates of the corresponding point in the camera image coordinate system;
through the perspective transformation homography matrix Trans, the body-frame coordinates corresponding to any point in the image coordinate system are obtained, giving the position of the target relative to the unmanned aerial vehicle;
then, through a transformation matrix and the pose of the unmanned aerial vehicle, the relative position of the target is converted into an absolute position in the world coordinate system:
$[x_w \; y_w \; 1 \; 1] = [x_u \; y_u \; 1 \; 1] \cdot Tr_{w\_uav}$
where $Tr_{w\_uav}$ is constructed from the yaw angle of the unmanned aerial vehicle and its coordinates $x_{uav}$ and $y_{uav}$ in the world coordinate system;
the class $cls$ of the tracked target and its coordinates $x_w, y_w$ in the world coordinate system are published to the global network;
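A sketch of S44, assuming four calibration correspondences between image pixels and body-frame ground coordinates are available (e.g. from a one-off calibration pattern on the ground); `cv2.findHomography` recovers Trans, and a planar rotation by the yaw angle plus a translation stands in for $Tr_{w\_uav}$.

```python
import numpy as np
import cv2

# Assumed calibration data: pixel coordinates and matching body-frame
# ground coordinates of four reference points (illustrative values).
img_pts  = np.float32([[100, 400], [540, 400], [620, 120], [20, 120]])
body_pts = np.float32([[2, -1],    [2,  1],    [8,  2],    [8, -2]])

# Homography Trans mapping image points to body-frame ground points.
trans, _ = cv2.findHomography(img_pts, body_pts)

def target_world_position(x_m, y_m, x_uav, y_uav, psi):
    """Map a smoothed image coordinate (x_m, y_m) to the body frame via the
    homography, then rotate by the drone yaw psi and translate by the drone
    world position — the role played by Tr_w_uav in the patent."""
    p = trans @ np.array([x_m, y_m, 1.0])
    x_u, y_u = p[0] / p[2], p[1] / p[2]          # position relative to drone
    x_w = x_uav + x_u * np.cos(psi) - y_u * np.sin(psi)
    y_w = y_uav + x_u * np.sin(psi) + y_u * np.cos(psi)
    return x_w, y_w

print(target_world_position(320, 260, x_uav=50.0, y_uav=20.0, psi=np.pi / 4))
```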
s5, planning a path returning to the cruising point by using a dynamic path planning algorithm;
after the unmanned aerial vehicle finishes tracking a target, a collision-free path is planned with dynamic path planning rules, taking the current position as the starting point and the cruising path point closest to the unmanned aerial vehicle as the end point;
After reaching the end point, returning to the step S2, and continuing searching and detecting until the searching, tracking and positioning tasks of all the moving targets are completed;
Further, in the embodiment of the present invention, the dynamic path planning algorithm in S5 adopts the D* algorithm to plan the path returning to the cruising point; a person skilled in the art can adopt other path planning algorithms as required. The D* algorithm plans the path returning to the cruising point as follows:
S51, the task area is rasterized and divided into passable areas, non-passable areas, and difficult-to-pass areas;
the task area is rasterized and, according to the difficulty of passage, divided into three main parts: passable areas, non-passable areas, and difficult-to-pass areas; a passable area is an area with an open view where the unmanned aerial vehicle passes normally, a non-passable area is an area containing static obstacles, and a difficult-to-pass area is an area with a restricted view where the unmanned aerial vehicle collides easily;
S52, a loss value is set for each grid cell, and the optimal path for returning to the cruising point is planned;
after the unmanned aerial vehicle finishes tracking one target, the D* algorithm is used to calculate the path back to a cruising point; the D* algorithm comprises the heuristic function f(n) = g(n) + h(n), where g(n) is the currently known shortest path value from the starting point to node n and h(n) is the loss value from node n to the goal;
the loss values of passable, difficult-to-pass, and non-passable areas among the adjacent nodes are set to $N_{QYS}$, $N_{QYS} + n$, and Inf respectively, and among the diagonally adjacent nodes to $1.4 \times N_{QYS}$, $1.4 \times (N_{QYS} + n)$, and Inf; the purpose of these loss values is to keep the return route of the unmanned aerial vehicle within the passable areas as much as possible when planning the route back to the cruising point.
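The return-path planning of S51-S52 can be illustrated with the grid costs above. The sketch below uses A* as a simpler stand-in for the D* planner named in the patent (both expand f(n) = g(n) + h(n)), with $N_{QYS} = 1$ and $n = 4$ as illustrative values.

```python
import heapq

INF = float("inf")
COST = {"pass": 1.0, "hard": 1.0 + 4.0, "block": INF}  # N_QYS=1, n=4 (assumed)

def plan_return_path(grid, start, goal):
    """Grid search with the cost classes of S52; diagonal moves cost 1.4x.
    A* is used here instead of D* for brevity."""
    h = lambda p: max(abs(p[0] - goal[0]), abs(p[1] - goal[1]))  # admissible
    openq, g, parent = [(h(start), start)], {start: 0.0}, {start: None}
    while openq:
        _, cur = heapq.heappop(openq)
        if cur == goal:                      # reconstruct the path
            path = []
            while cur:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == dy == 0:
                    continue
                nxt = (cur[0] + dx, cur[1] + dy)
                if nxt not in grid:
                    continue
                step = COST[grid[nxt]] * (1.4 if dx and dy else 1.0)
                ng = g[cur] + step
                if ng < g.get(nxt, INF):     # blocked cells (Inf) never relax
                    g[nxt], parent[nxt] = ng, cur
                    heapq.heappush(openq, (ng + h(nxt), nxt))
    return None

# 5x5 area with a blocked column except one hard-to-pass gap at the top:
grid = {(x, y): "pass" for x in range(5) for y in range(5)}
for y in range(5):
    grid[(2, y)] = "block"
grid[(2, 4)] = "hard"
print(plan_return_path(grid, (0, 0), (4, 0)))   # detours through the gap
```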
Step S6, throughout the search task, the unmanned aerial vehicles in steps S1-S5 need to avoid static obstacles and the other unmanned aerial vehicles; a person skilled in the art can adopt a suitable obstacle avoidance method as required, and in this embodiment step S6 is realized according to the following steps:
S61, a static obstacle avoidance algorithm in which the unmanned aerial vehicle changes speed according to the positions of surrounding static obstacles, as shown in fig. 8;
according to the positions of static obstacles (objects that threaten the flight of the unmanned aerial vehicle, such as buildings, street lamps, and hills), the static obstacle point closest to the unmanned aerial vehicle is obtained by traversing the area within a certain range around it;
when the unmanned aerial vehicle is in close-range tracking and flies at speed $v_0 < v_{avo\_m}$, the fine-adjustment obstacle avoidance program is started; the points in the range $\Omega_s$ around the unmanned aerial vehicle are traversed to find the obstacle point closest to the unmanned aerial vehicle $U_0$, where $\Omega_s$ is calculated as:
$\Omega_s = \{(x, y) \mid x_0 - D_{avo\_s} < x < x_0 + D_{avo\_s},\; y_0 - D_{avo\_s} < y < y_0 + D_{avo\_s}\}$
where $D_{avo\_s}$ is the response distance of the unmanned aerial vehicle to obstacles during close-range tracking obstacle avoidance;
if the points in the range $\Omega_s$ contain a point $M_{obs}$ marked as an obstacle, the low-speed static obstacle avoidance program is started; in the unmanned aerial vehicle coordinate system, with the forward direction of the unmanned aerial vehicle as the x axis and its leftward translation direction as the y axis, the distance of the obstacle point $M_{obs}$ from the unmanned aerial vehicle is $d_x$ along the x axis and $d_y$ along the y axis, and its Euclidean distance from the unmanned aerial vehicle is $d_e = \sqrt{d_x^2 + d_y^2}$;
after obstacle avoidance, the flying speeds of the unmanned aerial vehicle in the forward and leftward directions are reduced as functions of $d_x$ and $d_y$, where $d_{avo\_sx}$ and $d_{avo\_sy}$ are respectively the speed adjustment parameters of the unmanned aerial vehicle in the forward and leftward directions in the low-speed static obstacle avoidance program;
when the unmanned aerial vehicle is in remote tracking and flies at speed $v_0 \geq v_{avo\_m}$, the emergency obstacle avoidance program is started; the points in the range $\Omega_h$ around the unmanned aerial vehicle are traversed to find the obstacle point closest to the unmanned aerial vehicle $U_0$, where $\Omega_h$ is calculated as:
$\Omega_h = \{(x, y) \mid x_0 - D_{avo\_h} < x < x_0 + D_{avo\_h},\; y_0 - D_{avo\_h} < y < y_0 + D_{avo\_h}\}$
where $D_{avo\_h}$ is the response distance of the unmanned aerial vehicle to obstacles during remote tracking obstacle avoidance;
if the points in the range $\Omega_h$ contain a point $M_{obs}$ marked as an obstacle, the high-speed static obstacle avoidance program is started; in the unmanned aerial vehicle coordinate system, with the forward direction as the x axis and the leftward translation direction as the y axis, the distance of the obstacle point $M_{obs}$ from the unmanned aerial vehicle is $d_x$ along the x axis and $d_y$ along the y axis, and its Euclidean distance from the unmanned aerial vehicle is $d_e = \sqrt{d_x^2 + d_y^2}$;
after obstacle avoidance, the flying speed of the unmanned aerial vehicle in the forward direction is reduced accordingly, and the flying speed in the leftward direction is:
$v_y = \mathrm{abs}(v_0) \cdot k_{avo\_hx} \cdot (D_{avo\_h} - d_e) \cdot (D_{avo\_h} - |d_y|)$
where $d_{avo\_hx}$ and $k_{avo\_hx}$ are respectively the adjustment parameters of the unmanned aerial vehicle speed along the x and y axes in the high-speed static obstacle avoidance program;
for the different speed regimes of the unmanned aerial vehicle, $v_{avo\_m}$ is the boundary value distinguishing high-speed from low-speed flight; the obstacle avoidance parameters differ between the speed regimes, and in general:
$d_{avo\_hx} > d_{avo\_sx}, \quad d_{avo\_sy} > 1, \quad k_{avo\_hx} \ll 1, \quad D_{avo\_h} > D_{avo\_s}$
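A sketch of the two-branch static avoidance of S61. The lateral law of the high-speed branch follows the published formula for $v_y$; the forward-speed ramp is an illustrative stand-in, since the patent's exact forward-speed formulas are not reproduced in the text.

```python
import math

def static_avoidance(v0, obstacles, v_avo_m=2.0, D_avo_s=3.0, D_avo_h=8.0,
                     k_avo_hx=0.02):
    """Sketch of S61. `obstacles` are (d_x, d_y) offsets in the body frame
    (x forward, y left). Thresholds and gains are illustrative."""
    D = D_avo_s if abs(v0) < v_avo_m else D_avo_h      # response window
    near = [(dx, dy) for dx, dy in obstacles
            if abs(dx) < D and abs(dy) < D]            # inside Omega
    if not near:
        return v0, 0.0                                 # no avoidance needed
    dx, dy = min(near, key=lambda o: math.hypot(*o))   # closest obstacle
    d_e = math.hypot(dx, dy)                           # Euclidean distance
    # Slow down as the obstacle gets closer (illustrative ramp):
    v_x = v0 * min(1.0, d_e / D)
    # Lateral escape speed, patent's high-speed form:
    v_y = abs(v0) * k_avo_hx * (D - d_e) * (D - abs(dy))
    if dy > 0:                 # obstacle on the left -> escape to the right
        v_y = -v_y
    return v_x, v_y

print(static_avoidance(3.0, [(5.0, 1.0), (20.0, 0.0)]))  # fast branch engages
```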
S62, an inter-vehicle dynamic obstacle avoidance algorithm in which the unmanned aerial vehicle changes height according to its positional relation with the other unmanned aerial vehicles, as shown in fig. 7;
the unmanned aerial vehicle $Uav_0$ acquires in real time the positions of the other N-1 unmanned aerial vehicles $Uav_1, Uav_2, \ldots, Uav_{N-1}$ and calculates its distances $d_1, d_2, \ldots, d_{N-1}$ to them; when $\min\{d_1, d_2, \ldots, d_{N-1}\} < D_{avo\_d}$, the inter-vehicle obstacle avoidance program is started, where $D_{avo\_d}$ is the response distance for dynamic obstacle avoidance between unmanned aerial vehicles;
normally, the initial cruising height of the unmanned aerial vehicle is $H_{cru}$; after the inter-vehicle obstacle avoidance program is started, the desired cruising height of the unmanned aerial vehicle is modified to $H_{avo}$, calculated as follows:
$H_{avo} = H_{cru} + N_{higher} \cdot H_{bias} - N_{lower} \cdot H_{bias}$
where $N_{higher}$ is the number of unmanned aerial vehicles whose distance from $Uav_0$ is less than $D_{avo\_d}$ and whose height satisfies $z_i > z_0$ (i = 1, 2, ..., N-1) (at equal heights, the number of such unmanned aerial vehicles whose index is larger than that of $Uav_0$), $N_{lower}$ is the number of unmanned aerial vehicles whose distance from $Uav_0$ is less than $D_{avo\_d}$ and whose height satisfies $z_i < z_0$ (i = 1, 2, ..., N-1) (at equal heights, the number of such unmanned aerial vehicles whose index is smaller than that of $Uav_0$), and $H_{bias}$ is the height adjustment parameter for inter-vehicle obstacle avoidance, related to the maximum height of the unmanned aerial vehicle body and generally more than half of it;
after the desired cruising height of the unmanned aerial vehicle is modified, the unmanned aerial vehicle changes to the desired height with vertical speed $v_h$, calculated as:
$v_h = (H_{avo} - z_0) \cdot v_{bias}$
where $v_{bias}$ is the adjustment parameter of the unmanned aerial vehicle's vertical speed during dynamic obstacle avoidance; the larger the parameter, the faster the height adjustment;
when $\min\{d_1, d_2, \ldots, d_{N-1}\} \geq D_{avo\_d}$, the unmanned aerial vehicle exits the inter-vehicle dynamic obstacle avoidance program, its desired cruising height is changed back to $H_{cru}$, and it returns to the original desired height with $v_h = (H_{cru} - z_0) \cdot v_{bias}$.
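The altitude-separation rule of S62 is small enough to sketch directly; drone indices and positions below are illustrative, and the tie-breaking by index at equal heights follows the patent.

```python
def altitude_avoidance(own_pos, own_id, others, H_cru=10.0, D_avo_d=5.0,
                       H_bias=1.5, v_bias=0.8):
    """Sketch of S62. `own_pos` is (x, y, z); `others` maps drone id ->
    (x, y, z). Neighbours above raise the desired altitude and neighbours
    below lower it, so nearby drones spread apart vertically; ties at equal
    height are broken by drone index, as in the patent."""
    x0, y0, z0 = own_pos
    n_higher = n_lower = 0
    for uid, (x, y, z) in others.items():
        d = ((x - x0)**2 + (y - y0)**2 + (z - z0)**2) ** 0.5
        if d >= D_avo_d:
            continue                       # outside dynamic-avoidance range
        if z > z0 or (z == z0 and uid > own_id):
            n_higher += 1
        else:
            n_lower += 1
    if n_higher == n_lower == 0:           # exit: return to cruise height
        H_des = H_cru
    else:
        H_des = H_cru + n_higher * H_bias - n_lower * H_bias
    v_h = (H_des - z0) * v_bias            # vertical speed toward H_des
    return H_des, v_h

print(altitude_avoidance((0, 0, 10.0), 0, {1: (1.0, 1.0, 10.0),
                                           2: (2.0, 0.5, 11.0)}))
```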

Claims (6)

1. A collaborative searching and tracking positioning method for multiple moving targets by an unmanned aerial vehicle cluster, characterized by comprising the following steps:
Step S1, gridded environment information and an unmanned aerial vehicle cruising model are constructed according to the specific task requirements; S1 is realized by the following steps:
S11, acquiring map information of the search task area according to the search task requirements, and rasterizing the environment information of the task area (a rasterization sketch follows step S13 below);
The search task space boundary range {O_x, O_y, O_z} is specified, where O_x, O_y are the boundary ranges in the XY plane and O_z is the altitude boundary range;
Rasterized modeling is carried out on the XY plane of the search task space, with the area of a single grid cell set to S²; the number of grid cells is Q = O_x × O_y / S²;
According to the actual obstacle information, the grid cells are classified into an obstacle grid set Φ_obs and a passable grid set Φ_ava;
S12, planning a cruising search path within the passable area;
A search condition set U_cond = {C_1, C_2, C_3} is defined;
C_1: ∪_n Uav_n = Φ_ava, where Uav_n denotes the grid cells covered by the monocular camera field of view of the n-th unmanned aerial vehicle; C_1 means that the union of the grid cells covered by all the unmanned aerial vehicles equals the passable grid set Φ_ava;
C_2: Uav_{i,t} ∩ Uav_{j,t} = ∅ for i ≠ j, where Uav_{i,t} denotes the grid cells covered by the field of view of the i-th unmanned aerial vehicle at time t; C_2 means that no two unmanned aerial vehicles search the same grid cells at the same time t;
C_3: Z_uav = H_cru, 0 < H_cru < O_z; C_3 means that each unmanned aerial vehicle keeps the altitude H_cru;
Based on the search condition set U_cond, the grid areas to be searched and covered by each unmanned aerial vehicle are planned within the passable grid set Φ_ava, and the centers of the planned areas are connected to form each unmanned aerial vehicle's area-search cruising path;
S13, after the cruising path is determined, the unmanned aerial vehicle takes off and cruises along the path at speed v and altitude H_cru;
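As a minimal illustration of the rasterization in S11 and a simple coverage sweep for S12 (the claim does not prescribe a partitioning algorithm, so the boustrophedon sweep and all names here are assumptions; obstacle coordinates are assumed to lie inside the task area):

```python
import numpy as np

def rasterize(Ox, Oy, S, obstacle_points):
    """Split the XY task area into S x S cells and mark obstacle cells.

    Returns a boolean grid: True = passable (phi_ava), False = obstacle (phi_obs).
    """
    grid = np.ones((int(Ox // S), int(Oy // S)), dtype=bool)
    for x, y in obstacle_points:
        grid[int(x // S), int(y // S)] = False
    return grid

def sweep_path(grid, S):
    """Boustrophedon sweep over passable cells -> list of cell-center waypoints."""
    path = []
    for i in range(grid.shape[0]):
        cols = range(grid.shape[1]) if i % 2 == 0 else reversed(range(grid.shape[1]))
        for j in cols:
            if grid[i, j]:
                path.append(((i + 0.5) * S, (j + 0.5) * S))
    return path
```

In the multi-vehicle case the passable grid would first be partitioned into per-vehicle regions so that condition C_2 (no simultaneous coverage of the same cells) holds.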
Step S2, target detection is performed on the search area with the airborne monocular camera, based on a deep learning method;
S21, training the target detection network model;
First, N task-area images containing targets are collected with the unmanned aerial vehicles' monocular cameras, and the targets to be tracked in the images are annotated manually to generate the corresponding multi-class labels;
The images and the annotated label data are taken as the training set for the deep target detection model, and training is complete when one of the following conditions is met:
Condition 1: the accuracy of the detection model no longer improves;
Condition 2: the number of training iterations reaches the preset maximum E_max;
Second, M task-area images containing targets are collected and fed to the trained detection model to generate new target labels; the detection results are checked and corrected manually, finally yielding the target label set L_O of the area to be searched, where M is much greater than N;
The M images and the label set L_O are input into the target detection network for training, which ends when one of the following conditions is met:
Condition 1: the accuracy of the detection model no longer improves;
Condition 2: the number of training iterations reaches the preset maximum E'_max;
S22, acquiring images of the area to be searched in real time;
After takeoff, the unmanned aerial vehicles carry out the cruising search along the paths obtained in step S1 and collect RGB images of the search area in real time through the onboard monocular cameras; the images collected by the different unmanned aerial vehicles at the same time t are assembled into the image set P_t = {p_1, p_2, …, p_n}, where p_n is the search-area image acquired by the n-th unmanned aerial vehicle at time t;
S23, performing target detection on the monocular camera images with the detection model;
The multi-class detection model pre-trained in S21 is run on the image set P_t acquired in real time in step S22, yielding for each image a target information set U_O containing, for each target, the bounding box height and width (H, L), the center point coordinates (x_o, y_o), the confidence CONF_O, and the class number cls;
S24, filtering and optimizing the target detection results (a filtering sketch follows this step);
The filtering conditions are set as follows:
Condition 1: bounding box confidence CONF_O ≥ T_C, where T_C is the minimum confidence threshold; boxes below it are discarded;
Condition 2: bounding box area H·L ≥ T_S, where T_S is the minimum area threshold; boxes below it are discarded;
Condition 3: the overlap S_IOU of two adjacent bounding boxes satisfies S_IOU ≤ T_IOU, where T_IOU is the maximum overlap threshold; if it is exceeded, the box with the lower confidence is discarded;
Condition 4: if the target class number cls is not in the set Cls_ex of target classes to be tracked, i.e. cls ∉ Cls_ex, the bounding box is discarded;
According to these filtering conditions, the target information set U_O is filtered and updated to obtain the final set U′_O of targets to be tracked in the task area;
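A sketch of the four-condition filter; the detection record layout ('box' as center x, y plus height and width, 'conf', 'cls') is an assumption rather than the patent's data structure, and condition 3 is realized greedily by confidence:

```python
def filter_detections(detections, T_C, T_S, T_IOU, tracked_classes):
    """Apply conditions 1-4 to raw detections (illustrative sketch)."""
    def iou(a, b):
        # Intersection-over-union of two (x, y, h, l) axis-aligned boxes.
        ax1, ay1 = a['box'][0] - a['box'][3] / 2, a['box'][1] - a['box'][2] / 2
        ax2, ay2 = a['box'][0] + a['box'][3] / 2, a['box'][1] + a['box'][2] / 2
        bx1, by1 = b['box'][0] - b['box'][3] / 2, b['box'][1] - b['box'][2] / 2
        bx2, by2 = b['box'][0] + b['box'][3] / 2, b['box'][1] + b['box'][2] / 2
        iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
        ih = max(0.0, min(ay2, by2) - max(ay1, by1))
        inter = iw * ih
        union = a['box'][2] * a['box'][3] + b['box'][2] * b['box'][3] - inter
        return inter / union if union > 0 else 0.0

    # Conditions 1, 2, 4: confidence, minimum area, class to be tracked.
    kept = [d for d in detections
            if d['conf'] >= T_C
            and d['box'][2] * d['box'][3] >= T_S
            and d['cls'] in tracked_classes]
    # Condition 3: among overlapping boxes keep the higher-confidence one.
    kept.sort(key=lambda d: d['conf'], reverse=True)
    final = []
    for d in kept:
        if all(iou(d, f) <= T_IOU for f in final):
            final.append(d)
    return final
```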
Step S3, after the unmanned aerial vehicle detects a target, it autonomously selects among different tracking strategies to track the target intelligently;
S31, when a target is detected, judging whether the newly found target is already tracked by another unmanned aerial vehicle, and eliminating targets that are;
The unmanned aerial vehicle flies along the cruising path and runs the target detection algorithm on the real-time monocular camera images during the cruise; meanwhile, it acquires and stores in real time the target positions currently tracked and published by the other unmanned aerial vehicles, performs multi-frame clustering on the acquired position information published by the other unmanned aerial vehicles, and thereby determines which targets are currently being tracked or have been found;
S32, the unmanned aerial vehicle traverses U′_O, selects the nearest target according to the distance between the detected targets and itself, and decides whether to enter the long-range or the short-range tracking program:
The size (h_i, l_i) of the bounding box produced by the target detection algorithm is taken as the basis for judging whether a target is near or far, where h_i and l_i are the height and width of the i-th target bounding box currently detected in U′_O;
If h_i < T_h or l_i < T_l or y_min,i < T_y, step S33 is entered; otherwise step S34 is entered;
where T_h, T_l, T_y are set according to the imaging resolution of the monocular camera used, and y_min,i is the y value of the top-left pixel of the i-th bounding box in U′_O;
S33, when the unmanned aerial vehicle judges that the target is far away, the long-range tracking program is entered;
During long-range tracking, the gimbal angle is held at T_g1, and the unmanned aerial vehicle adjusts its forward speed and steering speed according to the position of the tracked target's bounding box in the monocular camera image;
First, the relative distance (u_x, u_y) between the target bounding box and the image center point of the monocular camera is calculated:
u_x = x_o - u_cx
u_y = y_o - u_cy
where (u_cx, u_cy) is the image center point of the monocular camera; the forward speed v_x of the unmanned aerial vehicle is set piecewise according to the pixel distance of y_o from the image center point:
v_x = k_fi × y_o + b_fi
where k_fi, b_fi are the speed adjustment parameters of the different segments during long-range tracking, and i denotes the i-th segment of pixel distance between y_o and the image center point;
The yaw rate w_z of the unmanned aerial vehicle is:
w_z = k_f × u_x
where k_f is the adjustment parameter of the yaw rate w_z; viewed from the unmanned aerial vehicle, a left turn is positive and a right turn is negative;
S34, when the unmanned aerial vehicle judges that the target is near, the short-range tracking program is entered (a sketch of the resulting speed commands follows this step);
During short-range tracking, the gimbal angle is adjusted to T_g2, where T_g2 is greater than T_g1; the unmanned aerial vehicle adjusts its forward speed and steering speed according to the position of the tracked target's bounding box in the monocular camera image;
The relative distance (u_x, u_y) between the target bounding box and the image center point is calculated:
u_x = x_o - u_cx
u_y = y_o - u_cy
where (u_cx, u_cy) is the image center point; the forward speed v_x is set piecewise according to the ratio u_tc of y_o to u_cy, with forward positive and backward negative:
v_x = k_ni × y_o + b_ni
where k_ni, b_ni are the speed adjustment parameters of the different segments during short-range tracking, and i denotes the i-th segment of pixel distance between y_o and the image center point;
The yaw rate w_z of the unmanned aerial vehicle is:
w_z = k_n × u_x
where k_n is the adjustment parameter of the yaw rate w_z during short-range tracking; viewed from the unmanned aerial vehicle, a left turn is positive and a right turn is negative;
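Both tracking modes share the same bounding-box-to-command mapping; the sketch below assumes segment boundaries given as (upper y bound, k, b) triples and the linear yaw law w_z = k·u_x reconstructed above; the exact segmentation is not fixed by the claim:

```python
def tracking_command(x_o, y_o, u_cx, u_cy, segments, k_yaw):
    """Map a bounding-box center to (forward speed, yaw rate) (sketch).

    segments: ascending list of (y_upper_bound, k, b) for v_x = k*y_o + b;
    k_yaw plays the role of k_f (long range) or k_n (short range), and its
    sign absorbs the left-positive turn convention.
    """
    u_x = x_o - u_cx                 # horizontal offset from the image center
    v_x = 0.0                        # default if y_o falls beyond all segments
    for y_bound, k, b in segments:
        if y_o < y_bound:            # first segment containing y_o
            v_x = k * y_o + b
            break
    w_z = k_yaw * u_x                # steer toward the target
    return v_x, w_z
```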
S35, during tracking, after the target is temporarily lost due to occlusion, the unmanned aerial vehicle predicts and tracks the target's motion trajectory based on the target's motion state before the loss;
Step S4, measuring and calculating the target's position while tracking, and sharing the target position in real time;
S41, eliminating erroneous target positions;
The unmanned aerial vehicle records the image coordinates of the detection boxes of each target class over the most recent n frames; if the number of frames n_d in which a given class is detected is less than n, that target is not processed, and the sporadic detections are discarded as probable errors;
S42, automatically correcting target class errors in the detections using information from the preceding and following frames;
During short-range tracking, the class of the tracked target is locked as C_tr; if, within a time window of t_s seconds, the target of class C_tr disappears from the detections while another target class C_e is detected, the coordinates (x_t, y_t) of class C_tr at the last moment are compared with the current coordinates (x_e, y_e) of class C_e; if their distance d_err < d_tr, the class C_e is corrected to C_tr;
where d_err = √((x_t - x_e)² + (y_t - y_e)²), and d_tr is the threshold below which two coordinates in the image coordinate system are judged to be close;
S43, smoothing the target detection results using multi-frame sequential information (a sketch of S41–S43 follows this step);
When a target is detected over consecutive frames, multi-frame smoothing is used to exclude detections with large errors: the detected target positions of the multiple frames are fused and the mean of the target coordinates is calculated;
After false detections are eliminated, the mean coordinates (x_m, y_m) of the target detection boxes over the n frames are calculated:
x_m = (1/n)·Σᵢ₌₁ⁿ x_i,  y_m = (1/n)·Σᵢ₌₁ⁿ y_i
where x_i, y_i are the x-axis and y-axis coordinates of the target detection box in the i-th frame;
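The bookkeeping of S41–S43 can be sketched as below; the buffer layout and default thresholds are assumptions:

```python
from collections import deque
import math

class TargetSmoother:
    """Sporadic-detection rejection, class correction and multi-frame smoothing."""

    def __init__(self, n=10, d_tr=20.0):
        self.n = n                        # smoothing window (frames)
        self.d_tr = d_tr                  # pixel distance for class correction
        self.history = {}                 # class -> deque of (x, y)

    def update(self, cls, x, y):
        self.history.setdefault(cls, deque(maxlen=self.n)).append((x, y))

    def correct_class(self, C_tr, C_e, x_e, y_e):
        """S42: if the locked class C_tr vanished and C_e appeared nearby, relabel."""
        if self.history.get(C_tr):
            x_t, y_t = self.history[C_tr][-1]
            if math.hypot(x_t - x_e, y_t - y_e) < self.d_tr:
                return C_tr
        return C_e

    def smoothed(self, cls):
        """S41 + S43: None for sporadic detections, else the windowed mean."""
        pts = self.history.get(cls)
        if not pts or len(pts) < self.n:  # detected in fewer than n frames
            return None
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))
```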
S44, calculating the target's position in the world coordinate system and publishing it to the global network;
n pairs of corresponding coordinates (n > 4) in the camera image coordinate system and the unmanned aerial vehicle body coordinate system are found and substituted into the following perspective transformation formula, and the homography matrix Trans of the perspective transformation is calculated:
[x_u y_u 1] = [x_m y_m 1]·Trans
where [x_u y_u 1] are the coordinates of a point in the unmanned aerial vehicle body coordinate system and [x_m y_m 1] are the coordinates of a point in the camera image coordinate system;
Through the perspective transformation homography matrix Trans, the body-frame coordinates corresponding to points in the image coordinate system are obtained, giving the position of the target relative to the unmanned aerial vehicle;
The relative position of the target and the unmanned aerial vehicle is then converted into the absolute position in the world coordinate system through the transformation matrix and the unmanned aerial vehicle's pose:
[x_w y_w 1 1] = [x_u y_u 1 1]·Tr_w_uav
where Tr_w_uav is the body-to-world transformation matrix constructed from the unmanned aerial vehicle's yaw angle θ and its coordinates x_uav and y_uav in the world coordinate system;
The tracked target's class cls and the target's coordinates x_w, y_w in the world coordinate system are published to the global network;
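In practice the homography can be estimated once from the n > 4 correspondences (for example with OpenCV's cv2.findHomography) and then applied per detection; the sketch below assumes a 3×3 Trans in the claim's row-vector convention and a planar, yaw-only form for Tr_w_uav:

```python
import numpy as np

def body_position_from_pixel(x_m, y_m, Trans):
    """Map an image point to body-frame XY via the homography Trans (3x3)."""
    p = np.array([x_m, y_m, 1.0]) @ Trans
    return p[0] / p[2], p[1] / p[2]      # normalize the homogeneous scale

def world_position(x_u, y_u, yaw, x_uav, y_uav):
    """Rotate the body-frame offset by the yaw angle and add the vehicle's
    world position (planar transform assumed)."""
    x_w = x_uav + x_u * np.cos(yaw) - y_u * np.sin(yaw)
    y_w = y_uav + x_u * np.sin(yaw) + y_u * np.cos(yaw)
    return x_w, y_w
```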
Step S5, planning a path back to the cruising point with a dynamic path planning algorithm;
After the unmanned aerial vehicle finishes tracking a target, it plans a collision-free path under the dynamic path planning rules, taking its current position as the start point and the cruising path point nearest to it as the end point;
After reaching the end point, the method returns to step S2 and the search and detection continue until the searching, tracking and positioning tasks for all the moving targets are complete;
Step S6, throughout the search task, the unmanned aerial vehicle in steps S1–S5 must perform obstacle avoidance with respect to static obstacles and the other unmanned aerial vehicles;
S61, a static obstacle avoidance algorithm in which the unmanned aerial vehicle changes speed according to the positions of the surrounding static obstacles;
According to the positions of the static obstacles, the area within a certain range around the unmanned aerial vehicle is traversed to obtain the static obstacle point nearest to it;
When the unmanned aerial vehicle is in short-range tracking and flying at a speed v_0 < v_avo_m, the fine-adjustment obstacle avoidance program is started; the points within the range Ω_s around the unmanned aerial vehicle are traversed to find the obstacle point nearest to Uav_0, where Ω_s is calculated by the following formula:
Ω_s = {(x, y) | x_0 - D_avo_s < x < x_0 + D_avo_s, y_0 - D_avo_s < y < y_0 + D_avo_s}
where D_avo_s is the response distance to obstacles during short-range tracking;
If the points within Ω_s contain a point M_obs marked as an obstacle, the low-speed static obstacle avoidance program is started; in the unmanned aerial vehicle coordinate system, with the forward direction of the unmanned aerial vehicle as the x axis and its leftward translation direction as the y axis, the obstacle point M_obs lies at a distance d_x from the unmanned aerial vehicle along the x axis and d_y along the y axis, and its Euclidean distance to the unmanned aerial vehicle is d_e = √(d_x² + d_y²);
After obstacle avoidance is started, the unmanned aerial vehicle's flying speeds in the forward and leftward directions are attenuated according to the distances d_x and d_y, governed by d_avo_sx and d_avo_sy, the speed adjustment parameters of the low-speed static obstacle avoidance program in the forward and leftward directions respectively;
When the unmanned aerial vehicle is in long-range tracking and flying at a speed v_0 ≥ v_avo_m, the emergency obstacle avoidance program is started; the points within the range Ω_h around the unmanned aerial vehicle are traversed to find the obstacle point nearest to Uav_0, where Ω_h is calculated by the following formula:
Ω_h = {(x, y) | x_0 - D_avo_h < x < x_0 + D_avo_h, y_0 - D_avo_h < y < y_0 + D_avo_h}
where D_avo_h is the response distance to obstacles during long-range tracking;
If the points within Ω_h contain a point M_obs marked as an obstacle, the high-speed static obstacle avoidance program is started; in the unmanned aerial vehicle coordinate system, with the forward direction as the x axis and the leftward translation direction as the y axis, the obstacle point M_obs lies at a distance d_x along the x axis and d_y along the y axis, and its Euclidean distance to the unmanned aerial vehicle is d_e = √(d_x² + d_y²);
After obstacle avoidance is started, the unmanned aerial vehicle's forward speed is attenuated according to d_x, and its flying speed in the leftward direction is:
v_y = abs(v_0)·k_avo_hx·(D_avo_h - d_e)·(D_avo_h - |d_y|)
where d_avo_hx and k_avo_hx are the speed adjustment parameters of the unmanned aerial vehicle in the x-axis and y-axis directions in the high-speed static obstacle avoidance program;
Regarding the different speed regimes of the unmanned aerial vehicle, v_avo_m is the boundary value distinguishing high-speed from low-speed flight; the parameter settings of the obstacle avoidance programs differ between the speed regimes, and in general:
d_avo_hx > d_avo_sx, d_avo_sy > 1, k_avo_hx ≪ 1, D_avo_h > D_avo_s (a sketch of the obstacle scan and the high-speed escape law follows);
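Only the high-speed lateral escape law is stated explicitly in the text, so the sketch below implements the obstacle scan plus that one formula; the forward-speed attenuation laws are not recoverable from the source and are deliberately omitted:

```python
import math

def nearest_obstacle(x0, y0, obstacle_points, D_avo):
    """Scan the square window Omega around (x0, y0) for the nearest obstacle."""
    best, best_d = None, float('inf')
    for x, y in obstacle_points:
        if abs(x - x0) < D_avo and abs(y - y0) < D_avo:   # inside Omega
            d = math.hypot(x - x0, y - y0)
            if d < best_d:
                best, best_d = (x, y), d
    return best

def high_speed_lateral(v0, d_x, d_y, D_avo_h, k_avo_hx):
    """Leftward escape speed v_y of the high-speed program, per the claim."""
    d_e = math.hypot(d_x, d_y)
    return abs(v0) * k_avo_hx * (D_avo_h - d_e) * (D_avo_h - abs(d_y))
```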
S62, an inter-vehicle dynamic obstacle avoidance algorithm in which the unmanned aerial vehicle changes altitude according to its positional relation to the other unmanned aerial vehicles;
The unmanned aerial vehicle Uav_0 acquires the positions of the other N-1 unmanned aerial vehicles Uav_1, Uav_2, ..., Uav_{N-1} in real time and calculates the distances d_1, d_2, ..., d_{N-1} to them; when min{d_1, d_2, ..., d_{N-1}} < D_avo_d, the inter-vehicle obstacle avoidance program is started, where D_avo_d is the response distance for dynamic obstacle avoidance between unmanned aerial vehicles;
In general, the initial cruising altitude of the unmanned aerial vehicle is H_cru; after the obstacle avoidance program is started, the expected cruising altitude is modified to H_avo, calculated as follows:
H_avo = H_cru + N_higher·H_bias - N_lower·H_bias
where N_higher is the number of unmanned aerial vehicles (i = 1, 2, ..., N-1) whose distance to Uav_0 is smaller than D_avo_d and whose altitude z_i > z_0 (when the altitudes are equal, those whose serial number is larger than Uav_0's), and N_lower is the number of unmanned aerial vehicles (i = 1, 2, ..., N-1) whose distance to Uav_0 is smaller than D_avo_d and whose altitude z_i < z_0 (when the altitudes are equal, those whose serial number is smaller than Uav_0's); H_bias is the altitude adjustment parameter for inter-vehicle obstacle avoidance, which is related to the maximum height of the airframe and is generally more than half of that height;
After the expected cruising altitude is modified, the unmanned aerial vehicle moves to the desired altitude at the vertical speed v_h, calculated as follows:
v_h = (H_avo - z_0)·v_bias
where v_bias is the adjustment parameter of the vertical speed during dynamic obstacle avoidance; the larger the parameter, the faster the altitude adjustment;
When min{d_1, d_2, ..., d_{N-1}} ≥ D_avo_d, the unmanned aerial vehicle exits the inter-vehicle dynamic obstacle avoidance program, its expected cruising altitude is restored to H_cru, and it returns to the original desired altitude at the speed v_h = (H_cru - z_0)·v_bias.
2. The method for collaborative searching and tracking positioning of multiple moving targets by an unmanned aerial vehicle cluster according to claim 1, wherein the target detection algorithm is YOLOv4 or YOLOv7.
3. The method for collaborative searching and tracking positioning of multiple moving targets by an unmanned aerial vehicle cluster according to claim 2, wherein the step S31 specifically comprises the following steps:
S311, each unmanned aerial vehicle acquires the target positions currently tracked and published by the other unmanned aerial vehicles and stores the acquired successive positions by class, each in an array C_k, where k is the class label and C_k has size L; if the interval between two successive position acquisitions exceeds s seconds, the target of the current class is judged lost and the array C_k is reinitialized to zero;
S312, if at the current time the last element of the array C_k is non-zero and the data in C_k are being updated continuously, k-means clustering is performed on the array C_k to determine how many distinct targets of that class are currently being tracked;
S313, for the n-th unmanned aerial vehicle, when a target appears in the detection set U′_O obtained in S24 and the array C_k is non-empty, the target position obtained by remote measurement is compared with the positions of the same-class targets currently tracked by the other unmanned aerial vehicles; if the Euclidean distance E < T_l, the target is one already being tracked by another unmanned aerial vehicle, and it is deleted from the set U′_O.
4. The method for collaborative searching and tracking positioning of multiple moving targets by an unmanned aerial vehicle cluster according to claim 1, wherein the step S35 specifically comprises the following steps (a dead-reckoning sketch follows this claim):
S351, a queue q is created that stores the position of the unmanned aerial vehicle's currently tracked target in real time; the last entry of the queue q is the most recent position of the currently tracked target, and the data in the queue q are continuous in time;
S352, when the currently tracked target is lost because it is occluded from the unmanned aerial vehicle, the data in the queue q are retrieved automatically, the target's heading θ before its disappearance is calculated, and its motion trajectory is fitted:
θ = arctan((y_act,n - y_act,1)/(x_act,n - x_act,1)), v_act = √((x_act,n - x_act,1)² + (y_act,n - y_act,1)²)/t_q
where y_act,n and x_act,n are the y and x values of the last target position in the queue q, y_act,1 and x_act,1 are those of the first target position in the queue q, v_act is the target's movement speed, and t_q is the time interval from the acquisition of the first position to the last position over the whole queue q;
S353, predicting the motion trajectory of the occluded target according to the calculated movement direction and movement speed of the target:
x_act,pre = x_act,n + cos(θ)·v_act·(t_now - t_lose)
y_act,pre = y_act,n + sin(θ)·v_act·(t_now - t_lose)
where x_act,pre, y_act,pre are the x, y coordinates of the currently predicted target position, t_now is the current time, and t_lose is the moment the target was lost; the predicted position serves as the tracked target's position while it is temporarily lost due to occlusion;
S354, the unmanned aerial vehicle adjusts its tracking speed according to the predicted movement direction and speed of the target and tracks predictively; once the target is no longer occluded, the unmanned aerial vehicle enters S34:
v_wx = cos(θ)×v_act
v_wy = sin(θ)×v_act
where v_wx, v_wy are the unmanned aerial vehicle's speeds along the x and y axes of the world coordinate system at the current moment.
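Claim 4 amounts to dead reckoning from the track history; here is a sketch assuming the queue stores (t, x, y) tuples with at least two time-separated samples:

```python
import math

def predict_occluded(track, t_now, t_lose):
    """Predict an occluded target's position and the matching chase speeds."""
    t1, x1, y1 = track[0]                        # first queued position
    tn, xn, yn = track[-1]                       # last position before loss
    theta = math.atan2(yn - y1, xn - x1)         # heading before disappearance
    v_act = math.hypot(xn - x1, yn - y1) / (tn - t1)   # mean speed over t_q
    dt = t_now - t_lose
    x_pre = xn + math.cos(theta) * v_act * dt    # predicted target position
    y_pre = yn + math.sin(theta) * v_act * dt
    v_wx = math.cos(theta) * v_act               # commanded world-frame speeds
    v_wy = math.sin(theta) * v_act
    return (x_pre, y_pre), (v_wx, v_wy)
```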
5. The method for collaborative searching and tracking positioning of multiple moving targets by an unmanned aerial vehicle cluster according to claim 1, wherein the dynamic path planning algorithm in S5 adopts the D* algorithm to plan the path back to the cruising point, with the following specific steps:
S51, rasterizing the task area and dividing it into passable, difficult-to-pass and non-passable areas;
The task area is rasterized and, according to the difficulty of passage, divided into three main parts: passable areas, non-passable areas and difficult-to-pass areas; a passable area has an open view and allows the unmanned aerial vehicle to pass normally, a non-passable area contains static obstacles, and a difficult-to-pass area has a restricted view in which the unmanned aerial vehicle is prone to collision;
S52, setting the cost value of each grid cell and planning the optimal path back to the cruising point;
After the unmanned aerial vehicle finishes tracking a target, the D* algorithm computes the path back to a cruising point; the algorithm uses the heuristic function f(n) = g(n) + h(n), where g(n) is the currently known shortest path cost from the start point to node n and h(n) is the estimated cost from node n to the goal;
The costs of moving to an adjacent node in a passable, difficult-to-pass and non-passable area are set to N_QYS, N_QYS + n and Inf respectively, and for diagonally adjacent nodes to 1.4·N_QYS, 1.4·(N_QYS + n) and Inf (a grid-search sketch follows this claim).
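The cost assignment can be exercised with an ordinary f = g + h grid search; the sketch below is a static A*-style stand-in (true D* additionally repairs the path incrementally when edge costs change), with the cell-type encoding and heuristic scaling as assumptions:

```python
import heapq, math

PASSABLE, HARD, BLOCKED = 0, 1, 2

def edge_cost(cell_type, diagonal, N_QYS, n):
    """Per-claim move costs: N_QYS / N_QYS + n / Inf, scaled by 1.4 on diagonals."""
    base = {PASSABLE: N_QYS, HARD: N_QYS + n, BLOCKED: math.inf}[cell_type]
    return 1.4 * base if diagonal else base

def plan_return(grid, start, goal, N_QYS, n):
    """8-connected grid search minimizing f(n) = g(n) + h(n)."""
    h = lambda c: math.hypot(c[0] - goal[0], c[1] - goal[1]) * N_QYS
    g_best, parent = {start: 0.0}, {start: None}
    open_set = [(h(start), start)]
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                          # reconstruct and return the path
            path = [cur]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == dy == 0:
                    continue
                nxt = (cur[0] + dx, cur[1] + dy)
                if not (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])):
                    continue
                ng = g_best[cur] + edge_cost(grid[nxt[0]][nxt[1]],
                                             dx != 0 and dy != 0, N_QYS, n)
                if ng < g_best.get(nxt, math.inf):
                    g_best[nxt] = ng
                    parent[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None                                  # goal unreachable
```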
6. The method for collaborative searching and tracking positioning of multiple moving targets by an unmanned aerial vehicle cluster according to claim 3, wherein the specific method of the k-means clustering in S312 is as follows: the number of cluster centroids k is determined with the elbow method; clustering is performed separately for different values k = 2, 3, 4, ..., and the clustering distance statistic K_dist is computed as follows:
For the samples A_i in the array C_k, with clustering result C^(i), clustering yields the distance statistic W(in) between samples belonging to the same target,
and the distance statistic W(out) between samples belonging to different targets;
The sample statistic of the whole clustering is:
K_dist = W(in) + W(out);
The most appropriate k value is selected with the elbow method, thereby determining the number of tracked targets in the current class; meanwhile, the cluster center position of each cluster is taken as the coordinates of the corresponding target, yielding the other unmanned aerial vehicles' tracking status for the different targets of the different classes (a clustering sketch follows this claim).
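A compact sketch of the elbow-driven cluster counting; the plain k-means, the k range, and the 0.5 relative-drop threshold are all assumptions, since the claim only fixes that the elbow method selects k and that cluster centers become target coordinates:

```python
import math, random

def kmeans(points, k, iters=50):
    """Plain k-means on 2-D points; assumes len(points) >= k."""
    cents = random.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        assign = [min(range(k), key=lambda j: math.dist(p, cents[j]))
                  for p in points]
        for j in range(k):
            members = [p for p, a in zip(points, assign) if a == j]
            if members:
                cents[j] = (sum(x for x, _ in members) / len(members),
                            sum(y for _, y in members) / len(members))
    return cents, assign

def count_targets(points, k_max=4):
    """Elbow-style choice of k: accept a larger k only while the
    within-cluster distance still drops sharply."""
    def wcss(cents, assign):
        return sum(math.dist(p, cents[a]) for p, a in zip(points, assign))
    scores = {k: wcss(*kmeans(points, k)) for k in range(1, k_max + 1)}
    best = 1
    for k in range(2, k_max + 1):
        if scores[best] > 0 and (scores[best] - scores[k]) / scores[best] > 0.5:
            best = k
    return best, kmeans(points, best)[0]   # target count and their coordinates
```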
CN202211358543.1A 2022-11-01 2022-11-01 Collaborative searching and tracking positioning method for moving target by unmanned aerial vehicle cluster Active CN115661204B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211358543.1A CN115661204B (en) 2022-11-01 2022-11-01 Collaborative searching and tracking positioning method for moving target by unmanned aerial vehicle cluster


Publications (2)

Publication Number Publication Date
CN115661204A (en) 2023-01-31
CN115661204B (en) 2023-11-10

Family

ID=84994992


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115790608B (en) * 2023-01-31 2023-05-30 天津大学 AUV path planning algorithm and device based on reinforcement learning
CN115826622B (en) * 2023-02-13 2023-04-28 西北工业大学 Night co-location method for unmanned aerial vehicle group
CN117115414B (en) * 2023-10-23 2024-02-23 西安羚控电子科技有限公司 GPS-free unmanned aerial vehicle positioning method and device based on deep learning
CN117409340B (en) * 2023-12-14 2024-03-22 上海海事大学 Unmanned aerial vehicle cluster multi-view fusion aerial photography port monitoring method, system and medium
CN117472083B (en) * 2023-12-27 2024-02-23 南京邮电大学 Multi-unmanned aerial vehicle collaborative marine search path planning method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160085963A (en) * 2015-01-08 2016-07-19 서울대학교산학협력단 UAV flight control device and method for object shape mapping and real-time guidance using depth map
CN107656545A (en) * 2017-09-12 2018-02-02 武汉大学 A kind of automatic obstacle avoiding searched and rescued towards unmanned plane field and air navigation aid
CN109917767A (en) * 2019-04-01 2019-06-21 中国电子科技集团公司信息科学研究院 A kind of distribution unmanned plane cluster autonomous management system and control method
CN111813151A (en) * 2020-08-26 2020-10-23 江苏威思迈智能科技有限公司 Unmanned aerial vehicle cluster control method based on machine vision
CN112130587A (en) * 2020-09-30 2020-12-25 北京理工大学 Multi-unmanned aerial vehicle cooperative tracking method for maneuvering target
WO2021013110A1 (en) * 2019-07-19 2021-01-28 深圳市道通智能航空技术有限公司 Target tracking-based unmanned aerial vehicle obstacle avoidance method and apparatus, and unmanned aerial vehicle
CN113759958A (en) * 2021-07-07 2021-12-07 哈尔滨工程大学 Unmanned aerial vehicle formation flight path planning method based on positioning precision
CN114115331A (en) * 2021-10-29 2022-03-01 西安电子科技大学 Multi-unmanned aerial vehicle multi-load cooperative reconnaissance method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7970507B2 (en) * 2008-01-23 2011-06-28 Honeywell International Inc. Method and system for autonomous tracking of a mobile target by an unmanned aerial vehicle
US20170269612A1 (en) * 2016-03-18 2017-09-21 Sunlight Photonics Inc. Flight control methods for operating close formation flight
WO2020103034A1 (en) * 2018-11-21 2020-05-28 深圳市道通智能航空技术有限公司 Method and device for planning path of unmanned aerial vehicle, and unmanned aerial vehicle
CN112926514A (en) * 2021-03-26 2021-06-08 哈尔滨工业大学(威海) Multi-target detection and tracking method, system, storage medium and application


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
An Improved GA-based Approach for UAV Swarm Formation Transformation; 聂一鸣; 2022 IEEE 6th Information Technology and Mechatronics Engineering Conference (ITOEC); full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant