CN112270250A - Target tracking method for tracking ground moving target by unmanned aerial vehicle - Google Patents



Publication number
CN112270250A
CN112270250A (application CN202011154817.6A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
yaw
target
tracking
Prior art date
Legal status
Granted
Application number
CN202011154817.6A
Other languages
Chinese (zh)
Other versions
CN112270250B (en)
Inventor
黎瑶
唐文兵
郑李斌
李佳慧
丁佐华
Current Assignee
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN202011154817.6A priority Critical patent/CN112270250B/en
Publication of CN112270250A publication Critical patent/CN112270250A/en
Application granted granted Critical
Publication of CN112270250B publication Critical patent/CN112270250B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06V 20/10: Scenes; scene-specific elements: terrestrial scenes
    • G06N 5/048: Fuzzy inferencing
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06V 20/13: Terrestrial scenes: satellite images
    • G06V 20/40: Scenes; scene-specific elements in video content
    • G06V 2201/07: Target detection


Abstract

The invention discloses a target tracking method in which an unmanned aerial vehicle tracks a ground moving target. The moving target is detected with OpenCV; from the target's position on the image obtained by the airborne camera, a type-2 fuzzy logic controller calculates the direction and speed commands that the unmanned aerial vehicle should execute; the unmanned aerial vehicle then continuously updates these commands, realizing continuous tracking of the moving target. The invention takes the complexity of the tracking environment into account, combines a traditional target detection method with type-2 fuzzy logic control theory, which handles uncertainty well, and thereby realizes continuous tracking of the ground moving target by the unmanned aerial vehicle.

Description

Target tracking method for tracking ground moving target by unmanned aerial vehicle
Technical Field
The invention belongs to the technical field of autonomous tracking control of unmanned aerial vehicles, and particularly relates to a target tracking method for tracking a ground moving target by an unmanned aerial vehicle.
Background
With the maturation of low-altitude Unmanned Aerial Vehicle (UAV) hovering, cruising, and pan-tilt technologies, as well as the rise of computer vision and deep learning, target tracking based on an unmanned aerial vehicle platform has become a research hotspot at home and abroad; it has been used for tracking criminal vehicles in urban anti-terrorism, tracking ground maneuvering targets in aerial combat, and the like. A tracking UAV is a special-purpose unmanned aerial vehicle that acquires image information with sensors such as an airborne pan-tilt camera, obtains the target's position on the image through a target detection process (or predicts it via state estimation and multi-sensor information fusion), and computes a control command (usually a velocity vector) through a tracking algorithm to steer the unmanned aerial vehicle so that the target always stays near the central area of the field of view of the airborne camera. However, this process is driven only by the movement of the target and does not take into account the uncertainty of the environment in which the drone operates, such as the uncertainty of the sensor devices, the uncertainty of the target's motion, and the uncertainty of the actual measurements.
Aiming at the uncertainty of the target tracking process, fuzzy control theory provides a new control method: it can solve control problems with time-varying parameters without an accurate model description. A traditional type-1 fuzzy logic controller (T1-FLC) is an intelligent controller based on the type-1 fuzzy set (T1-FS) theory proposed by Zadeh in 1965; it generalizes a human expert's handling of certain things and processes into a series of fuzzy rules, using human experience and knowledge, and incorporates human fuzzy reasoning into decision-making. As a rule-based nonlinear controller, T1-FLC converts the linguistic information of expert knowledge into the characteristics of a control strategy and can solve many real-life control problems that are complex and do not admit an accurate mathematical model; it is therefore an effective means of handling uncertainty and inaccuracy in a control system.
However, in real-world unstructured dynamic environments, and in many practical applications involving complex problems that demand accurate results, T1-FLC has difficulty characterizing objects adequately and is subject to numerous uncertainties, mainly including:
1) measurement noise uncertainty: sensor measurements are noisy, and variations in noise levels and measurement conditions corrupt the training data used to adjust or optimize system parameters;
2) actuator characteristic uncertainty: the control input changes because the actuator's characteristics change with wear, environment, etc.;
3) linguistic uncertainty: the same words mean different things to different people, and experts answering questionnaires draw different conclusions from the same rule;
4) operating environment uncertainty: changes in the system's operating conditions lead to corresponding changes in inputs and outputs.
Therefore, T1-FLC is limited in handling uncertainty. The limitation stems mainly from the fact that its analysis rests on T1-FSs represented by precise membership functions, so a T1-FS cannot directly handle the uncertainty of the fuzzy rules themselves.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a target tracking method for tracking a ground moving target by an unmanned aerial vehicle.
In order to realize the purpose of the invention, the invention adopts the following technical scheme: a target tracking method for tracking a ground moving target by an unmanned aerial vehicle specifically comprises the following steps:
(1) shooting a video when the unmanned aerial vehicle flies, selecting a target to be tracked by the unmanned aerial vehicle from a frame picture of the returned video, and extracting HOG and HSV characteristics of the target;
(2) locating the center coordinate (x_o, y_o) of the target on the image through the HOG and HSV features extracted in step 1, and calculating the horizontal offset Δx and the vertical offset Δy between (x_o, y_o) and the view-center coordinate (x_c, y_c) of the unmanned aerial vehicle:
Δx = x_o − x_c,  Δy = y_o − y_c
(3) when |Δx| ≤ w_safe/2, the deflection angle of the unmanned aerial vehicle does not need to be adjusted; when |Δy| ≤ h_safe/2, the speed of the unmanned aerial vehicle is 0, i.e. it needs neither to advance nor to retreat; otherwise, go to step 4; here w_safe is the width of the stable range, h_safe is its height, and the center of the stable range coincides with the center of the unmanned aerial vehicle's field of view;
(4) normalizing the horizontal offset Δx and the vertical offset Δy, i.e. Δx′ = Δx/(w/2) and Δy′ = Δy/(h/2), so that Δx and Δy are scaled to [−1, 1]; when Δx′ < 0 the unmanned aerial vehicle should rotate leftwards, otherwise rightwards; similarly, when Δy′ < 0 it should retreat, otherwise advance; w × h is the resolution of the unmanned aerial vehicle's view frame;
(5) taking Δx′ and Δy′ from step 4 as the inputs of the type-2 fuzzy logic controller to obtain its outputs, the deflection-angle command yaw′ and the speed command v′, from which the deflection angle yaw and the speed v of the unmanned aerial vehicle at the next moment are calculated:
yaw = Max_y · yaw′
v = Max_v · v′
where Max_y is the maximum deflection angle of the unmanned aerial vehicle, i.e. the deflection angle is less than π/2, and Max_v is the maximum flying speed set for the unmanned aerial vehicle; positive and negative yaw′ represent leftward and rightward deflection respectively, and positive and negative v′ represent forward and backward flight;
the language ambiguity sets of Δ x 'and Δ y' are all { NB, NS, MM, PS, PB }, NB representing the normalized offset | Δ x '| 1 or | Δ y' | 1 of the target from the center point position in the horizontal and vertical directions and located to the left of the center pointSide/upper, NS denotes a normalized offset | Δ x '| of 0.5 or | Δ y' | of 0.5 from the center point position and is located on the left side/upper of the center point, MM denotes that the target is just at the center position of the image frame, PS denotes a normalized offset | Δ x '| of 0.5 or | Δ y' | 0.5 from the center point position and is located on the right side/lower of the center point, PB denotes a normalized offset | Δ x '| 1 or | Δ y' | 1 from the center point position and is located on the right side/lower of the center point; the fuzzy set of yaw' is { LB, LS, ZO, RS, RB }, LB representing a left turn
Figure BDA0002742368940000031
Near, LS means turning to the left
Figure BDA0002742368940000032
Near, ZO means no rotation, RS means right rotation
Figure BDA0002742368940000033
Adjacent, RB denotes turning to the right
Figure BDA0002742368940000034
Nearby; the fuzzy set of v' is { GB, GS, ZO, BS, BB }, GB meaning fast-forward, GS meaning slow-forward, ZO meaning motionless, BS meaning slow-reverse, BB meaning fast-reverse.
(6) repeating steps 2-5 so that the selected tracking target always stays at the exact center of the image, until the tracking task is completed.
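The loop of steps (2)-(5) can be sketched in Python as below. This is a minimal illustration, not the patented implementation: `fuzzy_controller` is a hypothetical stand-in for the interval type-2 inference stage, and `MAX_V` is an assumed speed limit (the patent fixes only Max_y < π/2).

```python
import math

MAX_YAW = math.pi / 2   # Max_y: maximum deflection angle (the patent requires < pi/2)
MAX_V = 2.0             # Max_v: assumed maximum flying speed, m/s

def tracking_step(xo, yo, xc, yc, w, h, w_safe, h_safe, fuzzy_controller):
    """One pass through steps (2)-(5): offsets -> dead zone -> normalization -> inference -> scaling."""
    dx, dy = xo - xc, yo - yc                 # step (2): offsets from the view centre
    dxn, dyn = dx / (w / 2), dy / (h / 2)     # step (4): normalized to [-1, 1]
    yaw_n, v_n = fuzzy_controller(dxn, dyn)   # step (5): yaw' and v' from the controller
    # step (3): inside the stable range, hold the corresponding command at zero
    yaw = 0.0 if abs(dx) <= w_safe / 2 else MAX_YAW * yaw_n
    v = 0.0 if abs(dy) <= h_safe / 2 else MAX_V * v_n
    return yaw, v
```

With a 640 × 480 frame and a 60 × 60-pixel stable range, a target well to the right of center yields a nonzero yaw command while the speed command stays zero, matching the independent dead zones of step (3).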
Further, the interval type-2 fuzzy controller that controls the unmanned aerial vehicle deflection angle yaw′ satisfies:
(a)Δx'=NB,yaw'=LB;
(b)Δx'=NS,yaw'=LS;
(c)Δx'=MM,yaw'=ZO;
(d)Δx'=PS,yaw'=RS;
(e)Δx'=PB,yaw'=RB;
and the interval type-2 fuzzy controller that controls the unmanned aerial vehicle speed v′ satisfies:
(a)Δy′=NB, v′=GB;
(b)Δy′=NS, v′=GS;
(c)Δy′=MM, v′=ZO;
(d)Δy′=PS, v′=BS;
(e)Δy′=PB, v′=BB.
compared with the prior art, the invention has the following beneficial effects:
(1) Type-2 fuzzy logic control theory handles the uncertainty problem better and has a definite capability for processing uncertainty; combined with the task at hand, it is suitable for handling uncertainty in a variety of machine-intelligence systems.
(2) In the tracking task the method is applied to the tracking-control process; combined with a suitable obstacle-avoidance algorithm, it can also address obstacle avoidance within the task, improving the accuracy and robustness of obstacle avoidance.
(3) The tracking method based on fuzzy logic theory is applicable to tracking scenarios of unmanned vehicles, multi-agent systems, robots, and other systems.
Drawings
FIG. 1 is a flow chart of a target tracking method for an unmanned aerial vehicle tracking a ground moving target according to the present invention;
FIG. 2 is a block diagram of an embodiment of the present invention;
FIG. 3 is a coordinate diagram of target pixels in an image frame;
fig. 4 is an architecture diagram of the fuzzy controller.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings.
Fig. 1 shows the flowchart of the target tracking method in which the unmanned aerial vehicle of the present invention tracks a ground moving target; the process includes:
(1) shooting a video when the unmanned aerial vehicle flies, selecting a target to be tracked by the unmanned aerial vehicle from a frame picture of the returned video, and extracting HOG and HSV characteristics of the target;
(2) locating the center coordinate (x_o, y_o) of the target on the image through the HOG and HSV features extracted in step 1, and calculating the horizontal offset Δx and the vertical offset Δy between (x_o, y_o) and the view-center coordinate (x_c, y_c) of the unmanned aerial vehicle:
Δx = x_o − x_c,  Δy = y_o − y_c
(3) when |Δx| ≤ w_safe/2, the deflection angle of the unmanned aerial vehicle does not need to be adjusted; when |Δy| ≤ h_safe/2, the speed of the unmanned aerial vehicle is 0, i.e. it needs neither to advance nor to retreat; otherwise, go to step 4; here w_safe is the width of the stable range, h_safe is its height, and the center of the stable range coincides with the center of the unmanned aerial vehicle's field of view;
(4) normalizing the horizontal offset Δx and the vertical offset Δy, i.e. Δx′ = Δx/(w/2) and Δy′ = Δy/(h/2), so that Δx and Δy are scaled to [−1, 1]; when Δx′ < 0 the unmanned aerial vehicle should rotate leftwards, otherwise rightwards; similarly, when Δy′ < 0 it should retreat, otherwise advance; w × h is the resolution of the unmanned aerial vehicle's view frame;
(5) taking Δx′ and Δy′ from step 4 as the inputs of the type-2 fuzzy logic controller to obtain its outputs, the deflection-angle command yaw′ and the speed command v′, from which the deflection angle yaw and the speed v of the unmanned aerial vehicle at the next moment are calculated:
yaw = Max_y · yaw′
v = Max_v · v′
where Max_y is the maximum deflection angle of the unmanned aerial vehicle, i.e. the deflection angle is less than π/2, and Max_v is the maximum flying speed set for the unmanned aerial vehicle; positive and negative yaw′ represent leftward and rightward deflection respectively, and positive and negative v′ represent forward and backward flight;
the set of linguistic ambiguities for Δ x ', Δ y ' are all { NB, NS, MM, PS, PB }, NB representing the normalized offset | Δ x ' of the target from the center point location in the horizontal and vertical directions1 or | Δ y ' | is near 1 and located on the left side/top of the center point, NS denotes a normalized offset | Δ x ' | 0.5 or | Δ y ' | 0.5 of the target from the center point and located on the left side/top of the center point, MM denotes that the target is exactly at the center position of the image frame, PS denotes a normalized offset | Δ x ' | 0.5 or | Δ y ' | 0.5 of the target from the center point position and located on the right side/bottom of the center point, PB denotes a normalized offset | Δ x ' | 1 or | Δ y ' | 1 of the target from the center point position and located on the right side/bottom of the center point; the fuzzy set of yaw' is { LB, LS, ZO, RS, RB }, LB representing a left turn
Figure BDA0002742368940000051
Near, LS means turning to the left
Figure BDA0002742368940000052
Near, ZO means no rotation, RS means right rotation
Figure BDA0002742368940000053
Adjacent, RB denotes turning to the right
Figure BDA0002742368940000054
Nearby; the fuzzy set of v' is { GB, GS, ZO, BS, BB }, GB meaning fast-forward, GS meaning slow-forward, ZO meaning motionless, BS meaning slow-reverse, BB meaning fast-reverse.
The interval type-2 fuzzy controller that controls the unmanned aerial vehicle deflection angle yaw′ satisfies:
(a)Δx'=NB,yaw'=LB;
(b)Δx'=NS,yaw'=LS;
(c)Δx'=MM,yaw'=ZO;
(d)Δx'=PS,yaw'=RS;
(e)Δx'=PB,yaw'=RB;
and the interval type-2 fuzzy controller that controls the unmanned aerial vehicle speed v′ satisfies:
(a)Δy′=NB, v′=GB;
(b)Δy′=NS, v′=GS;
(c)Δy′=MM, v′=ZO;
(d)Δy′=PS, v′=BS;
(e)Δy′=PB, v′=BB;
(6) and (5) repeating the step 2-5 to enable the selected tracking target to be always kept at the positive center position of the image until the tracking task is completed.
In order to implement the method of the present invention on a real drone platform, fig. 2 shows the implementation framework. As can be seen from fig. 2, the lowest layer of the framework is the hardware platform of the drone, which includes a battery, motors, sensors, an antenna, and an onboard processor running a Linux operating system. To realize global information sharing between drones, the Robot Operating System (ROS) is deployed on top of Linux as a meta-operating system. In ROS, the Master Node is responsible for message communication between nodes; the drone, acting as a node, broadcasts its position, speed, and other information through topics. On top of the ROS layer, the drone tracking system comprises two main functional modules: target positioning and tracking control. The target positioning module consists of OpenCV-based target detection and positioning; tracking control is based on a controller designed on interval type-2 fuzzy logic control theory. The target tracking method for tracking the ground moving target by the unmanned aerial vehicle proceeds as follows:
(1) shooting a video when the unmanned aerial vehicle flies, selecting a target to be tracked by the unmanned aerial vehicle from a frame picture of the returned video, and extracting HOG and HSV characteristics of the target;
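The HSV colour feature of step (1) can be illustrated with a plain-NumPy sketch. In practice one would use OpenCV's `cv2.calcHist` and `cv2.HOGDescriptor`; the bin count and the Bhattacharyya similarity below are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def hue_histogram(patch_hsv, bins=16):
    """Normalized hue histogram of an HSV patch (H channel in [0, 180), OpenCV's convention)."""
    hue = patch_hsv[..., 0].ravel()
    hist, _ = np.histogram(hue, bins=bins, range=(0, 180))
    return hist / max(hist.sum(), 1)

def bhattacharyya(h1, h2):
    """Similarity of two normalized histograms: 1.0 for identical distributions."""
    return float(np.sum(np.sqrt(h1 * h2)))
```

Matching the stored target histogram against candidate windows in each new frame would give the colour part of the re-localization in step (2).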
(2) locating the center coordinate (x_o, y_o) of the target on the image through the HOG and HSV features extracted in step 1 (the thick dot in fig. 3), and calculating the horizontal offset Δx and the vertical offset Δy between (x_o, y_o) and the view-center coordinate (x_c, y_c) of the unmanned aerial vehicle:
Δx = x_o − x_c,  Δy = y_o − y_c
Δx and Δy are used both to judge whether the unmanned aerial vehicle needs to adjust its deflection angle and speed at the current moment, and to calculate the control command in the fuzzy rules of the type-2 fuzzy controller.
(3) To reduce the "jitter" of the drone when the target's center coordinate is near the ideal value, a stable range is set: w_safe is its width, h_safe its height, and its center coincides with the center of the unmanned aerial vehicle's field of view. When |Δx| ≤ w_safe/2, the deflection angle needs no adjustment; when |Δy| ≤ h_safe/2, the speed of the unmanned aerial vehicle is 0, i.e. it needs neither to advance nor to retreat; otherwise, go to step 4;
(4) In addition, with w × h being the resolution of the unmanned aerial vehicle's view frame, the horizontal offset Δx and the vertical offset Δy are normalized, i.e. Δx′ = Δx/(w/2) and Δy′ = Δy/(h/2), so that they are scaled to [−1, 1]; when Δx′ < 0 the unmanned aerial vehicle should rotate leftwards, otherwise rightwards; similarly, when Δy′ < 0 it should retreat, otherwise advance.
(5) To drive the target's position in the image of the unmanned aerial vehicle's camera field of view toward the center, the control command must be obtained by calculation and inference, and controlling the unmanned aerial vehicle itself involves uncertainty; therefore a controller based on interval type-2 fuzzy control theory is proposed so that the unmanned aerial vehicle tracks the moving target smoothly and continuously. Taking Δx′ and Δy′ from step 4 as the inputs of the type-2 fuzzy logic controller yields its outputs, the deflection-angle command yaw′ and the speed command v′, from which the deflection angle yaw and the speed v of the unmanned aerial vehicle at the next moment are calculated:
yaw = Max_y · yaw′
v = Max_v · v′
where Max_y is the maximum deflection angle of the unmanned aerial vehicle, i.e. the deflection angle is less than π/2, and Max_v is the maximum flying speed set for the unmanned aerial vehicle; positive and negative yaw′ represent leftward and rightward deflection respectively, and positive and negative v′ represent forward and backward flight.
Then, fuzzy sets and membership functions are designed for the inputs and outputs of the type-2 fuzzy logic controller. The linguistic fuzzy sets of Δx′ and Δy′ are both {NB, NS, MM, PS, PB}: NB means the target's normalized offset from the center point is |Δx′| = 1 or |Δy′| = 1 and the target lies to the left of / above the center; NS means |Δx′| = 0.5 or |Δy′| = 0.5 and the target lies to the left of / above the center; MM means the target is exactly at the center of the image frame; PS means |Δx′| = 0.5 or |Δy′| = 0.5 and the target lies to the right of / below the center; PB means |Δx′| = 1 or |Δy′| = 1 and the target lies to the right of / below the center. The fuzzy set of yaw′ is {LB, LS, ZO, RS, RB}: LB means turning left by about π/2, LS turning left by about π/4, ZO no rotation, RS turning right by about π/4, RB turning right by about π/2. The fuzzy set of v′ is {GB, GS, ZO, BS, BB}: GB means fast forward, GS slow forward, ZO motionless, BS slow backward, BB fast backward. For ease of computation in the type-2 fuzzy logic controller, all membership functions of the input and output variables are set as Gaussian membership functions.
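A Gaussian interval type-2 membership function can be built, for example, from a Gaussian with an uncertain standard deviation. The patent states only that Gaussian membership functions are used, so the uncertain-σ construction and the parameter values below are assumptions for illustration:

```python
import math

def it2_gaussian(x, mean, sigma_lo, sigma_hi):
    """Interval type-2 Gaussian membership grade at x.

    The band between the lower grade (narrow Gaussian, sigma_lo) and the upper
    grade (wide Gaussian, sigma_hi) is the footprint of uncertainty (FOU)."""
    lower = math.exp(-0.5 * ((x - mean) / sigma_lo) ** 2)
    upper = math.exp(-0.5 * ((x - mean) / sigma_hi) ** 2)
    return lower, upper
```

For example, the NS set for Δx′ could be centred at −0.5; each rule then fires with an interval [lower, upper] of grades instead of a single grade, which is exactly what the type-reduction step described later consumes.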
Fig. 4 shows the architecture of the whole fuzzy controller designed by the invention: the controller has two branches that control the deflection angle and the speed of the unmanned aerial vehicle respectively, and each branch obtains its control command through interval type-2 fuzzy processing.
The interval type-2 fuzzy controller that controls the unmanned aerial vehicle deflection angle yaw′ satisfies:
(a)Δx'=NB,yaw'=LB;
(b)Δx'=NS,yaw'=LS;
(c)Δx'=MM,yaw'=ZO;
(d)Δx'=PS,yaw'=RS;
(e)Δx'=PB,yaw'=RB;
and the interval type-2 fuzzy controller that controls the unmanned aerial vehicle speed v′ satisfies:
(a)Δy′=NB,v'=GB;
(b)Δy′=NS,v'=GS;
(c)Δy′=MM,v'=ZO;
(d)Δy′=PS,v'=BS;
(e)Δy′=PB,v'=BB。
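The two rule bases above are single-input lookups, so they can be written directly as tables. The crisp label centres used below (±1 for the "big" labels, ±0.5 for the "small" ones, with positive yaw′ meaning left and positive v′ meaning forward, as stated in step 5) are illustrative assumptions standing in for the full interval type-2 inference:

```python
# Antecedent labels for dx' / dy' at their nominal centres
IN_CENTRES = {"NB": -1.0, "NS": -0.5, "MM": 0.0, "PS": 0.5, "PB": 1.0}

# Rule bases exactly as listed: dx' label -> yaw' label, dy' label -> v' label
YAW_RULES = {"NB": "LB", "NS": "LS", "MM": "ZO", "PS": "RS", "PB": "RB"}
V_RULES = {"NB": "GB", "NS": "GS", "MM": "ZO", "PS": "BS", "PB": "BB"}

# Assumed crisp centres for the output labels (positive yaw' = left, positive v' = forward)
YAW_CENTRES = {"LB": 1.0, "LS": 0.5, "ZO": 0.0, "RS": -0.5, "RB": -1.0}
V_CENTRES = {"GB": 1.0, "GS": 0.5, "ZO": 0.0, "BS": -0.5, "BB": -1.0}

def nearest_label(x):
    """Map a normalized offset in [-1, 1] to the closest linguistic label."""
    return min(IN_CENTRES, key=lambda k: abs(IN_CENTRES[k] - x))

def crisp_rule_step(dxn, dyn):
    """Singleton shortcut through the rule bases (the patented controller instead
    fires interval grades and applies KM type reduction)."""
    return YAW_CENTRES[YAW_RULES[nearest_label(dxn)]], V_CENTRES[V_RULES[nearest_label(dyn)]]
```

A target at the far left (Δx′ = −1) thus maps through NB → LB to a full left turn, while a target far above the center maps through NB → GB to fast forward flight.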
For the output type-2 fuzzy sets, type reduction must be performed before defuzzification, i.e. the type-2 fuzzy set is converted into a type-1 fuzzy set. The basic idea of type reduction is to represent a type-(n+1) fuzzy set by the most representative type-n fuzzy set; when n = 0 this is the operation of the defuzzifier, and the result is output as a crisp value. The type-n set representing the type-(n+1) set must effectively capture the characteristics of the original set. Type reduction of a type-2 set can be viewed as computing the centroid of every embedded type-1 set; all these centroids, together with the membership grades they generate, form a new type-1 fuzzy set, which is the output of type reduction. A typical calculation formula is:
Y(x′) = ∪_{f_n ∈ F_n(x′)} (Σ_{n=1}^N f_n y_n) / (Σ_{n=1}^N f_n) = [y_l, y_r]
where F_n(x′) = [f_n^L(x′), f_n^U(x′)] is the interval of lower- and upper-bound membership grades fired by the n-th fuzzy inference rule, and y_l and y_r are given by:
y_l = min_{f_n ∈ F_n(x′)} (Σ_{n=1}^N f_n y_n) / (Σ_{n=1}^N f_n)
y_r = max_{f_n ∈ F_n(x′)} (Σ_{n=1}^N f_n y_n) / (Σ_{n=1}^N f_n)
the Karnik-Mendel (KM) algorithm can be used to perform y separatelylAnd yrThe calculation is carried out by the following specific steps:
will be provided withy n(N ═ 1,2,. and N) are named in order, and each isy 1y 2≤...≤y NIn the invention, the speed v or the deflection angle yaw of the unmanned aerial vehicle is referred, and the following steps are calculated by taking v as an example;
(1) initialize the membership grades f_n as:
f_n = (f_n^L + f_n^U) / 2,  n = 1, 2, ..., N
then calculate:
v = (Σ_{n=1}^N f_n v_n) / (Σ_{n=1}^N f_n)
(2) find the switching point k (1 ≤ k ≤ N−1) that satisfies:
v_k ≤ v ≤ v_{k+1}
(3) set:
f_n = f_n^U for n ≤ k and f_n = f_n^L for n > k when computing v_l (the assignments are swapped when computing v_r);
then calculate:
v′ = (Σ_{n=1}^N f_n v_n) / (Σ_{n=1}^N f_n)
(4) judge whether v′ = v holds; if so, stop the computation and set v_l = v (respectively v_r = v) and L = k (respectively R = k); if not, perform step 5;
(5) set v = v′ and go back to step 3, repeating the above steps until convergence.
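Steps (1)-(5) of the KM iteration can be sketched as follows, where `y` holds the sorted output points (the v_n or yaw_n) and `f_lo`/`f_hi` the lower and upper firing grades of each rule. This is a generic KM sketch consistent with the steps above, not code from the patent:

```python
def km_endpoint(y, f_lo, f_hi, side):
    """Karnik-Mendel iteration for one endpoint of the type-reduced interval.

    side='left' computes y_l (upper grades below the switch point, lower above);
    side='right' computes y_r (the assignment is swapped)."""
    n = len(y)
    assert all(y[i] <= y[i + 1] for i in range(n - 1)), "y must be sorted"
    f = [(a + b) / 2 for a, b in zip(f_lo, f_hi)]               # step (1): midpoints
    v = sum(fi * yi for fi, yi in zip(f, y)) / sum(f)
    while True:
        k = next(i for i in range(n - 1) if y[i] <= v <= y[i + 1])  # step (2): switch point
        if side == "left":                                      # step (3): reassign grades
            f = list(f_hi[: k + 1]) + list(f_lo[k + 1:])
        else:
            f = list(f_lo[: k + 1]) + list(f_hi[k + 1:])
        v_new = sum(fi * yi for fi, yi in zip(f, y)) / sum(f)
        if abs(v_new - v) < 1e-12:                              # step (4): converged
            return v_new
        v = v_new                                               # step (5): iterate

def km_type_reduce(y, f_lo, f_hi):
    """Type-reduced interval [y_l, y_r]; its midpoint is the crisp (defuzzified) output."""
    return km_endpoint(y, f_lo, f_hi, "left"), km_endpoint(y, f_lo, f_hi, "right")
```

In the controller, `y` would hold the candidate v (or yaw) values fired by the five rules, and (y_l + y_r)/2 gives the crisp v′ or yaw′ that step (5) of the method then scales by Max_v or Max_y.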
(6) Steps 2-5 of the method are repeated so that the selected tracking target is always kept at the exact center of the image until the tracking task is completed.
With the above target tracking method, the target can be tracked accurately until the tracking task is completed; that is, the target is always kept at the exact center of the image, with the pixel error confined to the stable range (at most w_safe/2 horizontally and h_safe/2 vertically).

Claims (2)

1. A target tracking method for tracking a ground moving target by an unmanned aerial vehicle is characterized by comprising the following steps:
(1) shooting a video when the unmanned aerial vehicle flies, selecting a target to be tracked by the unmanned aerial vehicle from a frame picture of the returned video, and extracting HOG and HSV characteristics of the target;
(2) locating the center coordinate (x_o, y_o) of the target on the image through the HOG and HSV features extracted in step 1, and calculating the horizontal offset Δx and the vertical offset Δy between (x_o, y_o) and the view-center coordinate (x_c, y_c) of the unmanned aerial vehicle:
Δx = x_o − x_c,  Δy = y_o − y_c
(3) when |Δx| ≤ w_safe/2, the deflection angle of the unmanned aerial vehicle does not need to be adjusted; when |Δy| ≤ h_safe/2, the speed of the unmanned aerial vehicle is 0, i.e. it needs neither to advance nor to retreat; otherwise, go to step 4; here w_safe is the width of the stable range, h_safe is its height, and the center of the stable range coincides with the center of the unmanned aerial vehicle's field of view;
(4) normalizing the horizontal offset Δx and the vertical offset Δy, i.e. Δx′ = Δx/(w/2) and Δy′ = Δy/(h/2), so that Δx and Δy are scaled to [−1, 1]; when Δx′ < 0 the unmanned aerial vehicle should rotate leftwards, otherwise rightwards; similarly, when Δy′ < 0 it should retreat, otherwise advance; w × h is the resolution of the unmanned aerial vehicle's view frame;
(5) taking Δx′ and Δy′ from step 4 as the inputs of the type-2 fuzzy logic controller to obtain its outputs, the deflection-angle command yaw′ and the speed command v′, from which the deflection angle yaw and the speed v of the unmanned aerial vehicle at the next moment are calculated:
yaw = Max_y · yaw′
v = Max_v · v′
where Max_y is the maximum deflection angle of the unmanned aerial vehicle, i.e. the deflection angle is less than π/2, and Max_v is the maximum flying speed set for the unmanned aerial vehicle; positive and negative yaw′ represent leftward and rightward deflection respectively, and positive and negative v′ represent forward and backward flight;
the linguistic fuzzy sets of Δx' and Δy' are {NB, NS, MM, PS, PB}, where NB denotes that the normalized offset of the target from the center point in the horizontal or vertical direction is |Δx'| = 1 or |Δy'| = 1 and the target lies to the left of / above the center point; NS denotes |Δx'| = 0.5 or |Δy'| = 0.5 with the target to the left of / above the center point; MM denotes that the target lies exactly at the center of the image frame; PS denotes |Δx'| = 0.5 or |Δy'| = 0.5 with the target to the right of / below the center point; and PB denotes |Δx'| = 1 or |Δy'| = 1 with the target to the right of / below the center point; the fuzzy set of yaw' is {LB, LS, ZO, RS, RB}, where LB denotes turning left by about π/2, LS denotes turning left by about π/4, ZO denotes no rotation, RS denotes turning right by about π/4, and RB denotes turning right by about π/2; the fuzzy set of v' is {GB, GS, ZO, BS, BB}, where GB denotes fast forward, GS denotes slow forward, ZO denotes motionless, BS denotes slow backward, and BB denotes fast backward.
(6) repeating steps (2)–(5) so that the selected tracking target is always kept at the exact center of the image until the tracking task is completed.
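A minimal Python sketch of the control loop in steps (2)–(6); the frame resolution, stable-range size, and maximum speed here are illustrative assumptions, and `fuzzy_controller` is a placeholder for the interval type-2 fuzzy logic controller of step (5):

```python
import math

W, H = 1280, 720              # assumed view-frame resolution (w x h)
W_SAFE, H_SAFE = 120.0, 80.0  # assumed stable-range width and height in pixels
MAX_Y = math.pi / 2           # maximum deflection-angle magnitude from step (5)
MAX_V = 5.0                   # assumed maximum flight speed in m/s

def control_step(target_center, fuzzy_controller):
    """One pass of steps (2)-(5): offset, dead-band check, normalize, control."""
    xo, yo = target_center
    xc, yc = W / 2, H / 2                  # center of the drone's field of view
    dx, dy = xo - xc, yo - yc              # step (2): horizontal/vertical offsets

    yaw = v = 0.0
    if abs(dx) > W_SAFE / 2 or abs(dy) > H_SAFE / 2:
        dx_n = dx / (W / 2)                # step (4): normalize to [-1, 1]
        dy_n = dy / (H / 2)
        yaw_n, v_n = fuzzy_controller(dx_n, dy_n)  # step (5): controller outputs
        if abs(dx) > W_SAFE / 2:           # step (3): adjust heading only
            yaw = MAX_Y * yaw_n            #   when outside the stable width
        if abs(dy) > H_SAFE / 2:           # step (3): advance/retreat only
            v = MAX_V * v_n                #   when outside the stable height
    return yaw, v
```

Calling `control_step` once per frame with the tracker's target center reproduces one iteration of the loop; a real implementation would substitute the trained type-2 controller for the placeholder.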
2. The target tracking method according to claim 1, wherein the interval type-2 fuzzy controller controlling the yaw angle yaw' of the unmanned aerial vehicle satisfies:
(a) Δx' = NB, yaw' = LB;
(b) Δx' = NS, yaw' = LS;
(c) Δx' = MM, yaw' = ZO;
(d) Δx' = PS, yaw' = RS;
(e) Δx' = PB, yaw' = RB;
and the interval type-2 fuzzy controller controlling the speed v' of the unmanned aerial vehicle satisfies:
(a) Δy' = NB, v' = GB;
(b) Δy' = NS, v' = GS;
(c) Δy' = MM, v' = ZO;
(d) Δy' = PS, v' = BS;
(e) Δy' = PB, v' = BB.
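The one-to-one rules above can be illustrated with a type-1 simplification that ignores the footprint of uncertainty of the interval type-2 sets; the triangular memberships and singleton output values below are illustrative assumptions, not the patent's actual membership functions:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Input labels NB, NS, MM, PS, PB over [-1, 1], peaking at -1, -0.5, 0, 0.5, 1.
IN_SETS = [(-2, -1, -0.5), (-1, -0.5, 0), (-0.5, 0, 0.5), (0, 0.5, 1), (0.5, 1, 2)]
# Output singletons for yaw' matching rules (a)-(e): LB, LS, ZO, RS, RB.
OUT_YAW = [-1.0, -0.5, 0.0, 0.5, 1.0]

def yaw_prime(dx_n):
    """Fire the five rules on a normalized offset and defuzzify by weighted average."""
    weights = [tri(dx_n, *s) for s in IN_SETS]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, OUT_YAW)) / total if total else 0.0
```

The same rule-table shape applies to the speed controller, with the GB/GS/ZO/BS/BB singletons replacing the yaw' ones; a full interval type-2 implementation would replace each triangle with an upper/lower membership pair and a Karnik–Mendel type-reduction step.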
CN202011154817.6A 2020-10-26 2020-10-26 Target tracking method for tracking ground moving target by unmanned aerial vehicle Active CN112270250B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011154817.6A CN112270250B (en) 2020-10-26 2020-10-26 Target tracking method for tracking ground moving target by unmanned aerial vehicle


Publications (2)

Publication Number Publication Date
CN112270250A true CN112270250A (en) 2021-01-26
CN112270250B CN112270250B (en) 2024-04-09

Family

ID=74342271

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011154817.6A Active CN112270250B (en) 2020-10-26 2020-10-26 Target tracking method for tracking ground moving target by unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN112270250B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150229841A1 (en) * 2012-09-18 2015-08-13 Hangzhou Hikvision Digital Technology Co., Ltd. Target tracking method and system for intelligent tracking high speed dome camera
CN105467833A (en) * 2015-12-07 2016-04-06 南京航空航天大学 A non-linear self-adaptive flight control method
CN110083167A (en) * 2019-06-05 2019-08-02 浙江大华技术股份有限公司 A kind of path following method and device of mobile robot
CN110825108A (en) * 2019-11-11 2020-02-21 浙江理工大学 Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace
CN111650831A (en) * 2019-05-29 2020-09-11 北京航空航天大学 Design of interval 2 type fuzzy logic controller of virtual flexible needle in medical robot controller


Non-Patent Citations (1)

Title
TANG Hongru; SONG Aiguo: "Research on fuzzy path tracking behavior of a reconnaissance robot based on multi-sensor information fusion", Chinese Journal of Sensors and Actuators, no. 08, pages 1809-1814 *

Also Published As

Publication number Publication date
CN112270250B (en) 2024-04-09

Similar Documents

Publication Publication Date Title
Shi et al. Decoupled visual servoing with fuzzy Q-learning
Lee et al. A practical fuzzy logic controller for the path tracking of wheeled mobile robots
He et al. On the design and use of a micro air vehicle to track and avoid adversaries
CN110825108B (en) Cooperative anti-collision method for multiple tracking unmanned aerial vehicles in same airspace
Xu et al. Vision-based autonomous landing of unmanned aerial vehicle on a motional unmanned surface vessel
Eguíluz et al. Asynchronous event-based line tracking for time-to-contact maneuvers in uas
Tran et al. A Vision-based Method for Autonomous Landing on a Target with a Quadcopter
Chen et al. A review of autonomous obstacle avoidance technology for multi-rotor UAVs
Qadir et al. Vision based neuro-fuzzy controller for a two axes gimbal system with small UAV
Durdevic et al. Vision aided navigation of a quad-rotor for autonomous wind-farm inspection
Kim et al. A deep-learning-aided automatic vision-based control approach for autonomous drone racing in game of drones competition
Mutz et al. Following the leader using a tracking system based on pre-trained deep neural networks
Guan et al. Formation Tracking of Mobile Robots Under Obstacles Using Only an Active RGB-D Camera
Wang et al. An integrated teleoperation assistance system for collision avoidance of high-speed uavs in complex environments
Hoshino et al. End-to-end discrete motion planner based on deep neural network for autonomous mobile robots
CN112270250A (en) Target tracking method for tracking ground moving target by unmanned aerial vehicle
Flores-Delgado et al. Embedded control using monocular vision: Face tracking
Shastry et al. Autonomous detection and tracking of a high-speed ground vehicle using a quadrotor UAV
Wang et al. Safer uav piloting: A robust sense-and-avoid solution for remotely piloted quadrotor uavs in complex environments
Boumehraz et al. Vision based tracking and interception of moving target by mobile robot using fuzzy control
Pant et al. Obstacle Avoidance Method for UAVs using Polar Grid
Zhang et al. Vision-based moving target interception with a mobile robot based on motion prediction and online planning
Kurdi et al. Trajectory and Motion for Agricultural Robot
CN114326766A (en) Vehicle-mounted machine cooperative autonomous tracking and landing method
Olivares-Mendez et al. Vision based fuzzy control approaches for unmanned aerial vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant