CN110610512A - Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm - Google Patents

Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm

Info

Publication number
CN110610512A
CN110610512A (application CN201910849702.XA)
Authority
CN
China
Prior art keywords
target
center
neural network
unmanned aerial
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910849702.XA
Other languages
Chinese (zh)
Other versions
CN110610512B (en)
Inventor
陈刚
刘永琦
聂良丞
董锐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201910849702.XA priority Critical patent/CN110610512B/en
Publication of CN110610512A publication Critical patent/CN110610512A/en
Application granted granted Critical
Publication of CN110610512B publication Critical patent/CN110610512B/en
Legal status: Active


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088 Control of position, course or altitude characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D1/12 Target-seeking control
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Game Theory and Decision Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an unmanned aerial vehicle target tracking method based on a BP neural network fusion Kalman filtering algorithm, comprising the following steps: constructing a data set for training a BP neural network; constructing and training online a BP neural network that predicts the position coordinates of an occluded target's center; predicting the position coordinates of the occluded target's center by fusing the BP neural network with a Kalman filtering algorithm; and having the onboard computer compute the flight speed from the deviation between the target-center and image-center coordinates and, through the ROS operating system, send velocity commands directly to a flight control system working in offboard mode, thereby realizing target tracking. Experimental tests show that combining the BP neural network with the Kalman filtering algorithm allows the center position of an occluded target to be predicted accurately while the unmanned aerial vehicle tracks it, and that having the onboard computer send velocity commands directly greatly improves the real-time performance of unmanned aerial vehicle target tracking.

Description

Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm
Technical Field
The invention relates to the field of target tracking of unmanned aerial vehicles, in particular to an unmanned aerial vehicle target tracking method based on a BP neural network fusion Kalman filtering algorithm.
Background
To meet the requirements of future special operations and urban street combat, unmanned aerial vehicle technology has developed rapidly. By carrying a camera, an unmanned aerial vehicle can perform target tracking tasks in areas that are dangerous or inaccessible to humans. During tracking, however, the vehicle typically faces illumination changes, occlusion of the target by obstacles, communication delay, and similar problems, which ultimately cause it to lose the target. How to accurately predict the position coordinates of an occluded target and control the unmanned aerial vehicle to track it in real time has therefore become an urgent problem in the field of unmanned aerial vehicle target tracking.
Disclosure of Invention
The invention aims to overcome the defects and shortcomings of the prior art by providing an unmanned aerial vehicle target tracking method based on a BP neural network fusion Kalman filtering algorithm, thereby effectively solving the problems of tracking failure and poor real-time performance that arise when the tracked target is occluded by an obstacle.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
An unmanned aerial vehicle target tracking method based on a BP neural network fusion Kalman filtering algorithm comprises: constructing a data set for training a BP neural network; constructing, on the onboard computer, a BP neural network for predicting the position coordinates of an occluded target's center and training it online in real time; fusing the target-center positions predicted by the BP neural network and by the Kalman filtering algorithm to obtain the position coordinates of the occluded target's center; and finally having the onboard computer send velocity commands to the flight control system in real time to control the unmanned aerial vehicle and realize target-tracking flight. The method specifically comprises the following steps:
1) Constructing a data set for training the BP neural network: the onboard computer constructs the data set from the target-center position coordinates obtained by the target tracking algorithm while the unmanned aerial vehicle tracks a target that is not occluded by any obstacle; the specific steps are as follows:
1-1) First, while the unmanned aerial vehicle tracks an unoccluded target, the target tracking algorithm computes in real time the position coordinates x and y of the tracked target's center in each frame of the tracking image; the target-center positions of the latest N frames are then stored in a data container, which is updated in real time as the target moves.
1-2) The input and output coordinate differences for training the BP (back propagation) neural network are constructed from the real-time-updated target-center positions of the N frames in the data container: first, the target-center positions of the first N-1 frames are taken from the container and the positions in adjacent frames are subtracted to construct N-2 groups of input coordinate differences; then the target-center positions of the last N-3 frames are taken from the container and the positions in adjacent frames are subtracted to construct N-4 groups of output coordinate differences.
2) Training the BP neural network and predicting the occluded target's center position in real time: the BP neural network is trained online with the constructed input and output coordinate differences, and when the target is occluded by an obstacle, the prediction of the BP neural network is fused with the prediction of the Kalman filtering algorithm to obtain the position coordinates of the occluded target's center; the specific steps are as follows:
2-1) First, the BP neural network is trained online with the input and output coordinate differences constructed in step 1-2). Then, while the unmanned aerial vehicle tracks the target, occlusion is detected by computing in real time the Bhattacharyya coefficient between the current target color histogram and the initially selected target color histogram; when the coefficient exceeds a set threshold, the target in the current frame, i.e., the k-th frame, is judged to be occluded by an obstacle. Next, the tracked target-center positions in the two frames preceding the k-th frame, i.e., the (k-1)-th and (k-2)-th frames, are subtracted to construct one group of input coordinate differences, which is fed into the BP neural network to predict one group of output coordinate differences. Finally, the target-center position in the (k-1)-th frame is added to the coordinate difference output by the BP neural network, giving the BP neural network's prediction of the occluded target's center position in the k-th frame.
2-2) The position coordinates of the occluded target's center are then also predicted with a Kalman filter according to its state prediction equation x_k = A·x_{k-1} + B_k·u_k + w_k, where x_k and x_{k-1} denote the motion states of the target in the k-th and (k-1)-th frame images (the motion state comprises the position coordinates, velocity, and acceleration of the target's center), A is the state transition matrix from the (k-1)-th frame to the k-th frame, B_k is the control matrix of the tracked target's motion system, u_k is the control input of that system, and w_k is its noise. When the tracked target is occluded by an obstacle, the Kalman filter uses the state prediction equation to predict the target-center position in the k-th frame from the target-center position in the (k-1)-th frame; finally, the occluded target-center position in the k-th frame predicted by the BP neural network and that predicted by the Kalman filtering algorithm from the state prediction equation are fused to obtain the position coordinates of the obstacle-occluded target's center in the k-th frame.
3) The onboard computer controls the unmanned aerial vehicle to fly after the target: the onboard computer controls the vehicle's flight speed according to the deviation between the target-center and image-center position coordinates, finally realizing target-tracking flight; the specific steps are as follows:
3-1) First, using a PID controller, the onboard computer computes the vehicle's flight speeds along the x and y directions from the x- and y-direction deviations between the current target-center position and the image-center coordinates; the onboard computer then sends these flight speeds directly to the flight control system in real time, finally controlling the unmanned aerial vehicle to achieve real-time target-tracking flight.
In step 1-2), the BP neural network comprises an input layer, a hidden layer, and an output layer; the input layer contains four neurons, the hidden layer nine neurons, and the output layer two neurons. The real-time-updated data container used to construct the training data set holds six frames of target-center positions, i.e., N equals 6.
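The 4-9-2 architecture described above can be sketched as a minimal backpropagation network. This is an illustrative sketch, not the patent's implementation: the layer sizes come from the text, but the tanh activation, learning rate, and weight initialization are assumptions.

```python
import numpy as np

class BPNetwork:
    """Minimal 4-9-2 backpropagation network (layer sizes from the text;
    tanh activation and the learning rate are illustrative assumptions)."""

    def __init__(self, n_in=4, n_hidden=9, n_out=2, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.h = np.tanh(x @ self.W1 + self.b1)    # hidden activations
        return self.h @ self.W2 + self.b2          # linear output layer

    def train_step(self, x, target):
        """One gradient-descent step on squared error; returns the loss."""
        y = self.forward(x)
        err = y - target                           # dLoss/dy for 0.5*||y-t||^2
        # Backpropagate through the output layer, then the hidden layer
        dW2 = np.outer(self.h, err)
        db2 = err
        dh = (err @ self.W2.T) * (1.0 - self.h ** 2)   # tanh derivative
        dW1 = np.outer(x, dh)
        db1 = dh
        self.W2 -= self.lr * dW2; self.b2 -= self.lr * db2
        self.W1 -= self.lr * dW1; self.b1 -= self.lr * db1
        return 0.5 * float(err @ err)
```

In use, the four inputs would be two consecutive coordinate-difference pairs and the two outputs the predicted next difference, matching the 4-9-2 sizing above.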
In step 2-2), the specific operation of fusing the occluded target-center position in the k-th frame predicted by the BP neural network with that predicted by the Kalman filtering algorithm from the state prediction equation is as follows: the position predicted by the BP neural network is used as the measurement value of the Kalman filter and fused with the position predicted by the Kalman filtering algorithm from the state prediction equation, finally yielding the position coordinates of the obstacle-occluded target's center in the k-th frame image.
In step 3-1), the flight control system works in offboard flight mode; the onboard computer is connected to the flight control system by a USB data cable and sends the flight speeds to it in real time through the ROS operating system, finally realizing target-tracking flight of the unmanned aerial vehicle.
Compared with the prior art, the invention has the following advantages:
1. The target tracking method combines the advantages of a BP neural network and a Kalman filtering algorithm, so the center position of a target occluded by an obstacle during tracking can be predicted accurately, improving the stability of unmanned aerial vehicle target tracking.
2. The airborne computer directly sends speed instructions to the unmanned aerial vehicle flight control system through the ROS operating system, so that the data transmission time is saved, and the real-time performance of target tracking of the unmanned aerial vehicle is improved.
Detailed Description
The present invention will be described in further detail with reference to specific embodiments.
The unmanned aerial vehicle target tracking method based on the BP neural network fusion Kalman filtering algorithm comprises: constructing a data set for training the BP neural network; constructing, on the onboard computer, a BP neural network for predicting the center position coordinates of an occluded target and training it online in real time; fusing the target-center positions predicted by the BP neural network and by the Kalman filtering algorithm to obtain the center position coordinates of the occluded target; and finally having the onboard computer use the ROS operating system to send velocity commands to the flight control system in real time to control the unmanned aerial vehicle and realize target-tracking flight. The specific implementation process is as follows:
1) Constructing a data set for training the BP neural network: the onboard computer constructs the data set from the target-center position coordinates obtained by the target tracking algorithm while the unmanned aerial vehicle tracks a target that is not occluded by any obstacle; the specific steps are as follows:
1-1) First, while the unmanned aerial vehicle tracks an unoccluded target, the target tracking algorithm computes in real time the position coordinates x and y of the tracked target's center in each frame of the tracking image; the target-center positions of the latest six (but not limited to six) frames are then stored in a data container, which is updated in real time as the target moves.
1-2) The input and output coordinate differences for training the BP (back propagation) neural network are constructed from the real-time-updated target-center positions of the six (but not limited to six) frames in the data container: first, the target-center positions of the first five (but not limited to five) frames are taken from the container and the positions in adjacent frames are subtracted to construct four (but not limited to four) groups of input coordinate differences; then the target-center positions of the last three (but not limited to three) frames are taken from the container and the positions in adjacent frames are subtracted to construct two (but not limited to two) groups of output coordinate differences.
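The difference construction in step 1-2) can be sketched as follows. Note this reproduces only the counts stated for N = 6 (four input groups, two output groups); how the groups are paired into individual training samples is not fully specified by the text, so that pairing is left out.

```python
def build_difference_sets(centers):
    """centers: list of (x, y) target-center coordinates for the latest
    N frames, oldest first. Returns the N-2 input differences taken from
    the first N-1 frames and the N-4 output differences taken from the
    last N-3 frames, as described in step 1-2)."""
    n = len(centers)
    first = centers[:n - 1]                      # first N-1 frames
    input_diffs = [(b[0] - a[0], b[1] - a[1])    # adjacent-frame differences
                   for a, b in zip(first, first[1:])]
    last = centers[-(n - 3):]                    # last N-3 frames
    output_diffs = [(b[0] - a[0], b[1] - a[1])
                    for a, b in zip(last, last[1:])]
    return input_diffs, output_diffs
```

For a six-frame buffer this yields four input difference groups and two output difference groups, matching the counts in the embodiment.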
2) Training the BP neural network and predicting the occluded target's center position in real time: the BP neural network is trained online with the constructed input and output coordinate differences, and when the target is occluded by an obstacle, the prediction of the BP neural network is fused with the prediction of the Kalman filtering algorithm to obtain the position coordinates of the occluded target's center; the specific steps are as follows:
2-1) First, the BP neural network is trained online with the input and output coordinate differences constructed in step 1-2). Then, while the unmanned aerial vehicle tracks the target, occlusion is detected by computing in real time the Bhattacharyya coefficient between the current target color histogram and the initially selected target color histogram; when the coefficient exceeds 0.7 (but not limited to 0.7), the target in the current frame, i.e., the k-th frame, is judged to be occluded by an obstacle. Next, the tracked target-center positions in the two frames preceding the k-th frame, i.e., the (k-1)-th and (k-2)-th frames, are subtracted to construct one group of input coordinate differences, which is fed into the BP neural network to predict one group of output coordinate differences. Finally, the target-center position in the (k-1)-th frame is added to the coordinate difference output by the BP neural network, giving the BP neural network's prediction of the occluded target's center position in the k-th frame.
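The occlusion test in step 2-1) compares color histograms with the Bhattacharyya coefficient. A minimal sketch follows; the 0.7 threshold is from the text. Note that the coefficient as defined below is larger for more similar histograms, so some implementations use the Bhattacharyya distance instead; the comparison direction here simply follows the text as written.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient of two histograms: sum of sqrt(p_i * q_i)
    over bins after normalization. 1.0 means identical distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))

def target_occluded(current_hist, initial_hist, threshold=0.7):
    # Per step 2-1): occlusion is declared when the coefficient between
    # the current and initial target color histograms crosses the threshold.
    return bhattacharyya_coefficient(current_hist, initial_hist) > threshold
```

In a real pipeline the histograms would come from the tracked region's color channels (e.g. an HSV histogram of the target window); the bare lists here are placeholders.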
2-2) The position coordinates of the occluded target's center are then also predicted with a Kalman filter according to its state prediction equation x_k = A·x_{k-1} + B_k·u_k + w_k, where x_k and x_{k-1} denote the motion states of the target in the k-th and (k-1)-th frame images (the motion state comprises the position coordinates, velocity, and acceleration of the target's center), A is the state transition matrix from the (k-1)-th frame to the k-th frame, B_k is the control matrix of the tracked target's motion system, u_k is the control input of that system, and w_k is its noise. When the tracked target is occluded by an obstacle, the Kalman filter uses the state prediction equation to predict the target-center position in the k-th frame from the target-center position in the (k-1)-th frame; finally, the occluded target-center position in the k-th frame predicted by the BP neural network and that predicted by the Kalman filtering algorithm from the state prediction equation are fused to obtain the position coordinates of the obstacle-occluded target's center in the k-th frame.
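Step 2-2) can be sketched with a small Kalman filter in pixel coordinates, with the BP-network prediction supplied as the measurement in the update (fusion) step. This is a sketch under assumptions: a constant-velocity state is used here although the text's state also includes acceleration, the control input is zero (B_k·u_k = 0), and the covariances Q and R are illustrative.

```python
import numpy as np

def kalman_predict(x, P, A, Q):
    """State prediction x_k = A x_{k-1} (+ w_k); no control input here."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    return x_pred, P_pred

def kalman_update(x_pred, P_pred, z, H, R):
    """Fuse the BP-network prediction z as the filter's measurement."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new

# Constant-velocity model: state = [px, py, vx, vy], dt = one frame interval
dt = 1.0
A = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)            # measure position only
Q = np.eye(4) * 0.01                                 # process noise (assumed)
R = np.eye(2) * 1.0                                  # measurement noise (assumed)
```

The fused estimate lands between the filter's own motion-model prediction and the BP-network position, weighted by the Kalman gain.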
3) The onboard computer controls the unmanned aerial vehicle to fly after the target: the onboard computer controls the vehicle's flight speed according to the deviation between the target-center and image-center position coordinates, finally realizing target-tracking flight; the specific steps are as follows:
3-1) First, using a PID controller, the onboard computer computes the vehicle's flight speeds along the x and y directions from the x- and y-direction deviations between the current target-center position and the image-center coordinates; the onboard computer then sends these flight speeds directly to the flight control system in real time through the ROS operating system, finally controlling the unmanned aerial vehicle to achieve real-time target-tracking flight.
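The velocity computation in step 3-1) can be sketched with one PID controller per image axis acting on the pixel deviation between the target center and the image center. The gains and time step below are illustrative assumptions; the patent does not specify them.

```python
class AxisPID:
    """One PID controller per image axis; the output is the commanded
    flight speed along that axis (gains are illustrative assumptions)."""

    def __init__(self, kp=0.005, ki=0.0001, kd=0.001):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt=1.0 / 30.0):        # error in pixels, dt per frame
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def velocity_command(target_center, image_center, pid_x, pid_y):
    # Deviation between the target-center and image-center coordinates
    ex = target_center[0] - image_center[0]
    ey = target_center[1] - image_center[1]
    return pid_x.step(ex), pid_y.step(ey)
```

A target to the right of and below the image center produces positive vx and vy commands, steering the vehicle to re-center the target.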
As a preferred embodiment of the present invention, in step 2-2), the specific operation of fusing the occluded target-center position in the k-th frame predicted by the BP neural network with that predicted by the Kalman filtering algorithm from the state prediction equation is as follows: the position predicted by the BP neural network is used as the measurement value of the Kalman filter and fused with the position predicted by the Kalman filtering algorithm from the state prediction equation, finally yielding the position coordinates of the obstacle-occluded target's center in the k-th frame image.
In step 3-1), the flight control system works in offboard flight mode; the onboard computer is connected to the flight control system by a USB data cable and sends the flight speeds to it in real time through the ROS operating system, finally realizing target-tracking flight of the unmanned aerial vehicle.
The above-described embodiments are only preferred embodiments of the present invention and are not intended to limit its scope; any equivalent changes made to the procedures, principles, or BP neural network structure of the present invention fall within the scope of protection of the present invention.

Claims (4)

1. An unmanned aerial vehicle target tracking method based on a BP neural network fusion Kalman filtering algorithm, characterized by comprising: constructing a data set for training a BP neural network; constructing, on the onboard computer, a BP neural network for predicting the position coordinates of an occluded target's center and training it online in real time; fusing the target-center positions predicted by the BP neural network and by the Kalman filtering algorithm to obtain the position coordinates of the occluded target's center; and finally having the onboard computer send velocity commands to the flight control system in real time to control the unmanned aerial vehicle and realize target-tracking flight; the method specifically comprises the following steps:
1) Constructing a data set for training the BP neural network: the onboard computer constructs the data set from the target-center position coordinates obtained by the target tracking algorithm while the unmanned aerial vehicle tracks a target that is not occluded by any obstacle; the specific steps are as follows:
1-1) First, while the unmanned aerial vehicle tracks an unoccluded target, the target tracking algorithm computes in real time the position coordinates x and y of the tracked target's center in each frame of the tracking image; the target-center positions of the latest N frames are then stored in a data container, which is updated in real time as the target moves.
1-2) The input and output coordinate differences for training the BP (back propagation) neural network are constructed from the real-time-updated target-center positions of the N frames in the data container: first, the target-center positions of the first N-1 frames are taken from the container and the positions in adjacent frames are subtracted to construct N-2 groups of input coordinate differences; then the target-center positions of the last N-3 frames are taken from the container and the positions in adjacent frames are subtracted to construct N-4 groups of output coordinate differences.
2) Training the BP neural network and predicting the occluded target's center position in real time: the BP neural network is trained online with the constructed input and output coordinate differences, and when the target is occluded by an obstacle, the prediction of the BP neural network is fused with the prediction of the Kalman filtering algorithm to obtain the position coordinates of the occluded target's center; the specific steps are as follows:
2-1) First, the BP neural network is trained online with the input and output coordinate differences constructed in step 1-2). Then, while the unmanned aerial vehicle tracks the target, occlusion is detected by computing in real time the Bhattacharyya coefficient between the current target color histogram and the initially selected target color histogram; when the coefficient exceeds a set threshold, the target in the current frame, i.e., the k-th frame, is judged to be occluded by an obstacle. Next, the tracked target-center positions in the two frames preceding the k-th frame, i.e., the (k-1)-th and (k-2)-th frames, are subtracted to construct one group of input coordinate differences, which is fed into the BP neural network to predict one group of output coordinate differences. Finally, the target-center position in the (k-1)-th frame is added to the coordinate difference output by the BP neural network, giving the BP neural network's prediction of the occluded target's center position in the k-th frame.
2-2) The position coordinates of the occluded target's center are then also predicted with a Kalman filter according to its state prediction equation x_k = A·x_{k-1} + B_k·u_k + w_k, where x_k and x_{k-1} denote the motion states of the target in the k-th and (k-1)-th frame images (the motion state comprises the position coordinates, velocity, and acceleration of the target's center), A is the state transition matrix from the (k-1)-th frame to the k-th frame, B_k is the control matrix of the tracked target's motion system, u_k is the control input of that system, and w_k is its noise. When the tracked target is occluded by an obstacle, the Kalman filter uses the state prediction equation to predict the target-center position in the k-th frame from the target-center position in the (k-1)-th frame; finally, the occluded target-center position in the k-th frame predicted by the BP neural network and that predicted by the Kalman filtering algorithm from the state prediction equation are fused to obtain the position coordinates of the obstacle-occluded target's center in the k-th frame.
3) The onboard computer controls the unmanned aerial vehicle to fly after the target: the onboard computer controls the vehicle's flight speed according to the deviation between the target-center and image-center position coordinates, finally realizing target-tracking flight; the specific steps are as follows:
3-1) First, using a PID controller, the onboard computer computes the vehicle's flight speeds along the x and y directions from the x- and y-direction deviations between the current target-center position and the image-center coordinates; the onboard computer then sends these flight speeds directly to the flight control system in real time, finally controlling the unmanned aerial vehicle to achieve real-time target-tracking flight.
2. The unmanned aerial vehicle target tracking method based on the BP neural network fusion Kalman filtering algorithm according to claim 1, characterized in that: in step 1-2), the BP neural network comprises an input layer, a hidden layer, and an output layer; the input layer contains four neurons, the hidden layer nine neurons, and the output layer two neurons; the real-time-updated data container used to construct the training data set holds six frames of target-center positions, i.e., N equals 6.
3. The unmanned aerial vehicle target tracking method based on the BP neural network fusion Kalman filtering algorithm according to claim 1, characterized in that: in step 2-2), the specific operation of fusing the position coordinates of the center of the occluded target in the k-th frame image predicted by the BP neural network with those predicted by the Kalman filtering algorithm according to the state prediction equation is as follows: the position coordinates of the center of the occluded target in the k-th frame image predicted by the BP neural network are taken as the measurement value of the Kalman filter and fused with the position coordinates of the center of the occluded target in the k-th frame image predicted by the Kalman filtering algorithm according to the state prediction equation, finally predicting the position coordinates of the center of the target occluded by the obstacle in the k-th frame image.
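The fusion described in this claim — the BP-predicted center treated as the Kalman filter's measurement — corresponds to the standard measurement-update equations. A sketch under the same assumed 6-state constant-acceleration model (the observation matrix H picks out the two center coordinates; the measurement-noise covariance r is a hypothetical tuning value):

```python
import numpy as np

def kalman_update(x_pred, P_pred, z_bp, r=4.0):
    """Fuse the BP-predicted center z_bp (used as the measurement)
    with the Kalman-predicted state via the standard update step."""
    H = np.zeros((2, 6)); H[0, 0] = H[1, 1] = 1.0   # observe (px, py) only
    R = r * np.eye(2)                               # measurement-noise covariance (assumed)
    S = H @ P_pred @ H.T + R                        # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x_pred + K @ (z_bp - H @ x_pred)            # fused state estimate
    P = (np.eye(6) - K @ H) @ P_pred
    return x, P
```

The fused center lands between the Kalman-predicted center and the BP-predicted one, weighted by the two covariances: a confident filter (small P_pred) leans toward its own prediction, a noisy one toward the network's.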
4. The unmanned aerial vehicle target tracking method based on the BP neural network fusion Kalman filtering algorithm according to claim 1, characterized in that: in step 3-1), the unmanned aerial vehicle flight control system works in offboard flight mode; the airborne computer is connected to the unmanned aerial vehicle flight control system through a USB data cable and sends the flight speed to it in real time through the ROS operating system, finally realizing target tracking by the unmanned aerial vehicle.
CN201910849702.XA 2019-09-09 2019-09-09 Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm Active CN110610512B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910849702.XA CN110610512B (en) 2019-09-09 2019-09-09 Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910849702.XA CN110610512B (en) 2019-09-09 2019-09-09 Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm

Publications (2)

Publication Number Publication Date
CN110610512A true CN110610512A (en) 2019-12-24
CN110610512B CN110610512B (en) 2021-07-27

Family

ID=68892560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910849702.XA Active CN110610512B (en) 2019-09-09 2019-09-09 Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm

Country Status (1)

Country Link
CN (1) CN110610512B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180114072A1 (en) * 2016-10-25 2018-04-26 Vmaxx, Inc. Vision Based Target Tracking Using Tracklets
CN108053427A (en) * 2017-10-31 2018-05-18 深圳大学 A kind of modified multi-object tracking method, system and device based on KCF and Kalman
CN109145836A (en) * 2018-08-28 2019-01-04 武汉大学 Ship target video detection method based on deep learning network and Kalman filtering
CN109785363A (en) * 2018-12-29 2019-05-21 中国电子科技集团公司第五十二研究所 A kind of unmanned plane video motion Small object real-time detection and tracking
CN109919981A (en) * 2019-03-11 2019-06-21 南京邮电大学 A kind of multi-object tracking method of the multiple features fusion based on Kalman filtering auxiliary


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111476116A (en) * 2020-03-24 2020-07-31 南京新一代人工智能研究院有限公司 Rotor unmanned aerial vehicle system for vehicle detection and tracking and detection and tracking method
WO2021189507A1 (en) * 2020-03-24 2021-09-30 南京新一代人工智能研究院有限公司 Rotor unmanned aerial vehicle system for vehicle detection and tracking, and detection and tracking method
CN112435275A (en) * 2020-12-07 2021-03-02 中国电子科技集团公司第二十研究所 Unmanned aerial vehicle maneuvering target tracking method integrating Kalman filtering and DDQN algorithm
CN113253755A (en) * 2021-05-08 2021-08-13 广东白云学院 Neural network-based rotor unmanned aerial vehicle tracking algorithm
CN113467462A (en) * 2021-07-14 2021-10-01 中国人民解放军国防科技大学 Pedestrian accompanying control method and device for robot, mobile robot and medium
CN113467462B (en) * 2021-07-14 2023-04-07 中国人民解放军国防科技大学 Pedestrian accompanying control method and device for robot, mobile robot and medium
CN116048120B (en) * 2023-01-10 2024-04-16 中国建筑一局(集团)有限公司 Autonomous navigation system and method for small four-rotor unmanned aerial vehicle in unknown dynamic environment

Also Published As

Publication number Publication date
CN110610512B (en) 2021-07-27

Similar Documents

Publication Publication Date Title
CN110610512B (en) Unmanned aerial vehicle target tracking method based on BP neural network fusion Kalman filtering algorithm
US11726477B2 (en) Methods and systems for trajectory forecasting with recurrent neural networks using inertial behavioral rollout
CN111273668B (en) Unmanned vehicle motion track planning system and method for structured road
CN110362083B (en) Autonomous navigation method under space-time map based on multi-target tracking prediction
CN111651705B (en) Cluster formation tracking control method
WO2022257283A1 (en) Vehicle obstacle avoidance method and apparatus, electronic device, and storage medium
CN113682318B (en) Vehicle running control method and device
CN111897316A (en) Multi-aircraft autonomous decision-making method under scene fast-changing condition
US11577723B2 (en) Object trajectory association and tracking
CN113635912B (en) Vehicle control method, device, equipment, storage medium and automatic driving vehicle
CN111679660A (en) Unmanned deep reinforcement learning method integrating human-like driving behaviors
CN114442630A (en) Intelligent vehicle planning control method based on reinforcement learning and model prediction
CN112622924B (en) Driving planning method and device and vehicle
Zhou et al. Identify, estimate and bound the uncertainty of reinforcement learning for autonomous driving
CN114267191B (en) Control system, method, medium, equipment and application for relieving traffic jam of driver
CN115454082A (en) Vehicle obstacle avoidance method and system, computer readable storage medium and electronic device
CN115384552A (en) Control method and device for automatic driving vehicle and automatic driving vehicle
Ren et al. Intelligent path planning and obstacle avoidance algorithms for autonomous vehicles based on enhanced rrt algorithm
CN113391642A (en) Unmanned aerial vehicle autonomous obstacle avoidance method and system based on monocular vision
US20240054822A1 (en) Methods and systems for managing data storage in vehicle operations
CN117590865B (en) Fixed wing unmanned aerial vehicle tracking target motion prediction method
CN117475627A (en) Intersection intelligent network vehicle connection control method based on reinforcement learning
CN117312760B (en) Space grid-based space-time distribution prediction method for moving target
Ochs et al. One Stack to Rule them All: To Drive Automated Vehicles, and Reach for the 4th level
CN115482687B (en) Method, device, equipment and medium for vehicle lane change risk assessment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant