CN111161305A - Intelligent unmanned aerial vehicle identification tracking method and system - Google Patents


Publication number
CN111161305A
CN111161305A (application CN201911313632.2A)
Authority
CN
China
Prior art keywords: unmanned aerial vehicle, tracking, detected, current camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911313632.2A
Other languages
Chinese (zh)
Inventor
潘力
伍夏清
姚笛
王先高
刘永强
沈智杰
景晓军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Surfilter Technology Development Co ltd
Surfilter Network Technology Co ltd
Original Assignee
Shenzhen Surfilter Technology Development Co ltd
Surfilter Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Surfilter Technology Development Co ltd, Surfilter Network Technology Co ltd filed Critical Shenzhen Surfilter Technology Development Co ltd
Priority to CN201911313632.2A priority Critical patent/CN111161305A/en
Publication of CN111161305A publication Critical patent/CN111161305A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Abstract

The invention discloses an intelligent identification and tracking method for an unmanned aerial vehicle, which comprises the following steps: step S1, searching by using the photoelectric equipment to obtain a video stream; step S2, performing target detection on frame data acquired from the video stream according to the unmanned aerial vehicle feature model based on yolov3, and acquiring the position information of the currently detected unmanned aerial vehicle when the unmanned aerial vehicle is detected; step S3, tracking the motion information of the detected unmanned aerial vehicle through an optical flow tracking algorithm; and step S4, feeding back the detected position information and motion information of the unmanned aerial vehicle. The unmanned aerial vehicle intelligent identification tracking method provided by the invention is based on photoelectric equipment, and searches, detects and tracks the unmanned aerial vehicle moving in the air by utilizing a deep learning convolutional neural network target detection algorithm and an optical flow tracking algorithm, thereby realizing the identification and tracking of the unmanned aerial vehicle.

Description

Intelligent unmanned aerial vehicle identification tracking method and system
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to an intelligent unmanned aerial vehicle identification and tracking method and system.
Background
In the civilian field, unmanned aerial vehicles can be used for all kinds of monitoring, control, patrol, search and rescue, photography, surveying and mapping, and investigation tasks, such as environmental monitoring, disaster monitoring, traffic road monitoring, border patrol and control, anti-drug and contraband enforcement, agricultural surveying, large-pasture patrol, urban surveillance, aerial photography, and the like. Therefore, accurate detection, identification and tracking of unmanned aerial vehicles is of great significance.
Disclosure of Invention
The invention mainly aims to provide an intelligent unmanned aerial vehicle identification and tracking method and system.
In order to achieve the purpose, the invention provides an intelligent unmanned aerial vehicle identification and tracking method, which comprises the following steps:
step S1, searching by using the photoelectric equipment to obtain a video stream;
step S2, performing target detection on frame data acquired from the video stream according to the unmanned aerial vehicle feature model based on yolov3, and acquiring the position information of the currently detected unmanned aerial vehicle when the unmanned aerial vehicle is detected;
step S3, tracking the motion information of the detected unmanned aerial vehicle through an optical flow tracking algorithm;
and step S4, feeding back the detected position information and motion information of the unmanned aerial vehicle.
In the intelligent unmanned aerial vehicle identification and tracking method provided by the invention, before step S1, the method further comprises:
and S0, training the acquired and labeled unmanned aerial vehicle pictures based on the yolov3 model, and establishing the unmanned aerial vehicle feature model based on the yolov 3.
In the method for intelligently identifying and tracking an unmanned aerial vehicle provided by the present invention, the step S2 includes:
step S21, extracting frame data from the video stream;
step S22, carrying out target detection on the frame data by using the yolov 3-based unmanned aerial vehicle feature model;
step S23, when the unmanned aerial vehicle is detected, acquiring the parameter information of the detected unmanned aerial vehicle and the central point of the unmanned aerial vehicle;
and step S24, calculating the angular coordinate of the unmanned aerial vehicle in a space coordinate system according to the detected parameter information of the current camera of the unmanned aerial vehicle and the central point of the unmanned aerial vehicle.
In the intelligent unmanned aerial vehicle identification and tracking method provided by the invention, the angular coordinate of the unmanned aerial vehicle in a space coordinate system is calculated by the following formula:
horizontal angle = px × A / Wp + Ax
vertical angle = py × B / Hp + Ay
where Ax and Ay are the rotation angles of the upper-left corner of the current camera frame relative to the initial point of the pan-tilt head of the photoelectric equipment; A and B are respectively the horizontal and vertical field angles of the current camera; Wp and Hp are respectively the numbers of pixels on the horizontal and vertical axes of the camera's frame resolution; and px and py are the pixel coordinates of the center point of the unmanned aerial vehicle relative to the upper-left corner pixel of the current frame picture.
In the method for intelligently identifying and tracking an unmanned aerial vehicle provided by the present invention, the step S3 includes:
step S31, estimating the distance of the unmanned aerial vehicle according to the parameter information of the unmanned aerial vehicle, the coordinates of the central point of the unmanned aerial vehicle, the magnification factor of the current camera, the horizontal viewing angle of the current camera and the horizontal viewing field width of the current camera;
step S32, when the estimated distance to the unmanned aerial vehicle reaches a preset value, stopping the unmanned aerial vehicle searching process;
s33, adjusting the current camera position, starting an optical flow tracking algorithm, and tracking the motion information of the unmanned aerial vehicle
The invention also provides an intelligent unmanned aerial vehicle identification and tracking system, which comprises:
an optoelectronic device for acquiring a video stream;
the detection module is used for carrying out target detection on frame data acquired from the video stream according to the unmanned aerial vehicle feature model based on yolov3, and acquiring the position information of the currently detected unmanned aerial vehicle when the unmanned aerial vehicle is detected;
the tracking module is used for tracking the detected motion information of the unmanned aerial vehicle through an optical flow tracking algorithm;
and the feedback module is used for feeding back the detected position information and motion information of the unmanned aerial vehicle.
In the intelligent unmanned aerial vehicle identification and tracking system provided by the invention, the system further comprises:
and the training module is used for training the acquired and labeled unmanned aerial vehicle pictures based on the yolov3 model and establishing the unmanned aerial vehicle feature model based on the yolov 3.
In the intelligent unmanned aerial vehicle identification and tracking system provided by the invention, the detection module comprises:
a data extraction unit for extracting frame data from the video stream;
the target detection unit is used for carrying out target detection on the frame data by using the yolov 3-based unmanned aerial vehicle feature model;
the parameter extraction unit is used for acquiring parameter information of the unmanned aerial vehicle and a central point of the unmanned aerial vehicle when the unmanned aerial vehicle is detected;
and the calculating unit is used for calculating the angular coordinate of the unmanned aerial vehicle in a space coordinate system according to the parameter information of the current camera of the unmanned aerial vehicle and the central point of the unmanned aerial vehicle.
In the intelligent unmanned aerial vehicle identification tracking system provided by the invention, the angular coordinate of the unmanned aerial vehicle in a space coordinate system is calculated by the following formula:
horizontal angle = px × A / Wp + Ax
vertical angle = py × B / Hp + Ay
where Ax and Ay are the rotation angles of the upper-left corner of the current camera frame relative to the initial point of the pan-tilt head of the photoelectric equipment; A and B are respectively the horizontal and vertical field angles of the current camera; Wp and Hp are respectively the numbers of pixels on the horizontal and vertical axes of the camera's frame resolution; and px and py are the pixel coordinates of the center point of the unmanned aerial vehicle relative to the upper-left corner pixel of the current frame picture.
In the intelligent unmanned aerial vehicle identification tracking system provided by the invention, the tracking module comprises:
the pre-estimation unit is used for pre-estimating the distance of the unmanned aerial vehicle according to the parameter information of the unmanned aerial vehicle, the coordinate of the central point of the unmanned aerial vehicle, the magnification factor of the current camera, the horizontal viewing angle of the current camera and the horizontal viewing field width of the current camera;
the judging unit is used for stopping the unmanned aerial vehicle searching process when the estimated distance to the unmanned aerial vehicle reaches a preset value;
and the tracking unit is used for adjusting the current camera position, starting an optical flow tracking algorithm and tracking the motion information of the unmanned aerial vehicle.
The intelligent unmanned aerial vehicle identification and tracking method and system have the following beneficial effects: the unmanned aerial vehicle intelligent identification tracking method provided by the invention is based on photoelectric equipment, and searches, detects and tracks the unmanned aerial vehicle moving in the air by utilizing a deep learning convolutional neural network target detection algorithm and an optical flow tracking algorithm, thereby realizing the identification and tracking of the unmanned aerial vehicle.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts:
fig. 1 is a schematic flow chart of an intelligent unmanned aerial vehicle identification and tracking method according to an embodiment of the present invention.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Exemplary embodiments of the invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
In order to better understand the technical solutions, the technical solutions will be described in detail below with reference to the drawings and the specific embodiments of the specification, and it should be understood that the embodiments and specific features of the embodiments of the present invention are detailed descriptions of the technical solutions of the present application, and are not limited to the technical solutions of the present application, and the technical features of the embodiments and examples of the present invention may be combined with each other without conflict.
Example one
Fig. 1 is a flowchart of an intelligent unmanned aerial vehicle identification and tracking method according to an embodiment of the present invention. As shown in fig. 1, the intelligent unmanned aerial vehicle identification and tracking method provided by the invention comprises the following steps:
step S1, searching by using the photoelectric equipment to obtain a video stream;
Specifically, in an embodiment of the present invention, the photoelectric equipment is one or more cameras equipped with a pan-tilt head, and the video stream is acquired through the photoelectric equipment's search. Before searching, the photoelectric equipment needs to be installed and the origins of its x axis and y axis calibrated; the origin of the photoelectric equipment serves as the central origin of the system coordinate axes. After the equipment calibration is completed, the photoelectric equipment is started to begin searching: a camera pitch angle is set according to the installation position and scene characteristics, and the camera pan-tilt head is then controlled to rotate 360 degrees at a certain speed. The video acquisition frame rate of the camera is about 30 fps; the focal length of the camera is set according to three values, far (1 km), middle (500 m) and near (100 m), and the camera rotates one full circle per level, searching in turn through the far, middle and near levels.
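The search cycle described above can be sketched as a simple schedule generator. The function name, the pan step, and the preset distances are illustrative assumptions; the actual pan-tilt control protocol is not specified in the patent:

```python
# Hypothetical sketch of the three-level search cycle: the camera sweeps
# 360 degrees once per focal-length preset, cycling far (1 km) ->
# middle (500 m) -> near (100 m).

FOCAL_PRESETS_M = {"far": 1000, "middle": 500, "near": 100}

def search_schedule(step_deg=30):
    """Return (preset, pan_angle) stops for one full search cycle."""
    schedule = []
    for preset in ("far", "middle", "near"):          # far -> middle -> near
        for pan in range(0, 360, step_deg):           # one full rotation
            schedule.append((preset, pan))
    return schedule

passes = search_schedule(step_deg=90)
# 3 presets x 4 pan positions = 12 stops per cycle
print(len(passes))
```

At each stop the controller would grab frames at roughly 30 fps and hand them to the detector of step S2.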
Step S2, performing target detection on frame data acquired from the video stream according to the unmanned aerial vehicle feature model based on yolov3, and acquiring the position information of the currently detected unmanned aerial vehicle when the unmanned aerial vehicle is detected;
specifically, in an embodiment of the present invention, frame data is obtained from the video stream captured by the photoelectric equipment, and object detection inference is performed on each frame picture using the yolov3-based unmanned aerial vehicle feature model. When an unmanned aerial vehicle is detected, its model, the coordinates (x, y) of its center point in the current frame, and its width-height values (wx, wy) can be obtained. Therefore, step S2 includes:
step S21, extracting frame data from the video stream;
step S22, carrying out target detection on the frame data by using the yolov 3-based unmanned aerial vehicle feature model;
step S23, when the unmanned aerial vehicle is detected, acquiring the parameter information of the detected unmanned aerial vehicle and the central point of the unmanned aerial vehicle, wherein the parameter information of the unmanned aerial vehicle refers to the model of the unmanned aerial vehicle and the width and height data of the unmanned aerial vehicle.
And step S24, calculating the angular coordinate of the unmanned aerial vehicle in a space coordinate system according to the detected parameter information of the current camera of the unmanned aerial vehicle and the central point of the unmanned aerial vehicle.
In this step, the angular coordinate of the drone in the spatial coordinate system is calculated by the following formula:
horizontal angle = px × A / Wp + Ax
vertical angle = py × B / Hp + Ay
where Ax and Ay are the rotation angles of the upper-left corner of the current camera frame relative to the initial point of the pan-tilt head of the photoelectric equipment; A and B are respectively the horizontal and vertical field angles of the current camera; Wp and Hp are respectively the numbers of pixels on the horizontal and vertical axes of the camera's frame resolution; and px and py are the pixel coordinates of the center point of the unmanned aerial vehicle relative to the upper-left corner pixel of the current frame picture.
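As a worked check of the two angle formulas above (the frame size, field angles, and pan offset are example values, not taken from the patent):

```python
def angular_coordinates(px, py, A, B, Wp, Hp, Ax, Ay):
    """Map the drone's pixel center (px, py) to pan-tilt angles.

    A, B   : horizontal / vertical field angles of the current camera (degrees)
    Wp, Hp : frame resolution in pixels (horizontal, vertical)
    Ax, Ay : rotation angles of the frame's upper-left corner relative
             to the pan-tilt initial point (degrees)
    """
    horizontal = px * A / Wp + Ax   # horizontal angle = px * A / Wp + Ax
    vertical = py * B / Hp + Ay     # vertical angle   = py * B / Hp + Ay
    return horizontal, vertical

# Example: 1920x1080 frame, 60 x 34 degree field of view, camera already
# panned 90 degrees, drone detected exactly at the frame center.
h, v = angular_coordinates(960, 540, 60.0, 34.0, 1920, 1080, 90.0, 0.0)
print(h, v)  # 120.0 17.0
```

A center-of-frame detection contributes half the field angle (30 and 17 degrees here) on top of the camera's current rotation, as expected.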
Further, in an embodiment of the present invention, the method further includes:
training the collected and labeled unmanned aerial vehicle pictures based on the yolov3 model, and establishing the unmanned aerial vehicle feature model based on the yolov 3.
YOLO (You Only Look Once) is an object detection algorithm based on a convolutional neural network (CNN). YOLOv3 is the third version of the YOLO series of target detection algorithms; compared with the previous versions, its precision is significantly improved, particularly for small targets. In the present invention, training with the yolov3 model comprises the following steps: shooting photos of the unmanned aerial vehicle with a camera; cropping and processing the photos; manually calibrating the position of the unmanned aerial vehicle to produce labels; inputting the training data and labels into the yolov3 model for training; and detecting test data with the trained unmanned aerial vehicle feature model and adjusting the model parameters according to the test results. Identifying a number of acquired frames with the yolov3-based unmanned aerial vehicle feature model can further improve the identification precision.
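The manual labeling step above typically produces YOLO-format annotation files, one line per object with coordinates normalized to [0, 1]. The following sketch, assuming the conventional Darknet/YOLO label layout (not spelled out in the patent), converts one pixel bounding box into such a label line:

```python
# YOLO-style label line: "<class> <cx> <cy> <w> <h>", all coordinates
# normalized by the image dimensions.

def yolo_label(box, img_w, img_h, cls=0):
    """box = (x_min, y_min, x_max, y_max) in pixels; cls=0 is 'drone'."""
    x0, y0, x1, y1 = box
    cx = (x0 + x1) / 2 / img_w   # normalized box-center x
    cy = (y0 + y1) / 2 / img_h   # normalized box-center y
    w = (x1 - x0) / img_w        # normalized box width
    h = (y1 - y0) / img_h        # normalized box height
    return f"{cls} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}"

# A 320x280 px drone box centered in a 1920x1080 frame:
print(yolo_label((800, 400, 1120, 680), 1920, 1080))
# 0 0.500000 0.500000 0.166667 0.259259
```

One such text file per training image, alongside the image, is the usual input format for yolov3 training pipelines.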
Step S3, tracking the motion information of the detected unmanned aerial vehicle through an optical flow tracking algorithm;
in particular, in one embodiment of the present invention, the optical flow algorithm is a method for directly processing the image itself, and is very useful in the image processing fields such as pattern recognition, computer vision, and the like. On an image plane, the motion of an object is often represented by the change of the gray scale distribution of different points in an image sequence, so that a motion field in space can be represented as an optical flow field on the image plane, and the optical flow field reflects the gray scale change trend of each point on the image. And after the target unmanned aerial vehicle is detected, tracking the unmanned aerial vehicle by using an optical flow algorithm. Therefore, step S3 includes:
step S31, estimating the distance of the unmanned aerial vehicle according to the parameter information of the unmanned aerial vehicle, the coordinates of the central point of the unmanned aerial vehicle, the magnification factor of the current camera, the horizontal viewing angle of the current camera and the horizontal viewing field width of the current camera;
step S32, when the estimated distance to the unmanned aerial vehicle reaches a preset value, stopping the unmanned aerial vehicle searching process;
and S33, adjusting the current camera position, starting an optical flow tracking algorithm, and tracking the motion information of the unmanned aerial vehicle.
In the optical flow method, it is assumed that:
brightness constancy: the pixel brightness of objects in the image does not change between successive frames;
small (short-time) motion: the time between adjacent frames is sufficiently short that object motion is small;
spatial coherence: neighboring pixels have similar motion.
Let I(x, y, t) be the gray value of a pixel point on an image, which moves to (x + dx, y + dy) after time dt. Assuming that the gray value of the pixel does not change over this short interval, the optical flow equation is obtained as follows:
I(x,y,t)=I(x+dx,y+dy,t+dt)
the optical flow equation is expanded in a Taylor series:

I(x + dx, y + dy, t + dt) = I(x, y, t) + (∂I/∂x)dx + (∂I/∂y)dy + (∂I/∂t)dt + ε

where ε collects higher-order terms. From the above formula, one can obtain:

(∂I/∂x)dx + (∂I/∂y)dy + (∂I/∂t)dt = 0

Denote (u, v) = (dx/dt, dy/dt), namely the pixel optical flow to be solved; (I_x, I_y) = (∂I/∂x, ∂I/∂y), the spatial gray-scale differentials of the pixel; and I_t = ∂I/∂t, the temporal gray-scale differential at the pixel coordinate point. Dividing by dt and arranging into matrix form:

[I_x  I_y] (u, v)ᵀ = −I_t

This equation represents the temporal gray-scale differential at a coordinate position as the (negative) product of the spatial gray-scale differential and the velocity of this position relative to the observer. From the spatial-coherence assumption, the same relation holds for the n points around the pixel:

[I_x1 I_y1; I_x2 I_y2; …; I_xn I_yn] (u, v)ᵀ = −(I_t1, I_t2, …, I_tn)ᵀ

This is a standard over-determined system of linear equations A(u, v)ᵀ = b that can be solved by the least-squares method:

(u, v)ᵀ = (AᵀA)⁻¹ Aᵀ b
The iterative solution can be carried out to obtain the optical flow position information, and the target position information of the next frame can be solved by a small amount of calculation, so that a basis is provided for target tracking.
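The least-squares step above can be sketched directly with NumPy. The example below solves the over-determined system for one neighborhood, using synthetic gradients generated from a known flow so the result can be checked; real gradients would come from image differences:

```python
import numpy as np

def lucas_kanade_flow(Ix, Iy, It):
    """Solve A (u, v)^T = b in the least-squares sense, where each row of A
    holds the spatial gradients (Ix, Iy) at one neighborhood pixel and
    b holds the negated temporal gradients -It."""
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)   # n x 2 gradient matrix
    b = -It.ravel()                                  # n temporal terms
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow  # (u, v)

# Synthetic check: build It from the brightness-constancy equation
# Ix*u + Iy*v + It = 0 with a known flow (u, v) = (1.5, -0.5).
rng = np.random.default_rng(0)
Ix = rng.normal(size=(5, 5))
Iy = rng.normal(size=(5, 5))
It = -(Ix * 1.5 + Iy * -0.5)
u, v = lucas_kanade_flow(Ix, Iy, It)
print(round(u, 6), round(v, 6))  # 1.5 -0.5
```

Because only a 2-unknown system is solved per tracked point, the next-frame position follows from a small amount of computation, matching the tracking claim above.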
And step S4, feeding back the detected position information and motion information of the unmanned aerial vehicle.
Specifically, in an embodiment of the present invention, the detection information of the drone is fed back to the system in real time, and the detection state of the drone is reported as needed.
The unmanned aerial vehicle intelligent identification tracking method provided by the invention is based on photoelectric equipment, and searches, detects and tracks the unmanned aerial vehicle moving in the air by utilizing a deep learning convolutional neural network target detection algorithm and an optical flow tracking algorithm, thereby realizing the identification and tracking of the unmanned aerial vehicle.
Example two
Based on the same inventive concept, the embodiment discloses an intelligent unmanned aerial vehicle identification and tracking system, which comprises:
the training module is used for training the acquired and labeled unmanned aerial vehicle pictures based on the yolov3 model and establishing the unmanned aerial vehicle feature model based on the yolov 3;
an optoelectronic device for acquiring a video stream;
the detection module is used for carrying out target detection on frame data acquired from the video stream according to the unmanned aerial vehicle feature model based on yolov3, and acquiring the position information of the currently detected unmanned aerial vehicle when the unmanned aerial vehicle is detected; the detection module comprises:
a data extraction unit for extracting frame data from the video stream;
the target detection unit is used for carrying out target detection on the frame data by using the yolov 3-based unmanned aerial vehicle feature model;
the parameter extraction unit is used for acquiring parameter information of the unmanned aerial vehicle and a central point of the unmanned aerial vehicle when the unmanned aerial vehicle is detected;
the calculation unit is used for calculating the angular coordinate of the unmanned aerial vehicle in a space coordinate system according to the detected parameter information of the current camera of the unmanned aerial vehicle and the central point of the unmanned aerial vehicle;
wherein, calculate the angular coordinate of this unmanned aerial vehicle in the space coordinate system through following formula:
horizontal angle = px × A / Wp + Ax
vertical angle = py × B / Hp + Ay
where Ax and Ay are the rotation angles of the upper-left corner of the current camera frame relative to the initial point of the pan-tilt head of the photoelectric equipment; A and B are respectively the horizontal and vertical field angles of the current camera; Wp and Hp are respectively the numbers of pixels on the horizontal and vertical axes of the camera's frame resolution; and px and py are the pixel coordinates of the center point of the unmanned aerial vehicle relative to the upper-left corner pixel of the current frame picture.
The tracking module is used for tracking the detected motion information of the unmanned aerial vehicle through an optical flow tracking algorithm;
the tracking module includes:
the pre-estimation unit is used for pre-estimating the distance of the unmanned aerial vehicle according to the parameter information of the unmanned aerial vehicle, the coordinate of the central point of the unmanned aerial vehicle, the magnification factor of the current camera, the horizontal viewing angle of the current camera and the horizontal viewing field width of the current camera;
the judging unit is used for stopping the unmanned aerial vehicle searching process when the estimated distance to the unmanned aerial vehicle reaches a preset value;
and the tracking unit is used for adjusting the current camera position, starting an optical flow tracking algorithm and tracking the motion information of the unmanned aerial vehicle.
And the feedback module is used for feeding back the detected position information and motion information of the unmanned aerial vehicle.
The functions of the functional modules of the system according to the embodiment of the present invention may be specifically implemented according to the method in the embodiment of the method, and the specific implementation process may refer to the description related to the embodiment of the method, which is not described herein again.
EXAMPLE III
Based on the same inventive concept, the embodiment discloses an intelligent unmanned aerial vehicle identification and tracking system, which comprises a processor and a memory, wherein the memory stores a computer program, and the computer program realizes the steps of the method according to the first embodiment when being executed by the processor.
Example four
Based on the same inventive concept, the present embodiment discloses a computer-readable storage medium, storing a computer program, which, when executed by a processor, performs the steps of the method according to the first embodiment.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some or all of the components in accordance with embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second, third, etc. does not indicate any ordering. These words may be interpreted as names.

Claims (10)

1. An intelligent unmanned aerial vehicle identification and tracking method is characterized by comprising the following steps:
step S1, searching by using the photoelectric equipment to obtain a video stream;
step S2, performing target detection on frame data acquired from the video stream according to the unmanned aerial vehicle feature model based on yolov3, and acquiring the position information of the currently detected unmanned aerial vehicle when the unmanned aerial vehicle is detected;
step S3, tracking the motion information of the detected unmanned aerial vehicle through an optical flow tracking algorithm;
and step S4, feeding back the detected position information and motion information of the unmanned aerial vehicle.
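The search-detect-track-feedback loop of steps S1 to S4 can be sketched as follows. This is a minimal illustration, not the patented implementation: `detect_drone` and `track_motion` are hypothetical stand-ins for the yolov3-based detector of step S2 and the optical-flow tracker of step S3.

```python
# Minimal sketch of the steps S1-S4 loop. detect_drone and track_motion
# are hypothetical stand-ins for the yolov3 detector and the optical-flow
# tracker; they are injected so the control flow can be shown on its own.

def run_pipeline(frames, detect_drone, track_motion):
    """Return (position, motion) reports for every frame with a detection."""
    reports = []
    prev_frame = None
    for frame in frames:                     # step S1: frames from the video stream
        position = detect_drone(frame)       # step S2: yolov3-based target detection
        if position is not None:
            motion = track_motion(prev_frame, frame, position)  # step S3: optical flow
            reports.append((position, motion))                  # step S4: feed back
        prev_frame = frame
    return reports
```

With a stub detector that misses the first frame, `run_pipeline` reports positions only for the frames where a detection exists.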
2. The intelligent unmanned aerial vehicle identification and tracking method of claim 1, wherein before step S1, the method further comprises:
and step S0, training on the collected and labeled unmanned aerial vehicle pictures based on the yolov3 model, and establishing the yolov3-based unmanned aerial vehicle feature model.
3. The intelligent unmanned aerial vehicle identification and tracking method of claim 1, wherein the step S2 includes:
step S21, extracting frame data from the video stream;
step S22, carrying out target detection on the frame data by using the yolov3-based unmanned aerial vehicle feature model;
step S23, when an unmanned aerial vehicle is detected, acquiring the parameter information of the current camera and the central point of the detected unmanned aerial vehicle;
and step S24, calculating the angular coordinate of the unmanned aerial vehicle in a spatial coordinate system according to the parameter information of the current camera and the central point of the detected unmanned aerial vehicle.
4. The intelligent unmanned aerial vehicle identification and tracking method of claim 3, wherein the angular coordinate of the unmanned aerial vehicle in the spatial coordinate system is calculated by the following formulas:
horizontal angle = px × A / Wp + Ax
vertical angle = py × B / Hp + Ay
wherein Ax and Ay are the rotation angles of the upper-left corner of the current camera picture relative to the initial point of the pan-tilt of the photoelectric equipment; A and B are respectively the horizontal field angle and the vertical field angle of the current camera; Wp and Hp are respectively the numbers of pixels on the horizontal axis and the vertical axis of the frame resolution captured by the camera; and px and py are respectively the pixel coordinates of the central point of the unmanned aerial vehicle relative to the upper-left corner of the current frame picture.
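As a sketch, the formula in claim 4 maps the drone's pixel position linearly across the camera's field of view onto pan-tilt angles. The numeric values in the example are illustrative only, not taken from the patent:

```python
def angular_coordinate(px, py, A, B, Wp, Hp, Ax, Ay):
    """Angular coordinate of the drone center in the pan-tilt frame.

    px, py: pixel position of the drone center from the frame's top-left corner
    A, B:   horizontal / vertical field angles of the current camera (degrees)
    Wp, Hp: frame resolution in pixels (horizontal / vertical)
    Ax, Ay: rotation angles of the frame's top-left corner relative to the
            pan-tilt initial point (degrees)
    """
    horizontal = px * A / Wp + Ax
    vertical = py * B / Hp + Ay
    return horizontal, vertical

# A drone centred in a 1920x1080 frame, 60x34 degree field of view, with the
# frame's top-left corner at (10, 5) degrees from the pan-tilt initial point:
h, v = angular_coordinate(960, 540, A=60.0, B=34.0, Wp=1920, Hp=1080, Ax=10.0, Ay=5.0)
# h = 960*60/1920 + 10 = 40.0 degrees, v = 540*34/1080 + 5 = 22.0 degrees
```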
5. The intelligent unmanned aerial vehicle identification and tracking method of claim 3, wherein the step S3 comprises:
step S31, estimating the distance of the unmanned aerial vehicle according to the parameter information of the unmanned aerial vehicle, the coordinates of the central point of the unmanned aerial vehicle, the magnification factor of the current camera, the horizontal viewing angle of the current camera and the horizontal viewing field width of the current camera;
step S32, when the distance of the unmanned aerial vehicle reaches a preset value, stopping the unmanned aerial vehicle searching process;
and S33, adjusting the current camera position, starting an optical flow tracking algorithm, and tracking the motion information of the unmanned aerial vehicle.
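Claim 5 does not spell out the distance-estimation formula of step S31. One plausible sketch, assuming a pinhole-camera model and a known physical drone width (both assumptions, not stated in the patent), derives distance from the horizontal field angle and the bounding-box width in pixels:

```python
import math

def estimate_distance(bbox_width_px, Wp, A_deg, real_width_m):
    """Rough distance of the drone under a pinhole-camera assumption.

    At distance d the horizontal field of view spans 2 * d * tan(A/2) metres,
    so the drone's fraction of the frame width gives its distance:
        d = real_width / (2 * tan(A/2) * bbox_width_px / Wp)
    """
    frac = bbox_width_px / Wp                           # drone's share of frame width
    field_factor = 2.0 * math.tan(math.radians(A_deg) / 2.0)
    return real_width_m / (frac * field_factor)

# A 0.35 m wide drone spanning 35 of 1920 pixels, 60 degree horizontal FOV:
d = estimate_distance(35, 1920, 60.0, 0.35)   # roughly 16.6 m
```

When the zoom changes, `A_deg` must be the field angle at the current magnification, which is why claim 5 lists the magnification factor among the inputs.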
6. An intelligent unmanned aerial vehicle identification and tracking system, characterized by comprising:
an optoelectronic device for acquiring a video stream;
the detection module is used for carrying out target detection on frame data acquired from the video stream according to the unmanned aerial vehicle feature model based on yolov3, and acquiring the position information of the currently detected unmanned aerial vehicle when the unmanned aerial vehicle is detected;
the tracking module is used for tracking the detected motion information of the unmanned aerial vehicle through an optical flow tracking algorithm;
and the feedback module is used for feeding back the detected position information and motion information of the unmanned aerial vehicle.
7. The intelligent unmanned aerial vehicle identification and tracking system of claim 6, further comprising:
and the training module is used for training on the collected and labeled unmanned aerial vehicle pictures based on the yolov3 model and establishing the yolov3-based unmanned aerial vehicle feature model.
8. The intelligent drone identification tracking system of claim 6, wherein the detection module comprises:
a data extraction unit for extracting frame data from the video stream;
the target detection unit is used for carrying out target detection on the frame data by using the yolov3-based unmanned aerial vehicle feature model;
the parameter extraction unit is used for acquiring the parameter information of the current camera and the central point of the detected unmanned aerial vehicle when an unmanned aerial vehicle is detected;
and the calculating unit is used for calculating the angular coordinate of the unmanned aerial vehicle in a spatial coordinate system according to the parameter information of the current camera and the central point of the detected unmanned aerial vehicle.
9. The intelligent drone identification tracking system of claim 8, wherein the angular coordinate of the unmanned aerial vehicle in the spatial coordinate system is calculated by the following formulas:
horizontal angle = px × A / Wp + Ax
vertical angle = py × B / Hp + Ay
wherein Ax and Ay are the rotation angles of the upper-left corner of the current camera picture relative to the initial point of the pan-tilt of the photoelectric equipment; A and B are respectively the horizontal field angle and the vertical field angle of the current camera; Wp and Hp are respectively the numbers of pixels on the horizontal axis and the vertical axis of the frame resolution captured by the camera; and px and py are respectively the pixel coordinates of the central point of the unmanned aerial vehicle relative to the upper-left corner of the current frame picture.
10. The intelligent drone identification tracking system of claim 8, wherein the tracking module comprises:
the pre-estimation unit is used for pre-estimating the distance of the unmanned aerial vehicle according to the parameter information of the unmanned aerial vehicle, the coordinate of the central point of the unmanned aerial vehicle, the magnification factor of the current camera, the horizontal viewing angle of the current camera and the horizontal viewing field width of the current camera;
the judging unit is used for stopping the unmanned aerial vehicle searching process when the distance of the unmanned aerial vehicle reaches a preset value;
and the tracking unit is used for adjusting the current camera position, starting an optical flow tracking algorithm and tracking the motion information of the unmanned aerial vehicle.
CN201911313632.2A 2019-12-18 2019-12-18 Intelligent unmanned aerial vehicle identification tracking method and system Pending CN111161305A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911313632.2A CN111161305A (en) 2019-12-18 2019-12-18 Intelligent unmanned aerial vehicle identification tracking method and system


Publications (1)

Publication Number Publication Date
CN111161305A true CN111161305A (en) 2020-05-15

Family

ID=70557241

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911313632.2A Pending CN111161305A (en) 2019-12-18 2019-12-18 Intelligent unmanned aerial vehicle identification tracking method and system

Country Status (1)

Country Link
CN (1) CN111161305A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103606155A (en) * 2013-11-27 2014-02-26 中国科学院西安光学精密机械研究所 Camera view field calibrating method and device
CN104796672A (en) * 2015-05-09 2015-07-22 合肥工业大学 Emergency monitoring cloud platform device for unmanned aerial vehicle and operating method of emergency monitoring cloud platform device for unmanned aerial vehicle
CN105716582A (en) * 2016-02-15 2016-06-29 中林信达(北京)科技信息有限责任公司 Method and device for measuring field angle of vidicon and vidicon field angle measuring instrument
CN106707296A (en) * 2017-01-09 2017-05-24 华中科技大学 Dual-aperture photoelectric imaging system-based unmanned aerial vehicle detection and recognition method
CN108298101A (en) * 2017-12-25 2018-07-20 上海歌尔泰克机器人有限公司 The control method and device of holder rotation, unmanned plane
CN108897342A (en) * 2018-08-22 2018-11-27 江西理工大学 For the positioning and tracing method and system of the civilian multi-rotor unmanned aerial vehicle fast moved
CN109269430A (en) * 2018-08-12 2019-01-25 浙江农林大学 The more plants of standing tree diameter of a cross-section of a tree trunk 1.3 meters above the ground passive measurement methods based on depth extraction model
CN109523571A (en) * 2018-10-25 2019-03-26 广州番禺职业技术学院 A kind of the motion profile optimization method and system of non-characteristic matching
CN110570456A (en) * 2019-07-26 2019-12-13 南京理工大学 Motor vehicle track extraction method based on fusion of YOLO target detection algorithm and optical flow tracking algorithm

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111881982A (en) * 2020-07-30 2020-11-03 北京环境特性研究所 Unmanned aerial vehicle target identification method
CN113075937A (en) * 2021-03-17 2021-07-06 北京理工大学 Control method for capturing target by unmanned aerial vehicle based on target acceleration estimation
CN113359853A (en) * 2021-07-09 2021-09-07 中国人民解放军国防科技大学 Route planning method and system for unmanned aerial vehicle formation cooperative target monitoring
CN113359853B (en) * 2021-07-09 2022-07-19 中国人民解放军国防科技大学 Route planning method and system for unmanned aerial vehicle formation cooperative target monitoring

Similar Documents

Publication Publication Date Title
CN110674746B (en) Method and device for realizing high-precision cross-mirror tracking by using video spatial relationship assistance, computer equipment and storage medium
CN107016367B (en) Tracking control method and tracking control system
CN106874854B (en) Unmanned aerial vehicle tracking method based on embedded platform
US8005264B2 (en) Method of automatically detecting and tracking successive frames in a region of interesting by an electronic imaging device
CN109520500B (en) Accurate positioning and street view library acquisition method based on terminal shooting image matching
CN111161305A (en) Intelligent unmanned aerial vehicle identification tracking method and system
CN108810473B (en) Method and system for realizing GPS mapping camera picture coordinate on mobile platform
CN105812746B (en) A kind of object detection method and system
CN109308693A (en) By the target detection and pose measurement list binocular vision system of a ptz camera building
CN112164015A (en) Monocular vision autonomous inspection image acquisition method and device and power inspection unmanned aerial vehicle
JP2017537484A (en) System and method for detecting and tracking movable objects
CN115439424A (en) Intelligent detection method for aerial video image of unmanned aerial vehicle
US9418299B2 (en) Surveillance process and apparatus
CN109035294B (en) Image extraction system and method for moving target
US20190311209A1 (en) Feature Recognition Assisted Super-resolution Method
CN108537726B (en) Tracking shooting method and device and unmanned aerial vehicle
CN113452912B (en) Pan-tilt camera control method, device, equipment and medium for inspection robot
CN110889829A (en) Monocular distance measurement method based on fisheye lens
CN110443247A (en) A kind of unmanned aerial vehicle moving small target real-time detecting system and method
CN111242988A (en) Method for tracking target by using double pan-tilt coupled by wide-angle camera and long-focus camera
CN111260539A (en) Fisheye pattern target identification method and system
CN115205382A (en) Target positioning method and device
CN111758118B (en) Visual positioning method, device, equipment and readable storage medium
Cai et al. Sea-skyline-based image stabilization of a buoy-mounted catadioptric omnidirectional vision system
CN116309851B (en) Position and orientation calibration method for intelligent park monitoring camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination