CN115665553A - Automatic tracking method and device for unmanned aerial vehicle, electronic equipment and storage medium - Google Patents

Info

Publication number: CN115665553A
Application number: CN202211204867.XA
Authority: CN (China)
Prior art keywords: unmanned aerial vehicle, image, camera, horizontal
Other languages: Chinese (zh)
Other versions: CN115665553B
Inventors: 陈磊, 黄金叶
Applicant and current assignee: Shenzhen Qiyang Special Equipment Technology Engineering Co., Ltd.
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)

Classifications

    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02T: Climate change mitigation technologies related to transportation
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Abstract

The invention discloses an automatic tracking method and device for an unmanned aerial vehicle, electronic equipment and a storage medium. During the flight of the unmanned aerial vehicle, the method continuously acquires images of it and obtains the rotation angle of a camera by calculating the deviation angles, in the horizontal and vertical directions, between the unmanned aerial vehicle and the image center, so that the pointing direction of the camera is continuously adjusted based on the rotation angle and real-time tracking of the unmanned aerial vehicle is achieved. Since neither satellites nor radar are needed in the tracking process, the tracking cost is reduced; at the same time, the low recognition rate caused by the small radar cross-section of an unmanned aerial vehicle is avoided, and the tracking accuracy is improved.

Description

Automatic tracking method and device for unmanned aerial vehicle, electronic equipment and storage medium
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle tracking, and particularly relates to an automatic tracking method and device for an unmanned aerial vehicle, electronic equipment and a storage medium.
Background
In recent years, unmanned aerial vehicle technology has developed rapidly and is widely used in many fields, greatly improving the convenience of work in those fields; correspondingly, continued application has placed higher requirements on unmanned aerial vehicle tracking, identification and positioning technology. At present there are various schemes for tracking, identifying and positioning an unmanned aerial vehicle, mainly satellite positioning and radar detection. Satellite positioning mainly uses a satellite, a ground signal transceiver station and a remote-sensing controller, and employs GPS and BeiDou positioning to exchange information among the unmanned aerial vehicle, the satellite and the ground station, so as to obtain the position information and flight state of the unmanned aerial vehicle and thus track and identify it. Radar detection mainly uses passive radar to detect, position and track the unmanned aerial vehicle; it receives the telemetry signal of the unmanned aerial vehicle and completes tracking and identification by analyzing that signal.
However, these tracking methods have the following defects: the method based on satellites and remote sensing has a high deployment cost and is not suitable for the civil field; the radar-based method has all-weather identification capability, but because the radar cross-section of an unmanned aerial vehicle is small, the recognition rate is low and the unmanned aerial vehicle cannot be tracked accurately. Therefore, a low-cost, high-recognition-rate tracking method that can accurately track an unmanned aerial vehicle is urgently needed.
Disclosure of Invention
The invention aims to provide an automatic tracking method and device for an unmanned aerial vehicle, electronic equipment and a storage medium, so as to solve the problems in the prior art that tracking an unmanned aerial vehicle is costly and, owing to a low recognition rate, inaccurate.
In order to achieve the purpose, the invention adopts the following technical scheme:
in a first aspect, an automatic tracking method for an unmanned aerial vehicle is provided, and is applied to automatic tracking of an unmanned aerial vehicle by at least one camera, where for any camera, the method includes:
acquiring an unmanned aerial vehicle image shot by any camera, and carrying out target detection on the unmanned aerial vehicle image to obtain a central point coordinate of the unmanned aerial vehicle in the unmanned aerial vehicle image;
calculating to obtain a horizontal deviation angle and a vertical deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image based on the central point coordinate of the unmanned aerial vehicle in the unmanned aerial vehicle image;
calculating to obtain a horizontal rotation angle of any camera according to a horizontal deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image; and
calculating to obtain a vertical rotation angle of any camera based on a vertical deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image;
controlling a horizontal steering engine of any camera to rotate according to the horizontal rotation angle, and controlling a vertical steering engine of any camera to rotate according to the vertical rotation angle;
reacquiring the unmanned aerial vehicle image shot by any camera and repeating the above steps until the unmanned aerial vehicle is no longer detected in the reacquired image, so as to complete the automatic tracking of the unmanned aerial vehicle through the unmanned aerial vehicle images shot by any camera.
Based on the above disclosure, the camera acquires an image of the unmanned aerial vehicle and performs target detection on it, so that the unmanned aerial vehicle in the image is recognized and its position in the image is obtained. The invention then calculates, from that position, the deviation angles of the unmanned aerial vehicle relative to the image center in the vertical and horizontal directions; next, the rotation angles of the camera in the horizontal and vertical directions are calculated from the two deviation angles; finally, the two rotation angles are used to control the horizontal steering engine and the vertical steering engine of the camera, completing the adjustment of the camera's direction. Following this principle, images of the unmanned aerial vehicle are continuously acquired during its flight and the rotation angle of the camera is calculated in real time, so that the pointing direction of the camera is continuously adjusted based on the calculated rotation angle and real-time tracking of the unmanned aerial vehicle is finally realized. Since neither satellites nor radar are needed in the tracking process, the tracking cost is reduced, the low recognition rate caused by the small radar cross-section of the unmanned aerial vehicle is avoided, and the tracking accuracy is improved.
In one possible design, calculating a horizontal deviation angle and a vertical deviation angle between a center point of the drone in the drone image and an image center point in the drone image based on coordinates of the center point of the drone in the drone image, includes:
acquiring the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
determining coordinates of an image center point in the unmanned aerial vehicle image based on the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
calculating a vertical distance, a horizontal distance and a straight line distance between the image center point and the unmanned aerial vehicle center point according to the coordinates of the image center point in the unmanned aerial vehicle image and the coordinates of the unmanned aerial vehicle center point in the unmanned aerial vehicle image;
calculating to obtain a vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the vertical distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle; and
and calculating to obtain a horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the horizontal distance and the linear distance between the image central point and the central point of the unmanned aerial vehicle.
Based on the above disclosure, the invention discloses the calculation process of the deviation angles of the unmanned aerial vehicle in the two directions relative to the image central point: first, the maximum detection distances of the camera in the horizontal and vertical directions are used to determine the coordinates of the image central point in the unmanned aerial vehicle image; then the horizontal distance, the vertical distance and the straight-line distance between the image central point and the central point of the unmanned aerial vehicle are calculated from the two sets of coordinates; finally, the three line segments form a right triangle, and the horizontal deviation angle and the vertical deviation angle between the central point of the unmanned aerial vehicle and the image central point can be calculated with an inverse trigonometric function.
In a possible design, the method for calculating the horizontal rotation angle of any camera according to the horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the central point of the image in the unmanned aerial vehicle image comprises the following steps:
acquiring a proportional gain coefficient, an integral gain coefficient and a differential gain coefficient;
calculating to obtain a horizontal rotation angle of any camera by using a PID control algorithm based on the proportional gain coefficient, the integral gain coefficient, the differential gain coefficient and the horizontal deviation angle;
correspondingly, the calculating of the vertical rotation angle of any camera based on the vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image includes:
and calculating the vertical rotation angle of any camera by utilizing a PID control algorithm and based on the proportional gain coefficient, the integral gain coefficient, the differential gain coefficient and the vertical deviation angle.
Based on the disclosure, the invention utilizes the PID control algorithm to calculate the horizontal rotation angle of the camera based on the horizontal deviation angle and calculate the vertical rotation angle of the camera based on the vertical deviation angle, so that the steering engine of the camera is controlled to rotate by adopting the PID, the rotation accuracy and the real-time performance are high, and the real-time performance of tracking the unmanned aerial vehicle can be ensured.
In one possible design, when applied to the automatic tracking of the drone by two cameras, the method further comprises:
acquiring the camera parameters of the two cameras when the unmanned aerial vehicle is positioned in the center of the corresponding camera pictures of the two cameras to serve as position calculation parameters, wherein the installation positions of the two cameras are at the same horizontal height;
calculating the distance between a first camera of the two cameras and an unmanned aerial vehicle calibration point based on the horizontal distance between the two cameras and the position calculation parameters, wherein the unmanned aerial vehicle calibration point is the intersection point between the calibration line of the unmanned aerial vehicle and the ground, and the calibration line is the line segment obtained by dropping a vertical line from the central point of the unmanned aerial vehicle to the ground;
and calculating to obtain the world coordinate of the unmanned aerial vehicle based on the position calculation parameter and the distance between the first camera and the unmanned aerial vehicle calibration point.
Based on the disclosure, the invention can also calculate the world coordinate of the unmanned aerial vehicle by capturing the camera parameters of the unmanned aerial vehicle through the two cameras, so as to obtain the position of the unmanned aerial vehicle; therefore, the unmanned aerial vehicle can be accurately positioned in the tracking process, so that the position identification of the unmanned aerial vehicle is realized.
In one possible design, the position calculation parameters include: the horizontal angle of the first camera and the horizontal angle of the second camera, wherein the calculating of the distance between the first camera of the two cameras and the unmanned aerial vehicle calibration point based on the horizontal distance between the two cameras and the position calculation parameters includes:
according to the following formula (1), calculating to obtain the distance between the first camera and the unmanned aerial vehicle calibration point:
l = d · sin β′ / sin(α′ + β′)   (1)
in the above formula (1), l represents the distance between the first camera and the unmanned aerial vehicle calibration point, α 'represents the horizontal angle of the first camera, β' represents the horizontal angle of the second camera, and d represents the horizontal distance between the first camera and the second camera.
In one possible design, the position calculation parameters include: the horizontal angle of the first camera, the pitch angle of the first camera and the world coordinate of the first camera;
correspondingly, based on the position calculation parameter and the distance between the first camera and the unmanned aerial vehicle calibration point, the world coordinate of the unmanned aerial vehicle is obtained by calculation, and the method comprises the following steps:
according to the following formula (2), formula (3) and formula (4), the abscissa, the ordinate and the z-axis coordinate of the unmanned aerial vehicle in the world coordinate system are calculated in sequence:

x′ = l · cos α′ + X₁   (2)

y′ = l · sin α′ + Y₁   (3)

z′ = l · tan α + Z₁   (4)
in the above formula (2), x′ represents the abscissa of the unmanned aerial vehicle in the world coordinate system, l represents the distance between the first camera and the unmanned aerial vehicle calibration point, α′ represents the horizontal angle of the first camera, and X₁ represents the abscissa of the first camera in the world coordinate system;
in the above formula (3), y′ represents the ordinate of the unmanned aerial vehicle in the world coordinate system, and Y₁ represents the ordinate of the first camera in the world coordinate system;
in the above formula (4), z′ represents the z-axis coordinate of the unmanned aerial vehicle in the world coordinate system, Z₁ represents the z-axis coordinate of the first camera in the world coordinate system, and α represents the pitch angle of the first camera.
In a second aspect, an automatic tracking apparatus for a drone is provided, including:
the target detection unit is used for acquiring an unmanned aerial vehicle image shot by any camera and carrying out target detection on the unmanned aerial vehicle image to obtain a central point coordinate of the unmanned aerial vehicle in the unmanned aerial vehicle image;
the deviation angle calculation unit is used for calculating and obtaining a horizontal deviation angle and a vertical deviation angle between the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image center point in the unmanned aerial vehicle image based on the coordinates of the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image;
the camera angle calculation unit is used for calculating a horizontal rotation angle of any camera according to a horizontal deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image; and
the camera angle calculation unit is further used for calculating a vertical rotation angle of any camera based on a vertical deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image;
the steering engine control unit is used for controlling the horizontal steering engine of any camera to rotate according to the horizontal rotation angle and controlling the vertical steering engine of any camera to rotate according to the vertical rotation angle;
the target detection unit is further used for reacquiring the unmanned aerial vehicle image shot by any camera, the above units repeating their processing until the unmanned aerial vehicle is no longer detected in the reacquired image, so as to complete the automatic tracking of the unmanned aerial vehicle through the unmanned aerial vehicle images shot by any camera.
In one possible design, the deviation angle calculation unit includes: the method comprises the following steps of obtaining a subunit, a center point calculating subunit, a distance calculating subunit and an angle calculating subunit;
the acquisition subunit is used for acquiring the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
the center point calculation subunit is used for determining coordinates of an image center point in the unmanned aerial vehicle image based on the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
the distance calculation subunit is used for calculating a vertical distance, a horizontal distance and a straight-line distance between the image center point and the unmanned aerial vehicle center point according to the coordinates of the image center point in the unmanned aerial vehicle image and the coordinates of the unmanned aerial vehicle center point in the unmanned aerial vehicle image;
the angle calculation subunit is used for calculating a vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the central point of the image in the unmanned aerial vehicle image according to the vertical distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle;
and the angle calculation subunit is further used for calculating a horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the horizontal distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle.
In a third aspect, another automatic tracking apparatus for a drone is provided, which takes an apparatus as an electronic device, and includes a memory, a processor, and a transceiver, which are sequentially connected in a communication manner, where the memory is used to store a computer program, the transceiver is used to transmit and receive messages, and the processor is used to read the computer program and execute an automatic tracking method for the drone according to any one of the first aspect or the possible designs of the first aspect.
In a fourth aspect, there is provided a storage medium having stored thereon instructions for executing the method for automatic tracking of a drone according to the first aspect or any one of the possible designs of the first aspect when the instructions are run on a computer.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of automatic tracking of a drone as described in the first aspect or any one of the possible designs of the first aspect.
Beneficial effects:
(1) During the flight of the unmanned aerial vehicle, the invention continuously acquires images of it and obtains the rotation angle of the camera by calculating the horizontal and vertical deviation angles between the unmanned aerial vehicle and the image center, so that the pointing direction of the camera is continuously adjusted based on the rotation angle and real-time tracking of the unmanned aerial vehicle is realized; since neither satellites nor radar are needed in the tracking process, the tracking cost is reduced, the low recognition rate caused by the small radar cross-section of the unmanned aerial vehicle is avoided, and the tracking accuracy is improved.
(2) The invention can also calculate the world coordinates of the unmanned aerial vehicle from the camera parameters of the two cameras after they capture the unmanned aerial vehicle, thereby obtaining the position of the unmanned aerial vehicle; the unmanned aerial vehicle can therefore be accurately positioned during tracking, realizing position identification of the unmanned aerial vehicle.
Drawings
Fig. 1 is a schematic flow chart illustrating steps of an automatic tracking method for an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a camera provided in an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a principle of calculating a horizontal deviation angle and a vertical deviation angle according to an embodiment of the present invention;
fig. 4 is a schematic diagram illustrating a calculation principle of world coordinates of the unmanned aerial vehicle according to the embodiment of the present invention;
fig. 5 is a schematic structural diagram of an automatic tracking apparatus of an unmanned aerial vehicle according to an embodiment of the present invention.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Reference numerals: 10-a camera; 20-a holder; 30-a horizontal steering engine; 40-vertical steering engine.
Detailed Description
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the present invention is briefly described below with reference to the accompanying drawings and the embodiments. It is obvious that the drawings described below show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort. It should be noted that the description of the embodiments is provided to help understanding of the present invention, but the present invention is not limited thereto.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments of the present invention.
It should be understood that, for the term "and/or" as may appear herein, it is merely an associative relationship that describes an associated object, meaning that three relationships may exist, e.g., a and/or B may mean: a exists independently, B exists independently, and A and B exist simultaneously; for the term "/and" as may appear herein, which describes another associative object relationship, it means that two relationships may exist, e.g., a/and B, may mean: a exists independently, and A and B exist independently; in addition, for the character "/" that may appear herein, it generally means that the former and latter associated objects are in an "or" relationship.
Embodiment:
referring to fig. 1, the automatic tracking method for the unmanned aerial vehicle provided by this embodiment is applied to automatic tracking of an unmanned aerial vehicle by at least one camera, where the tracking method for each camera is the same, and is described in detail below by taking any camera as an example, specifically, any camera can continuously acquire an image of the unmanned aerial vehicle during flight of the unmanned aerial vehicle, and obtains the position of the unmanned aerial vehicle in the image by performing target recognition on the image of the unmanned aerial vehicle, and then, based on the position of the unmanned aerial vehicle in the image, horizontal and vertical deviation angles between the unmanned aerial vehicle and an image center point can be calculated; finally, any camera can obtain the rotation angle of the camera in the horizontal and vertical directions based on the two calculated deviation angles, and then the horizontal steering engine and the vertical steering engine can be controlled to rotate based on the two rotation angles, so that the real-time adjustment of the shooting direction is realized; optionally, the method provided by this embodiment may be, but is not limited to, executed on a camera side, and it is understood that the foregoing execution main body does not constitute a limitation to the embodiment of this application, and accordingly, the execution steps of the method may be, but are not limited to, as shown in steps S1 to S5 below.
S1, acquiring an unmanned aerial vehicle image shot by any camera, and performing target detection on the unmanned aerial vehicle image to obtain the coordinates of the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image. In specific application, any camera can acquire images according to a preset sampling period, which can be set according to actual use, such as 100 ms or 10 ms; of course, a video stream of the flight of the unmanned aerial vehicle can also be collected according to the preset sampling period, with the key frames of the video stream taken as the unmanned aerial vehicle images. Optionally, an improved YOLOv3 neural network is used, for example, to perform target detection on the unmanned aerial vehicle image, and the coordinates of the central point of the unmanned aerial vehicle in the image are actually the coordinates of the central point of the prediction box in the unmanned aerial vehicle image, as shown in the sketch below.
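Expressed as code, step S1 amounts to running a detector on each frame and taking the center of the returned prediction box. The following Python sketch assumes a hypothetical detect_drone() helper that wraps the improved YOLOv3 model and returns one bounding box, or None when no drone is found; all names here are illustrative and not from the patent.

```python
from typing import Callable, Optional, Tuple

Box = Tuple[float, float, float, float]  # (top-left x, top-left y, width, height), in pixels

def prediction_box_center(box: Box) -> Tuple[float, float]:
    """Center of the prediction box, i.e. the drone's central point in the image."""
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

def locate_drone(frame, detect_drone: Callable[..., Optional[Box]]) -> Optional[Tuple[float, float]]:
    box = detect_drone(frame)   # hypothetical wrapper around the improved YOLOv3 model
    if box is None:             # no drone detected in this frame
        return None
    return prediction_box_center(box)
```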
Further, as an example, the improved YOLOv3 neural network adds a first feature fusion layer, a second feature fusion layer and an upsampling layer to the original YOLOv3 neural network, where the first feature fusion layer uses a 1 × 1 convolutional layer to perform feature fusion on the feature map output by the 103rd convolutional layer of the YOLOv3 neural network to obtain a first feature map, and outputs the first feature map to the 108th convolutional layer of the YOLOv3 neural network; as an example, the number of channels of the first feature fusion layer is 64. This feature fusion increases the feature information extracted from the unmanned aerial vehicle image.
In this embodiment, the upsampling layer upsamples the output of the 108th layer of the YOLOv3 neural network, and the receptive field scale of the upsampled feature map is 104 × 104 (originally 52 × 52). Enlarging the receptive field improves the richness of the extracted feature information, while reducing the receptive field size at the 109th convolutional layer improves the ability to detect small targets.
Furthermore, the second feature fusion layer performs feature fusion on the feature map output by the 11th convolutional layer of the YOLOv3 neural network and the feature map output by the 109th convolutional layer to obtain a second feature map. The feature information of the 11th layer belongs to the bottom-layer features and, compared with feature maps that have undergone many rounds of convolution, preserves more details of small target objects, so fusing the two feature maps gives a second feature map that contains more of the feature information in the unmanned aerial vehicle image.
In addition, the receptive field sizes of the feature maps output by the 81st, 93rd, 105th and 117th layers of the improved YOLOv3 neural network are 13 × 13, 26 × 26, 53 × 53 and 104 × 104 in sequence, and the improved YOLOv3 neural network uses a logistic regression function to perform target detection on each of these feature maps; performing target detection on feature maps at different receptive field scales reduces the missed-detection rate and the false-detection rate for the unmanned aerial vehicle in the unmanned aerial vehicle image.
Optionally, an example of the training process of the improved YOLOv3 neural network is as follows: (1) photograph the unmanned aerial vehicle to build a data set comprising infrared images shot at night and sample images shot during the day; (2) label the unmanned aerial vehicle in each image of the data set with labelme (a Qt-based cross-platform image annotation tool for labeling classification, detection, segmentation, key points and other information), so that each image has a corresponding annotation file containing the position and size of the bounding box of the unmanned aerial vehicle; (3) divide the data set into a training set and a test set at a ratio of 9:1; (4) train the improved YOLOv3 neural network with the training set as input and the positions and sizes of the bounding boxes in the training images as output, adjusting the network parameters with the test set during training until the accuracy and the recall rate both exceed 85%, at which point training ends; a sketch of steps (2) and (3) follows. Of course, in this embodiment the improved YOLOv3 neural network may be, but is not limited to being, deployed in the camera, so that unmanned aerial vehicle identification can be completed locally during use.
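As a concrete illustration of steps (2) and (3) above, the sketch below reads labelme annotation files and performs the 9:1 split. The JSON layout assumed here (a "shapes" list whose rectangle entries carry two corner "points") is the common labelme format but should be verified against the labelme version actually used; the dataset/ path is hypothetical.

```python
import glob
import json
import random

def load_bbox(annotation_path: str):
    """Read one labelme file and return the drone's bounding box as (x, y, w, h)."""
    with open(annotation_path, "r", encoding="utf-8") as f:
        data = json.load(f)
    (x1, y1), (x2, y2) = data["shapes"][0]["points"]   # first labeled rectangle
    return min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1)

files = sorted(glob.glob("dataset/*.json"))   # hypothetical annotation directory
random.seed(0)                                # reproducible shuffle
random.shuffle(files)
split = int(len(files) * 0.9)                 # the 9:1 train/test split from step (3)
train_set, test_set = files[:split], files[split:]
```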
After the coordinates of the central point of the unmanned aerial vehicle in the image are identified, the deviation angle between the central point of the unmanned aerial vehicle and the central point of the image can be calculated based on the coordinates of the central point of the unmanned aerial vehicle, so that the rotation angle of the camera can be adjusted based on the calculated deviation angle, wherein the calculation process of the deviation angle is as shown in the following step S2.
S2, calculating to obtain a horizontal deviation angle and a vertical deviation angle between the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image center point in the unmanned aerial vehicle image based on the center point coordinates of the unmanned aerial vehicle in the unmanned aerial vehicle image; during specific application, the coordinates of the image center point in the unmanned aerial vehicle image are determined, and then the deviation angles of the unmanned aerial vehicle relative to the image center point in the horizontal and vertical directions can be calculated according to the coordinates of the image center point and the coordinates of the unmanned aerial vehicle center point; alternatively, the calculation process of the two deviation angles may be, but not limited to, as shown in the following steps S21 to S24.
S21, acquiring the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction; when the system is applied specifically, the maximum detection distance of any camera in the horizontal and vertical directions is the inherent attribute of the camera, and the maximum detection distance can be stored in a processor of the camera in advance and can be read when the system is used.
After the maximum detection distance of any camera in the horizontal and vertical directions is obtained, the coordinates of the image center point in the unmanned aerial vehicle image can be determined, wherein the determination process is as shown in the following step S22.
S22, determining coordinates of an image center point in the unmanned aerial vehicle image based on the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction; in specific application, the coordinates of the central point of the image are (w/2, h/2), wherein w is the maximum detection distance of any camera in the horizontal direction, and h is the maximum detection distance of any camera in the vertical direction.
After the coordinates of the image center point in the image of the unmanned aerial vehicle are obtained, the horizontal deviation angle and the vertical deviation angle between the unmanned aerial vehicle and the image center point in the image can be calculated by combining the coordinates of the image center point of the unmanned aerial vehicle, wherein the calculation process is as shown in the following steps S23 and S24.
S23, calculating a vertical distance, a horizontal distance and a straight line distance between the image center point and the center point of the unmanned aerial vehicle according to the coordinates of the image center point in the unmanned aerial vehicle image and the coordinates of the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image.
S24, calculating to obtain a vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the vertical distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle; and calculating to obtain a horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the horizontal distance and the linear distance between the image central point and the central point of the unmanned aerial vehicle.
Referring to fig. 3, point O represents the image central point and point D represents the central point of the unmanned aerial vehicle. The horizontal distance, the vertical distance and the straight-line distance between the two points form a right triangle, namely triangle ODH in fig. 3, where OH is the horizontal distance, DH is the vertical distance and OD is the straight-line distance. Therefore, OH, OD and DH can be calculated from the coordinates of the image central point and of the central point of the unmanned aerial vehicle, and the angles ∠HOD and ∠ODH can then be obtained with an inverse trigonometric function; specifically, ∠HOD is the horizontal deviation angle and ∠ODH is the vertical deviation angle.
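Putting steps S22 to S24 together, a minimal Python sketch of the deviation-angle computation might look as follows, with (w, h) the maximum detection distances from step S21 and the angle naming taken from Fig. 3; sign and direction handling are omitted, since the patent defines only the angle magnitudes.

```python
import math

def deviation_angles(drone_cx: float, drone_cy: float, w: float, h: float):
    """Return the (horizontal, vertical) deviation angles in degrees.

    O = (w/2, h/2) is the image central point; OH, DH and OD are the sides of
    the right triangle ODH in Fig. 3.
    """
    ox, oy = w / 2.0, h / 2.0            # step S22: image central point
    oh = abs(drone_cx - ox)              # horizontal distance OH
    dh = abs(drone_cy - oy)              # vertical distance DH
    od = math.hypot(oh, dh)              # straight-line distance OD
    if od == 0.0:
        return 0.0, 0.0                  # drone already centered, no deviation
    horizontal_dev = math.degrees(math.acos(oh / od))  # angle HOD
    vertical_dev = math.degrees(math.acos(dh / od))    # angle ODH
    return horizontal_dev, vertical_dev
```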
After a horizontal deviation angle and a vertical deviation angle between the unmanned aerial vehicle and an image central point in the unmanned aerial vehicle image are obtained, a horizontal rotation angle and a vertical rotation angle of any camera can be calculated and obtained based on the two deviation angles, wherein the calculation process is shown as the following step S3.
S3, calculating to obtain a horizontal rotation angle of any camera according to a horizontal deviation angle between a center point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image center point in the unmanned aerial vehicle image, and calculating to obtain a vertical rotation angle of any camera based on a vertical deviation angle between the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image center point in the unmanned aerial vehicle image; in specific application, the present embodiment uses a PID (Proportional Integral Derivative) control algorithm to calculate a horizontal rotation angle of any one camera based on a horizontal deviation angle, and calculate a vertical rotation angle of any one camera based on a vertical deviation angle; alternatively, the PID calculation process is as shown in step S31 and step S32 described below.
S31, acquiring a proportional gain coefficient, an integral gain coefficient and a differential gain coefficient; in specific application, the three coefficients are preset values and can be prestored in a processor of any camera.
S32, calculating to obtain a horizontal rotation angle of any camera by using a PID control algorithm and based on the proportional gain coefficient, the integral gain coefficient, the differential gain coefficient and the horizontal deviation angle, and calculating to obtain a vertical rotation angle of any camera by using the PID control algorithm and based on the proportional gain coefficient, the integral gain coefficient, the differential gain coefficient and the vertical deviation angle; in a specific application, the horizontal rotation angle may be calculated according to the following formula (5), and the vertical rotation angle may be calculated by using the following formula (6).
u₁(t) = k₁·e₁(t) + k₂·∫[t₁, t₂] e₁(t) dt + k₃·de₁(t)/dt   (5)

u₂(t) = k₁·e₂(t) + k₂·∫[t₁, t₂] e₂(t) dt + k₃·de₂(t)/dt   (6)
In the above formula (5), u₁(t) represents the horizontal rotation angle of any camera; k₁, k₂ and k₃ are the proportional gain coefficient, the integral gain coefficient and the differential gain coefficient respectively; e₁(t) represents the horizontal deviation angle of any camera; and t₁ and t₂ are the start time and end time of acquisition of the unmanned aerial vehicle image by any camera, so that u₁(t) represents the horizontal rotation angle of any camera within the period from t₁ to t₂.
Similarly, in the above formula (6), u₂(t) represents the vertical rotation angle of any camera, and e₂(t) represents the vertical deviation angle of any camera.
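In discrete time, formulas (5) and (6) can be realized as one PID controller per axis, updated once per sampling period. The sketch below is a generic textbook PID in Python; the gain values and the 10 ms period are illustrative placeholders, not values from the patent.

```python
class PID:
    """One controller per axis; update() maps a deviation angle to a rotation angle."""

    def __init__(self, k1: float, k2: float, k3: float, dt: float):
        self.k1, self.k2, self.k3, self.dt = k1, k2, k3, dt  # P, I, D gains and period
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error: float) -> float:
        self.integral += error * self.dt                   # discrete form of the integral term
        derivative = (error - self.prev_error) / self.dt   # discrete form of de(t)/dt
        self.prev_error = error
        return self.k1 * error + self.k2 * self.integral + self.k3 * derivative

# Illustrative gains and a 10 ms sampling period; real values must be tuned.
horizontal_pid = PID(k1=0.5, k2=0.01, k3=0.05, dt=0.01)
vertical_pid = PID(k1=0.5, k2=0.01, k3=0.05, dt=0.01)
```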
Therefore, based on the formula (5) and the formula (6), the horizontal rotation angle and the vertical rotation angle of any camera in the corresponding acquisition time of the unmanned aerial vehicle image can be calculated; after the rotation angles in the two directions are obtained, the steering engine corresponding to any camera can be controlled, as shown in step S4 below.
S4, controlling the horizontal steering engine of any camera to rotate according to the horizontal rotation angle, and controlling the vertical steering engine of any camera to rotate according to the vertical rotation angle. In specific application, referring to fig. 2, the camera 10 is mounted on the pan-tilt 20, which includes a horizontal steering engine 30 and a vertical steering engine 40; the horizontal steering engine 30 drives the camera 10 through 180 degrees in the horizontal direction, and the vertical steering engine 40 drives the camera 10 through 180 degrees in the vertical direction. In this embodiment, the horizontal rotation angle is the rotation angle of the horizontal steering engine and the vertical rotation angle is the rotation angle of the vertical steering engine, so the shooting direction of the camera can be adjusted simply by rotating the corresponding steering engines through the two calculated angles.
Therefore, the adjustment of the rotation direction of the camera in a sampling period is completed through the steps, and the unmanned aerial vehicle images are continuously acquired according to the principle, so that the rotation direction of the camera can be continuously adjusted based on the acquired unmanned aerial vehicle images, and the unmanned aerial vehicle can be automatically tracked, as shown in the step S5.
S5, reacquiring the unmanned aerial vehicle image shot by any camera until the unmanned aerial vehicle is no longer detected in the reacquired image, so as to complete the automatic tracking of the unmanned aerial vehicle through the unmanned aerial vehicle images shot by any camera. In specific application, unmanned aerial vehicle images are collected again according to the preset sampling period (e.g., once every 10 ms) and steps S1 to S4 are repeated, so that the horizontal and vertical rotation angles of any camera are calculated from every collected image and the shooting direction of the camera is adjusted based on the two calculated rotation angles, thereby realizing automatic tracking of the unmanned aerial vehicle, as the sketch below shows.
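Tying steps S1 to S5 together, the whole loop can be sketched as follows, reusing the locate_drone, deviation_angles and PID sketches above; camera.read, detect_drone and the two set_*_servo callables are hypothetical hardware and model bindings, not interfaces defined by the patent.

```python
import time

def track(camera, detect_drone, set_horizontal_servo, set_vertical_servo,
          w: float, h: float, period_s: float = 0.01):
    h_pid = PID(k1=0.5, k2=0.01, k3=0.05, dt=period_s)   # illustrative gains
    v_pid = PID(k1=0.5, k2=0.01, k3=0.05, dt=period_s)
    while True:
        frame = camera.read()                            # S1: sample one frame
        center = locate_drone(frame, detect_drone)
        if center is None:                               # S5: drone no longer detected,
            break                                        #     tracking is finished
        h_dev, v_dev = deviation_angles(center[0], center[1], w, h)   # S2
        set_horizontal_servo(h_pid.update(h_dev))        # S3 + S4, horizontal steering engine
        set_vertical_servo(v_pid.update(v_dev))          # S3 + S4, vertical steering engine
        time.sleep(period_s)                             # preset sampling period
```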
According to the automatic tracking method for the unmanned aerial vehicle described in detail in steps S1 to S5, images of the unmanned aerial vehicle can be continuously acquired during its flight, and the rotation angle of the camera is obtained by calculating the horizontal and vertical deviation angles between the unmanned aerial vehicle and the image center, so that the pointing direction of the camera is continuously adjusted based on the calculated rotation angle and real-time tracking of the unmanned aerial vehicle is realized. Since neither satellites nor radar are needed in the tracking process, the tracking cost is reduced, the low recognition rate caused by the small radar cross-section of the unmanned aerial vehicle is avoided, and the tracking accuracy is improved.
In a possible design, the second aspect of this embodiment provides, on the basis of the first aspect, a position-locating method for use during tracking of the unmanned aerial vehicle. The method calculates the world coordinates of the unmanned aerial vehicle, so that it can be accurately positioned. The method provided by the second aspect applies to accurate positioning of the unmanned aerial vehicle when it is tracked by two cameras, and its operation may be, but is not limited to, as shown in steps S6 to S8 below.
S6, acquiring the camera parameters of the two cameras when the unmanned aerial vehicle is located at the center of the corresponding pictures of the two cameras, to serve as the position calculation parameters, wherein the two cameras are installed at the same horizontal height. In specific application, the position calculation parameters may include, but are not limited to: the horizontal angle of the first camera, the pitch angle of the first camera, the horizontal angle of the second camera and the world coordinates of the first camera. In this embodiment, whether the unmanned aerial vehicle is at the center of the pictures of both cameras is judged by checking whether the central point of the unmanned aerial vehicle coincides with the image central point in the unmanned aerial vehicle images of the first camera and the second camera; when the two coincide in both images, the unmanned aerial vehicle is at the center of both pictures, and the camera parameters of the two cameras are acquired at that moment. Of course, the first and second cameras can directly output their pitch and horizontal angles under different measurement conditions, and their world coordinates are obtained when the cameras are installed.
After the camera parameters of the two cameras are obtained, the distance between the first camera and the unmanned aerial vehicle calibration point can be calculated based on the horizontal distance between the two cameras, so that the world coordinate of the unmanned aerial vehicle can be calculated and obtained based on the distance and the camera parameters of the two cameras, wherein the distance calculation process is shown in the following step S7.
S7, calculating the distance between a first camera of the two cameras and an unmanned aerial vehicle calibration point based on the horizontal distance between the two cameras and the position calculation parameters, wherein the unmanned aerial vehicle calibration point is the intersection point between the calibration line of the unmanned aerial vehicle and the ground, and the calibration line is the line segment obtained by dropping a vertical line from the central point of the unmanned aerial vehicle to the ground. In specific application, as shown in fig. 4, A1 represents the first camera, A2 the second camera and point C the unmanned aerial vehicle; α′ represents the horizontal angle of the first camera, β′ the horizontal angle of the second camera, α the pitch angle of the first camera, β the pitch angle of the second camera, d the horizontal distance between the first camera and the second camera, and l the distance between the first camera and the unmanned aerial vehicle calibration point. To solve for l, a three-dimensional coordinate system following the right-hand rule can be established with each camera as an origin, giving the two coordinate systems shown in fig. 4. A vertical line is dropped from the center of the unmanned aerial vehicle, and its intersection with the ground is the unmanned aerial vehicle calibration point (point F in fig. 4); a triangle is then formed by the first camera, the unmanned aerial vehicle calibration point and the second camera, so, according to the sine theorem, the following relation (1) is obtained:
l = d · sin β′ / sin(α′ + β′)   (1)
in the above formula (1), l represents the distance between the first camera and the unmanned aerial vehicle calibration point, α 'represents the horizontal angle of the first camera, β' represents the horizontal angle of the second camera, and d represents the horizontal distance between the first camera and the second camera.
Therefore, based on formula (1), once the horizontal angle of the first camera, the horizontal angle of the second camera and the horizontal distance between the two cameras are known, the distance between the first camera and the unmanned aerial vehicle calibration point can be calculated; the world coordinates of the unmanned aerial vehicle can then be calculated from this distance combined with the position calculation parameters, as shown in step S8 below.
S8, calculating to obtain the world coordinate of the unmanned aerial vehicle based on the position calculation parameter and the distance between the first camera and the unmanned aerial vehicle calibration point; in specific application, the abscissa, the ordinate and the z-axis coordinate of the unmanned aerial vehicle in the world coordinate system can be obtained by sequentially calculating according to, but not limited to, the following formula (2), formula (3) and formula (4):
x′ = l · cos α′ + X₁   (2)

y′ = l · sin α′ + Y₁   (3)

z′ = l · tan α + Z₁   (4)
In the above formula (2), x′ represents the abscissa of the unmanned aerial vehicle in the world coordinate system, l represents the distance between the first camera and the unmanned aerial vehicle calibration point, α′ represents the horizontal angle of the first camera, and X₁ represents the abscissa of the first camera in the world coordinate system.
In the above formula (3), y′ represents the ordinate of the unmanned aerial vehicle in the world coordinate system, and Y₁ represents the ordinate of the first camera in the world coordinate system.
In the above formula (4), z′ represents the z-axis coordinate of the unmanned aerial vehicle in the world coordinate system, Z₁ represents the z-axis coordinate of the first camera in the world coordinate system, and α represents the pitch angle of the first camera.
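Formulas (1) to (4) combine into a few lines of code. The sketch below assumes the angles are supplied in radians and uses the variable names of fig. 4; the numeric values in the usage line are made up for illustration.

```python
import math

def drone_world_coordinates(d: float, alpha_p: float, beta_p: float,
                            alpha: float, X1: float, Y1: float, Z1: float):
    """d: baseline between the cameras; alpha_p, beta_p: horizontal angles α′ and β′;
    alpha: pitch angle α of the first camera; (X1, Y1, Z1): its world coordinates."""
    # Formula (1): sine rule in the triangle camera1 - calibration point F - camera2
    l = d * math.sin(beta_p) / math.sin(alpha_p + beta_p)
    x = l * math.cos(alpha_p) + X1   # formula (2)
    y = l * math.sin(alpha_p) + Y1   # formula (3)
    z = l * math.tan(alpha) + Z1     # formula (4)
    return x, y, z

# Usage with made-up numbers: cameras 10 m apart at height 1.5 m, angles in radians.
print(drone_world_coordinates(10.0, math.radians(50), math.radians(60),
                              math.radians(30), 0.0, 0.0, 1.5))
```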
By this design, the world coordinates of the unmanned aerial vehicle can be calculated from the camera parameters of the two cameras after they capture the unmanned aerial vehicle, so that the position of the unmanned aerial vehicle is obtained; the unmanned aerial vehicle can therefore be accurately positioned during tracking, realizing position identification of the unmanned aerial vehicle.
As shown in fig. 5, a third aspect of the present embodiment provides a hardware device for implementing the automatic tracking method for a drone, which includes:
and the target detection unit is used for acquiring the unmanned aerial vehicle image shot by any camera, and carrying out target detection on the unmanned aerial vehicle image to obtain the coordinate of the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image.
And the deviation angle calculation unit is used for calculating a horizontal deviation angle and a vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image based on the central point coordinate of the unmanned aerial vehicle in the unmanned aerial vehicle image.
And the camera angle calculating unit is used for calculating the horizontal rotation angle of any camera according to the horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image.
The camera angle calculating unit is further used for calculating a vertical rotating angle of any camera based on a vertical deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image.
And the steering engine control unit is used for controlling the horizontal steering engine of any camera to rotate based on the horizontal rotation angle and controlling the vertical steering engine of any camera to rotate according to the vertical rotation angle.
The target detection unit is further used for reacquiring the unmanned aerial vehicle image shot by any camera, the above units repeating their processing until the unmanned aerial vehicle is no longer detected in the reacquired image, so as to complete the automatic tracking of the unmanned aerial vehicle through the unmanned aerial vehicle images shot by any camera.
In one possible design, the deviation angle calculation unit includes: the device comprises an acquisition subunit, a center point calculation subunit, a distance calculation subunit and an angle calculation subunit.
And the acquisition subunit is used for acquiring the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction.
And the center point calculation subunit is used for determining the coordinates of the image center point in the unmanned aerial vehicle image based on the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction.
And the distance calculation subunit is used for calculating the vertical distance, the horizontal distance and the linear distance between the image center point and the center point of the unmanned aerial vehicle according to the coordinates of the image center point in the unmanned aerial vehicle image and the coordinates of the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image.
And the angle calculation subunit is used for calculating to obtain a vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the vertical distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle.
And the angle calculation subunit is further used for calculating a horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the horizontal distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle.
For the working process, the working details, and the technical effects of the apparatus provided in this embodiment, reference may be made to the first aspect and the second aspect of the embodiment, which are not described herein again.
As shown in fig. 6, a fourth aspect of this embodiment provides another automatic tracking apparatus for an unmanned aerial vehicle, taking an electronic device as an example, comprising: a memory, a processor, and a transceiver that are sequentially connected in communication, wherein the memory is used for storing a computer program, the transceiver is used for sending and receiving messages, and the processor is used for reading the computer program and executing the automatic tracking method for the drone according to the first aspect and/or the second aspect of this embodiment.
For example, the memory may include, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Flash Memory, a First In First Out (FIFO) memory, and/or a First In Last Out (FILO) memory; in particular, the processor may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array); the processor may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state.
In some embodiments, the processor may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen; for example, the processor may be, but is not limited to, an STM32F105-series microprocessor, a Reduced Instruction Set Computer (RISC) microprocessor, an X86 or other architecture processor, or an embedded neural Network Processor (NPU). The transceiver may be, but is not limited to, a wireless fidelity (WIFI) wireless transceiver, a Bluetooth wireless transceiver, a General Packet Radio Service (GPRS) wireless transceiver, a ZigBee wireless transceiver (a low-power local area network protocol based on the IEEE 802.15.4 standard), a 3G transceiver, a 4G transceiver, and/or a 5G transceiver. In addition, the device may also include, but is not limited to, a power module, a display screen, and other necessary components.
For the working process, working details, and technical effects of the electronic device provided in this embodiment, reference may be made to the first aspect and the second aspect of this embodiment; they are not described herein again.
A fifth aspect of this embodiment provides a storage medium storing instructions that, when run on a computer, perform the automatic tracking method for the drone according to the first and/or second aspect of this embodiment.
The storage medium is a carrier for storing data, and may include, but is not limited to, a floppy disk, an optical disk, a hard disk, a flash memory, a flash disk, and/or a Memory Stick; the computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device.
For the working process, working details, and technical effects of the storage medium provided in this embodiment, reference may be made to the first aspect and the second aspect of this embodiment; they are not described herein again.
A sixth aspect of the present embodiments provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method for automatic tracking of a drone according to the first and/or second aspect of the embodiments, wherein the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
Finally, it should be noted that: the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An automatic tracking method of an unmanned aerial vehicle, applied to automatic tracking of the unmanned aerial vehicle by at least one camera, wherein, for any camera, the method comprises the following steps:
acquiring an unmanned aerial vehicle image shot by any camera, and carrying out target detection on the unmanned aerial vehicle image to obtain a central point coordinate of the unmanned aerial vehicle in the unmanned aerial vehicle image;
calculating to obtain a horizontal deviation angle and a vertical deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image based on the coordinate of the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image;
calculating to obtain a horizontal rotation angle of any camera according to a horizontal deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image; and
calculating to obtain a vertical rotation angle of any camera based on a vertical deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image;
controlling a horizontal steering engine of any camera to rotate according to the horizontal rotation angle, and controlling a vertical steering engine of any camera to rotate according to the vertical rotation angle;
reacquiring the unmanned aerial vehicle image shot by any camera, and repeating the above steps until no unmanned aerial vehicle is detected in the reacquired image, so as to finish the automatic tracking of the unmanned aerial vehicle through the unmanned aerial vehicle images shot by any camera.
2. The method of claim 1, wherein calculating a horizontal deviation angle and a vertical deviation angle between the center point of the drone in the drone image and the image center point in the drone image based on the coordinates of the center point of the drone in the drone image comprises:
acquiring the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
determining coordinates of an image center point in the unmanned aerial vehicle image based on the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
calculating a vertical distance, a horizontal distance and a straight line distance between the image center point and the unmanned aerial vehicle center point according to the coordinates of the image center point in the unmanned aerial vehicle image and the coordinates of the unmanned aerial vehicle center point in the unmanned aerial vehicle image;
calculating to obtain a vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the central point of the image in the unmanned aerial vehicle image according to the vertical distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle; and
and calculating to obtain a horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the horizontal distance and the linear distance between the image central point and the central point of the unmanned aerial vehicle.
3. The method of claim 1, wherein calculating a horizontal rotation angle of any camera according to a horizontal deviation angle between a center point of the drone in the drone image and a center point of the image in the drone image comprises:
acquiring a proportional gain coefficient, an integral gain coefficient and a differential gain coefficient;
calculating to obtain a horizontal rotation angle of any camera by using a PID control algorithm based on the proportional gain coefficient, the integral gain coefficient, the differential gain coefficient and the horizontal deviation angle;
correspondingly, calculating the vertical rotation angle of any camera based on the vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image comprises:
and calculating the vertical rotation angle of any camera by utilizing a PID control algorithm and based on the proportional gain coefficient, the integral gain coefficient, the differential gain coefficient and the vertical deviation angle.
4. The method of claim 1, wherein, when the method is applied to automatic tracking of the unmanned aerial vehicle by two cameras, the method further comprises:
when the unmanned aerial vehicle is located at the center of the shooting picture corresponding to each of the two cameras, acquiring the shooting parameters of the two cameras as position calculation parameters, wherein the installation positions of the two cameras are at the same horizontal height;
calculating to obtain the distance between a first camera of the two cameras and an unmanned aerial vehicle calibration point based on the horizontal distance between the two cameras and the position calculation parameters, wherein the unmanned aerial vehicle calibration point is the intersection point between a calibration line of the unmanned aerial vehicle and the ground, and the calibration line is the vertical line segment drawn downward from the central point of the unmanned aerial vehicle;
and calculating to obtain the world coordinate of the unmanned aerial vehicle based on the position calculation parameter and the distance between the first camera and the unmanned aerial vehicle calibration point.
5. The method of claim 4, wherein the position calculation parameters comprise: the horizontal angle of the first camera and the horizontal angle of the second camera, and wherein calculating the distance between the first camera of the two cameras and the unmanned aerial vehicle calibration point based on the horizontal distance between the two cameras and the position calculation parameters comprises:
calculating the distance between the first camera and the unmanned aerial vehicle calibration point according to the following formula (1):
l = d·sinβ′ / sin(α′ + β′)  (1)
in the above formula (1), l represents the distance between the first camera and the unmanned aerial vehicle calibration point, α′ represents the horizontal angle of the first camera, β′ represents the horizontal angle of the second camera, and d represents the horizontal distance between the first camera and the second camera.
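Formula (1) survives here only as an image reference, so the following Python sketch assumes the standard law-of-sines triangulation, with α′ and β′ measured between each camera's line of sight and the baseline of length d:

import math

def camera1_to_calibration_point(d, alpha_deg, beta_deg):
    """Distance l from the first camera to the drone calibration point,
    reconstructed as l = d * sin(beta') / sin(alpha' + beta')."""
    a = math.radians(alpha_deg)   # horizontal angle of the first camera
    b = math.radians(beta_deg)    # horizontal angle of the second camera
    return d * math.sin(b) / math.sin(a + b)   # assumes a + b < 180 degrees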
6. The method of claim 4, wherein the position calculation parameters comprise: the horizontal angle of the first camera, the pitch angle of the first camera, and the world coordinate of the first camera;
correspondingly, calculating the world coordinate of the unmanned aerial vehicle based on the position calculation parameters and the distance between the first camera and the unmanned aerial vehicle calibration point comprises:
according to the following formula (2), formula (3) and formula (4), sequentially calculating to obtain the abscissa, ordinate and z-axis coordinate of the unmanned aerial vehicle under a world coordinate system;
x′ = l·cosα′ + X₁  (2)
y′ = l·sinα′ + Y₁  (3)
z′ = l·tanα + Z₁  (4)
in the above formula (2), x′ represents the abscissa of the unmanned aerial vehicle in the world coordinate system, l represents the distance between the first camera and the unmanned aerial vehicle calibration point, α′ represents the horizontal angle of the first camera, and X₁ represents the abscissa of the first camera in the world coordinate system;
in the above formula (3), y′ represents the ordinate of the unmanned aerial vehicle in the world coordinate system, and Y₁ represents the ordinate of the first camera in the world coordinate system;
in the above formula (4), z′ represents the z-axis coordinate of the unmanned aerial vehicle in the world coordinate system, Z₁ represents the z-axis coordinate of the first camera in the world coordinate system, and α represents the pitch angle of the first camera.
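A direct Python transcription of formulas (2)-(4); the function and argument names are illustrative:

import math

def drone_world_coordinates(l, alpha_h_deg, alpha_pitch_deg, cam1_world):
    """World coordinates of the drone from the range l, the first camera's
    horizontal angle alpha' and pitch angle alpha, and the first camera's
    world coordinate (X1, Y1, Z1)."""
    X1, Y1, Z1 = cam1_world
    a_h = math.radians(alpha_h_deg)       # horizontal angle alpha'
    a_p = math.radians(alpha_pitch_deg)   # pitch angle alpha
    x = l * math.cos(a_h) + X1            # formula (2)
    y = l * math.sin(a_h) + Y1            # formula (3)
    z = l * math.tan(a_p) + Z1            # formula (4)
    return x, y, z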
7. An automatic tracking device of an unmanned aerial vehicle, characterized by comprising:
the target detection unit is used for acquiring an unmanned aerial vehicle image shot by any camera and carrying out target detection on the unmanned aerial vehicle image to obtain a central point coordinate of the unmanned aerial vehicle in the unmanned aerial vehicle image;
the deviation angle calculation unit is used for calculating and obtaining a horizontal deviation angle and a vertical deviation angle between the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image center point in the unmanned aerial vehicle image based on the coordinates of the center point of the unmanned aerial vehicle in the unmanned aerial vehicle image;
the camera angle calculation unit is used for calculating a horizontal rotation angle of any camera according to a horizontal deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image; and
the camera angle calculation unit is also used for calculating a vertical rotation angle of any camera based on a vertical deviation angle between a central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and an image central point in the unmanned aerial vehicle image;
the steering engine control unit is used for controlling the horizontal steering engine of any camera to rotate according to the horizontal rotation angle and controlling the vertical steering engine of any camera to rotate according to the vertical rotation angle;
the target detection unit is further used for reacquiring the unmanned aerial vehicle image shot by any camera until the unmanned aerial vehicle image reacquired does not detect the unmanned aerial vehicle image, so that the unmanned aerial vehicle image shot by any camera is used for completing the automatic tracking of the unmanned aerial vehicle.
8. The apparatus according to claim 7, wherein the deviation angle calculating unit includes: the device comprises an acquisition subunit, a center point calculation subunit, a distance calculation subunit and an angle calculation subunit;
the acquisition subunit is used for acquiring the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
the center point calculation subunit is used for determining coordinates of an image center point in the unmanned aerial vehicle image based on the maximum detection distance of any camera in the horizontal direction and the maximum detection distance in the vertical direction;
the distance calculation subunit is used for calculating a vertical distance, a horizontal distance and a straight-line distance between the image center point and the unmanned aerial vehicle center point according to the coordinates of the image center point in the unmanned aerial vehicle image and the coordinates of the unmanned aerial vehicle center point in the unmanned aerial vehicle image;
the angle calculation subunit is used for calculating a vertical deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the central point of the image in the unmanned aerial vehicle image according to the vertical distance and the linear distance between the central point of the image and the central point of the unmanned aerial vehicle;
and the angle calculation subunit is also used for calculating a horizontal deviation angle between the central point of the unmanned aerial vehicle in the unmanned aerial vehicle image and the image central point in the unmanned aerial vehicle image according to the horizontal distance and the linear distance between the image central point and the central point of the unmanned aerial vehicle.
9. An electronic device, comprising: a memory, a processor, and a transceiver that are sequentially connected in communication, wherein the memory is used for storing a computer program, the transceiver is used for sending and receiving messages, and the processor is used for reading the computer program and executing the automatic tracking method of the unmanned aerial vehicle according to any one of claims 1 to 6.
10. A storage medium having stored thereon instructions which, when run on a computer, perform the automatic tracking method of the unmanned aerial vehicle according to any one of claims 1 to 6.
CN202211204867.XA 2022-09-29 2022-09-29 Automatic tracking method and device of unmanned aerial vehicle, electronic equipment and storage medium Active CN115665553B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211204867.XA CN115665553B (en) 2022-09-29 2022-09-29 Automatic tracking method and device of unmanned aerial vehicle, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115665553A true CN115665553A (en) 2023-01-31
CN115665553B CN115665553B (en) 2023-06-13

Family

ID=84984833

Country Status (1)

Country Link
CN (1) CN115665553B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10267621A (en) * 1997-03-24 1998-10-09 Komatsu Ltd Apparatus and method for measurement of height of object
CN107564038A (en) * 2017-08-28 2018-01-09 北京小米移动软件有限公司 Offset parameter determines method and apparatus and skew control method and device
CN110232706A (en) * 2019-06-12 2019-09-13 睿魔智能科技(深圳)有限公司 More people are with shooting method, device, equipment and storage medium
WO2020248395A1 (en) * 2019-06-12 2020-12-17 睿魔智能科技(深圳)有限公司 Follow shot method, apparatus and device, and storage medium
CN111242984A (en) * 2020-02-13 2020-06-05 珠海安联锐视科技股份有限公司 Target tracking method based on moving head camera
CN113553889A (en) * 2020-04-26 2021-10-26 杭州萤石软件有限公司 Face tracking method and device
CN111930147A (en) * 2020-10-09 2020-11-13 成都纵横自动化技术股份有限公司 Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116953610A (en) * 2023-09-21 2023-10-27 国网浙江省电力有限公司信息通信分公司 Unmanned aerial vehicle positioning system and method
CN116953610B (en) * 2023-09-21 2023-12-26 国网浙江省电力有限公司信息通信分公司 Unmanned aerial vehicle positioning system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant