CN114897935B - Method and system for tracking aerial target object by unmanned aerial vehicle based on virtual camera - Google Patents
- Publication number
- CN114897935B (application CN202210521474.5A)
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial
- target object
- aerial vehicle
- image
- tracking
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention relates to the field of robot control and discloses a method and a system for tracking an aerial target object by an unmanned aerial vehicle based on a virtual camera. The right to select the tracked target object is handed over to a ground-end operator: the onboard camera image is transmitted to a ground station, the ground-end operator manually frames the target object to be tracked in the image, and an image tracking algorithm then continuously locks the target object in subsequent images acquired by the onboard camera, so that the unmanned aerial vehicle's choice of aerial targets to track is more diversified and its practical application is promoted. Meanwhile, the control signals are generated from the position coordinates of the target object on the virtual camera imaging plane, which effectively prevents the severe shaking of tracking control signals based on image errors, smooths the attitude adjustment of the unmanned aerial vehicle, and makes the tracking of the aerial target object stable and efficient.
Description
Technical Field
The invention relates to the field of robot control, in particular to a method and a system for tracking an aerial target object by an unmanned aerial vehicle based on a virtual camera.
Background
With the development of unmanned aerial vehicle technology, unmanned aerial vehicles are increasingly widely used in industry, agriculture, the service sector and daily entertainment. Using an unmanned aerial vehicle to autonomously track and monitor an object of interest in the air makes full use of its strong maneuverability, wide adaptability and high cost-effectiveness, and allows "low, slow and small" aircraft in a no-fly zone to be recorded and monitored so that subsequent interception can be assisted. The tracking process mainly comprises the following steps:
1. Acquire environmental images with the onboard camera of the unmanned aerial vehicle;
2. Detect the target object in the images acquired by the onboard camera with a target detection algorithm;
3. Pass the position coordinates of the target object in the image to an image-based unmanned aerial vehicle control algorithm;
4. Transmit the desired control signal generated by the image-based control algorithm to the unmanned aerial vehicle flight control;
5. The flight control steers the unmanned aerial vehicle so that it continuously tracks the target object.
At present, the target detection algorithm adopted when an unmanned aerial vehicle tracks an aerial target object (step 2) is generally based on traditional image features or machine learning, and can only identify target objects whose image features are known or for which it has been trained; rapid switching between different targets is therefore difficult, which limits application in actual scenes. Moreover, the image-based control algorithm (step 3) generates control signals directly from the position coordinates of the target object on the imaging plane of the onboard camera; when tracking a non-cooperative aerial target, rapid adjustment of the unmanned aerial vehicle's own attitude makes these control signals shake severely, so that the target object is lost and tracking fails.
Disclosure of Invention
In order to solve the technical problems, the invention provides a method and a system for tracking an aerial target object by an unmanned aerial vehicle based on a virtual camera.
In order to solve the technical problems, the invention adopts the following technical scheme:
a tracking method of an unmanned aerial vehicle to an aerial target object based on a virtual camera comprises the following steps:
step one: acquiring an environment image containing a target object through an onboard camera of the unmanned aerial vehicle;
step two: transmitting the environment image to a ground end operator;
Step three: a ground end operator frames and selects a target object to be tracked in an environment image;
Step four: when no target object is selected, continuing to acquire and transmit environment images and waiting for the ground-end operator to frame a target object; when a target object is selected, continuously tracking the framed target object in the subsequent image sequence through a kernel correlation filtering algorithm;
Step five: calculating the coordinates (u2, v2) of the target object in the virtual camera plane from its coordinates (u1, v1) in the onboard camera plane:

u2 = λ·u1·cosψ / (u1·sinψ + λ), v2 = λ·v1·cosθ / (v1·sinθ + λ);
wherein λ is the focal length of the onboard camera, θ is the pitch angle of the unmanned aerial vehicle, ψ is the yaw angle of the unmanned aerial vehicle, and the virtual camera plane does not change with the attitude of the unmanned aerial vehicle;
Step six: calculating the speed control signal (Vx, Vy, Vz) of the unmanned aerial vehicle in the inertial coordinate system from the coordinates (u2, v2) of the target object in the virtual camera plane:

Vx = k1·d, Vy = k2·u2, Vz = k3·v2;
wherein k1, k2, k3 are control parameters, and d is the distance between the unmanned aerial vehicle and the target object;
Step seven: calculating and generating the bottom-layer control signal of the unmanned aerial vehicle from the speed control signal (Vx, Vy, Vz); the bottom-layer control signal makes the unmanned aerial vehicle continuously track the target object.
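Steps five through seven can be sketched in a few lines of Python. This is an illustrative sketch, not the patent's implementation: the function names and the focal length and gain values are hypothetical, and angles are assumed to be in radians.

```python
import math

def virtual_camera_coords(u1, v1, lam, theta, psi):
    """Step five: project the target's onboard-camera coordinates (u1, v1)
    onto the attitude-invariant virtual camera plane.
    lam: camera focal length; theta: pitch angle (rad); psi: yaw angle (rad)."""
    u2 = lam * u1 * math.cos(psi) / (u1 * math.sin(psi) + lam)
    v2 = lam * v1 * math.cos(theta) / (v1 * math.sin(theta) + lam)
    return u2, v2

def velocity_command(u2, v2, d, k1=0.5, k2=0.01, k3=0.01):
    """Step six: speed control signal (Vx, Vy, Vz) in the inertial frame.
    d is the measured drone-to-target distance; k1..k3 are tuning gains."""
    return k1 * d, k2 * u2, k3 * v2

# With zero pitch and yaw the virtual plane coincides with the camera plane,
# so the coordinates pass through unchanged.
u2, v2 = virtual_camera_coords(10.0, 5.0, lam=400.0, theta=0.0, psi=0.0)
print(u2, v2)                       # prints: 10.0 5.0
print(velocity_command(u2, v2, d=8.0))
```

Step seven, generating the bottom-layer attitude and thrust signals from this velocity command, is left to the flight control and is not modeled here.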
A virtual camera-based unmanned aerial vehicle tracking system for an aerial target, comprising:
The data acquisition unit, namely the onboard camera of the unmanned aerial vehicle, is used for acquiring an environment image containing the target object;
The airborne computing unit runs a kernel correlation filtering algorithm after the target object is selected, and continuously tracks the selected target object in a subsequent image sequence acquired by an airborne camera; calculating the coordinates of the target object on the virtual camera plane by using the coordinates of the target object on the plane of the airborne camera, and calculating a speed control signal of the unmanned aerial vehicle in an inertial coordinate system by using the coordinates of the target object on the plane of the virtual camera; calculating and generating a bottom layer control signal of the unmanned aerial vehicle through the speed control signal, and transmitting the bottom layer control signal to the flight control unit;
the flight control unit controls the unmanned aerial vehicle according to the bottom control signal, so that the unmanned aerial vehicle can continuously track the target object;
The wireless transmission unit is used for transmitting the acquired image and unmanned aerial vehicle state data to the ground-end image display remote control unit;
The image display remote control unit is used for displaying an environment image and unmanned aerial vehicle state data so that ground terminal operators can frame and select a target object to be tracked in the environment image and return position coordinates of a selection frame;
the data acquisition unit and the flight control unit are both in communication connection with the airborne computing unit; the data acquisition unit, the airborne calculation unit and the flight control unit are in communication connection with the image display remote control unit through the wireless transmission unit.
Compared with the prior art, the invention has the beneficial technical effects that:
The invention changes the target detection algorithm adopted when existing unmanned aerial vehicles track aerial target objects, and instead adopts an image tracking algorithm combined with manual selection: after the image acquired by the data acquisition unit is transmitted to the image display remote control unit at the ground end, the ground-end operator manually selects the target to be tracked in the image, and the onboard computing unit of the unmanned aerial vehicle then runs the image tracking algorithm to continuously lock the target in the images acquired by the data acquisition unit. The unmanned aerial vehicle's choice of aerial targets to track is thereby more diversified, which promotes its practical application.
The invention also changes the image-based control algorithm of existing unmanned aerial vehicles and adopts a control algorithm based on a virtual camera: after the onboard computing unit obtains the coordinates of the target object on the camera plane, it calculates the coordinates of the target object on the virtual camera plane; an unmanned aerial vehicle speed controller is designed on this basis, and its control signals are transmitted to the unmanned aerial vehicle flight control, which further calculates and generates the bottom-layer control signals acting on the unmanned aerial vehicle. The attitude adjustment of the unmanned aerial vehicle is thereby smoother, and the tracking of the aerial target object more stable and efficient.
Drawings
FIG. 1 is a schematic diagram of a tracking system of the present invention;
FIG. 2 is a flow chart of the tracking method of the present invention.
Detailed Description
A preferred embodiment of the present invention will be described in detail with reference to the accompanying drawings.
Term interpretation:
Unmanned aerial vehicle flight control: the bottom-layer controller of the unmanned aerial vehicle, which converts outer-loop (position) control signals into inner-loop (attitude) control signals.
Unmanned aerial vehicle data/image transmission: the wireless transmission equipment that transmits the images acquired by the onboard camera of the unmanned aerial vehicle and the state data of the unmanned aerial vehicle to the ground station.
Inertial coordinate system: a coordinate system fixed to the earth, whose Z axis points vertically toward the earth's center, X axis points due north, and Y axis points due east.
As shown in fig. 1, the tracking system of the present invention includes: the system comprises a data acquisition unit, an onboard computing unit, a flight control unit, a wireless transmission unit and an image display remote control unit.
The data acquisition unit is the onboard camera of the unmanned aerial vehicle, the onboard computing unit is the onboard computer of the unmanned aerial vehicle, the wireless transmission unit is the unmanned aerial vehicle data/image transmission, and the flight control unit is the unmanned aerial vehicle flight control.
The data acquisition unit and the flight control unit are connected with the airborne calculation unit, and the airborne calculation unit, the data acquisition unit and the flight control unit are connected with the image display remote control unit on the ground end through the wireless transmission unit.
The data acquisition unit is fixedly installed on the unmanned aerial vehicle. The video image acquired by the camera is transmitted to the image display remote control unit at the ground end through the wireless transmission unit, and a ground-end operator frames the area of the target to be tracked in the transmitted image; after the target object is selected, a kernel correlation filtering algorithm is run on the airborne computing unit of the unmanned aerial vehicle, so that the target object is continuously tracked in the subsequent image sequence acquired by the data acquisition unit.
The kernel correlation filtering algorithm is used for continuously tracking the target object in the subsequent image sequence by calculating the correlation between the image area of the target object to be tracked selected by a ground end operator and different areas of each frame of image in the subsequent image sequence and selecting the area with the highest correlation as the area where the target object is located in the frame of image.
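The "highest-correlation window" idea above can be illustrated with a deliberately simplified sketch. Note that this is not the actual kernel correlation filter, which evaluates a kernelized ridge regression efficiently in the Fourier domain; the exhaustive search below only demonstrates the selection criterion, with plain Python lists standing in for grayscale images.

```python
def correlation(a, b):
    """Inner-product correlation between two equal-size image patches."""
    return sum(x * y
               for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def track(template, frame):
    """Slide the template over the frame and return the (row, col) of the
    window whose correlation with the template is highest, i.e. the area
    taken as the target's location in this frame."""
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("-inf"), (0, 0)
    for r in range(len(frame) - th + 1):
        for c in range(len(frame[0]) - tw + 1):
            window = [row[c:c + tw] for row in frame[r:r + th]]
            score = correlation(template, window)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

template = [[1, 2],
            [3, 4]]                       # operator-selected target patch
frame = [[0, 0, 0, 0],
         [0, 1, 2, 0],
         [0, 3, 4, 0],
         [0, 0, 0, 0]]                    # next frame in the sequence
print(track(template, frame))             # prints: (1, 1)
```

A production tracker would also normalize the correlation and update the template over time; KCF additionally learns discriminative filter weights rather than matching raw pixel values.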
The airborne computing unit uses the coordinates of the target object on the camera plane (in each frame of image) to calculate the coordinates of the target object on the virtual camera plane; an unmanned aerial vehicle speed controller is designed on this basis, and its control signals are transmitted to the unmanned aerial vehicle flight control, which calculates and generates the bottom-layer control signals acting on the unmanned aerial vehicle, so that the unmanned aerial vehicle continuously tracks the target object.
Converting the coordinates of the target object from the camera plane to the virtual camera plane compensates, to a certain extent, for the severe shaking of image-error-based tracking control signals caused by the maneuvers of the unmanned aerial vehicle during tracking, so that the attitude adjustment of the unmanned aerial vehicle is smoother and the tracking of the target object is more stable and efficient.
The virtual camera plane does not change with the attitude of the unmanned aerial vehicle; in this embodiment, the virtual camera plane is always perpendicular to the X axis of the inertial coordinate system (and therefore perpendicular to the ground plane).
As shown in fig. 2, the working process of the tracking system, that is, the method for tracking the aerial target object by the unmanned aerial vehicle based on the virtual camera, includes the following steps:
S1: the data acquisition unit of the unmanned aerial vehicle acquires an environment image containing the target object;
S2: the wireless transmission unit transmits the acquired image to the image display remote control unit at the ground end;
S3: a ground-end operator frames the target object to be tracked in the image displayed by the ground-end image display remote control unit (the ground station display equipment);
S4: when no target object is selected, return to S1; when a target object has been selected, proceed to S5;
S5: a kernel correlation filtering algorithm is run on the airborne computing unit to continuously track the target object in the subsequent image sequence acquired by the data acquisition unit;
S6: the on-board calculation unit calculates the coordinates (u2, v2) of the target object in the virtual camera plane from its coordinates (u1, v1) in the on-board camera plane:

u2 = λ·u1·cosψ / (u1·sinψ + λ), v2 = λ·v1·cosθ / (v1·sinθ + λ);

wherein λ is the focal length of the data acquisition unit, θ is the pitch angle of the unmanned aerial vehicle, and ψ is the yaw angle of the unmanned aerial vehicle.
S7: the airborne calculation unit calculates the speed control signal (Vx, Vy, Vz) of the unmanned aerial vehicle in the inertial coordinate system from the coordinates (u2, v2) of the target object in the virtual camera plane:

Vx = k1·d, Vy = k2·u2, Vz = k3·v2;

wherein k1, k2, k3 are control parameters, and d is the distance between the unmanned aerial vehicle and the aerial target object, which can generally be measured by an onboard sensor of the unmanned aerial vehicle (such as a binocular camera or a laser radar). The speed control signal (Vx, Vy, Vz) calculated on the airborne calculation unit is transmitted to the unmanned aerial vehicle flight control, which calculates and generates the bottom-layer control signal acting on the unmanned aerial vehicle, so that the unmanned aerial vehicle continuously tracks the target object.
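The stability benefit of this proportional law can be illustrated with a toy one-dimensional closed loop: under Vy = k2·u2 the target's horizontal offset on the virtual camera plane decays geometrically toward zero rather than oscillating. The gain and time step below are made-up illustrative values, not ones given by the patent.

```python
# Toy 1-D closed loop for the lateral channel Vy = k2 * u2.
k2 = 0.5        # hypothetical control gain
dt = 0.1        # hypothetical control period (s)
u2 = 40.0       # initial horizontal offset on the virtual camera plane

history = [u2]
for _ in range(50):
    vy = k2 * u2           # S7: speed command from the virtual-plane coordinate
    u2 -= vy * dt          # lateral motion of the drone shrinks the offset
    history.append(u2)

# The offset decreases monotonically -- no oscillation or overshoot.
assert all(a > b >= 0 for a, b in zip(history, history[1:]))
print(round(history[-1], 2))   # prints: 3.08
```

Each step multiplies the offset by (1 − k2·dt) = 0.95, so with these values the 40-pixel offset decays to about 3 pixels after 50 control cycles.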
The unmanned aerial vehicle in the invention can be replaced by any other mobile robot: unmanned aerial vehicles of different types, including four-rotor, six-rotor, eight-rotor and vertical take-off and landing unmanned aerial vehicles; ground mobile robots, including two-wheeled differential robots, four-wheeled robots and four-legged robots; or surface robots, submersible robots and the like.
The image processing algorithms and drone control algorithms running on the on-board computing unit of the present invention may be migrated to any other computing device, such as a ground computer, server, personal notebook computer, etc.
The kernel correlation filtering algorithm of the invention may be replaced by any other similar or improved image tracking algorithm based on manual target selection.
The setting of the virtual camera plane (always perpendicular to the X-axis of the inertial coordinate system) in the present invention can be replaced with any other setting, the main idea being that the virtual camera plane does not change due to the change of the pose of the drone.
The control mode of the unmanned aerial vehicle in the invention can be changed from the speed of the unmanned aerial vehicle in an inertial coordinate system to any other control mode, such as the position, the attitude angle, the angular speed, the moment or the external force of the unmanned aerial vehicle, and the like.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution; this manner of description is adopted merely for clarity. The specification should be taken as a whole, and the technical solutions of the embodiments may be combined appropriately to form other implementations that can be understood by those skilled in the art.
Claims (2)
1. A tracking method of an unmanned aerial vehicle to an aerial target object based on a virtual camera comprises the following steps:
step one: acquiring an environment image containing a target object through an onboard camera of the unmanned aerial vehicle;
step two: transmitting the environment image to a ground end operator;
Step three: a ground end operator frames and selects a target object to be tracked in an environment image;
Step four: when no target object is selected, continuing to acquire and transmit environment images and waiting for the ground-end operator to frame a target object; when a target object is selected, continuously tracking the framed target object in the subsequent image sequence through a kernel correlation filtering algorithm;
Step five: calculating the coordinates (u2, v2) of the target object in the virtual camera plane from its coordinates (u1, v1) in the onboard camera plane:

u2 = λ·u1·cosψ / (u1·sinψ + λ), v2 = λ·v1·cosθ / (v1·sinθ + λ);
wherein λ is the focal length of the onboard camera, θ is the pitch angle of the unmanned aerial vehicle, ψ is the yaw angle of the unmanned aerial vehicle, and the virtual camera plane does not change with the attitude of the unmanned aerial vehicle;
Step six: calculating the speed control signal (Vx, Vy, Vz) of the unmanned aerial vehicle in the inertial coordinate system from the coordinates (u2, v2) of the target object in the virtual camera plane:

Vx = k1·d, Vy = k2·u2, Vz = k3·v2;
wherein k1, k2, k3 are control parameters, and d is the distance between the unmanned aerial vehicle and the target object;
Step seven: calculating and generating the bottom-layer control signal of the unmanned aerial vehicle from the speed control signal (Vx, Vy, Vz); the bottom-layer control signal makes the unmanned aerial vehicle continuously track the target object.
2. A virtual camera-based unmanned aerial vehicle tracking system for an aerial target, comprising:
The data acquisition unit, namely the onboard camera of the unmanned aerial vehicle, is used for acquiring an environment image containing the target object;
The airborne computing unit runs a kernel correlation filtering algorithm after the target object is selected, and continuously tracks the selected target object in a subsequent image sequence acquired by an airborne camera; calculating the coordinates of the target object on the virtual camera plane by using the coordinates of the target object on the plane of the airborne camera, and calculating a speed control signal of the unmanned aerial vehicle in an inertial coordinate system by using the coordinates of the target object on the plane of the virtual camera; calculating and generating a bottom layer control signal of the unmanned aerial vehicle through the speed control signal, and transmitting the bottom layer control signal to the flight control unit;
the flight control unit controls the unmanned aerial vehicle according to the bottom control signal, so that the unmanned aerial vehicle can continuously track the target object;
the wireless transmission unit is used for transmitting the acquired image and unmanned aerial vehicle state data to the image display remote control unit;
The image display remote control unit is used for displaying an environment image and unmanned aerial vehicle state data so that ground terminal operators can frame and select a target object to be tracked in the environment image and return position coordinates of a selection frame;
the data acquisition unit and the flight control unit are both in communication connection with the airborne computing unit; the data acquisition unit, the airborne calculation unit and the flight control unit are in communication connection with the image display remote control unit through the wireless transmission unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210521474.5A CN114897935B (en) | 2022-05-13 | Method and system for tracking aerial target object by unmanned aerial vehicle based on virtual camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114897935A CN114897935A (en) | 2022-08-12 |
CN114897935B true CN114897935B (en) | 2024-07-12 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108363946A (en) * | 2017-12-29 | 2018-08-03 | 成都通甲优博科技有限责任公司 | Face tracking system and method based on unmanned plane |
CN109753076A (en) * | 2017-11-03 | 2019-05-14 | 南京奇蛙智能科技有限公司 | A kind of unmanned plane vision tracing implementing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |