WO2017177542A1 - Target tracking method, device and system - Google Patents

Target tracking method, device and system

Info

Publication number
WO2017177542A1
WO2017177542A1 (PCT/CN2016/086312)
Authority
WO
WIPO (PCT)
Prior art keywords
aircraft
coordinate system
yaw
roll
pitch
Prior art date
Application number
PCT/CN2016/086312
Other languages
English (en)
French (fr)
Inventor
高鹏
朱棣
Original Assignee
高鹏
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 高鹏 filed Critical 高鹏
Publication of WO2017177542A1 publication Critical patent/WO2017177542A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • the present invention relates to the field of unmanned aerial vehicles, and in particular, to a target tracking method, apparatus and system.
  • An unmanned aerial vehicle, also known as a drone, is an unmanned aircraft that is radio-controlled or flown under autonomous or semi-autonomous program control. Because of its low cost, absence of casualty risk and good maneuverability, it is widely used in aerial photography, geological survey, line inspection, emergency rescue and other fields. Among them, drones are widely used in aerial photography owing to their unique maneuvering advantages. With the development of intelligent technology, higher requirements have been placed on the intelligent functions of unmanned aerial vehicles, such as requiring the drone to automatically track the photographed target.
  • patent application CN105100728A and patent CN103149939B both use a vision sensor as the target tracking sensor.
  • the solution of patent application CN104965522A is implemented based on GPS standard positioning.
  • however, existing video recognition systems are complicated in structure, and their algorithms lose the target when the line of sight is briefly occluded or the target moves at high speed, which imposes certain limitations.
  • using a vision sensor as the target tracking sensor also requires a lengthy tracking algorithm, which is not conducive to the integration of the drone's onboard systems.
  • target tracking systems based on GPS positioning have limited positioning and tracking accuracy, making it difficult to keep the image of the target stably locked in the shooting device.
  • the technical problem to be solved by the present invention is to provide a new target tracking method, device and system, which can improve the positioning and tracking accuracy of the drone during the shooting process.
  • the present invention provides a target tracking method, including:
  • a target control amount is obtained according to the current attitude information and the target attitude information of the aircraft, and the target control amount is used to indicate an adjustment amount for the pan/tilt motion control.
  • the reference coordinate system is established according to the position information of the object and the aircraft, and the first position vector from the object to the aircraft in the reference coordinate system is obtained.
  • establishing the reference coordinate system according to the position information of the object and the aircraft in the geographic coordinate system, and obtaining the first position vector, includes:
  • lon0 = 2*π*a/360
  • lat0 = (2*π*c + 4*(a-c))/360
  • π is the circle constant (pi)
  • a is the semi-major axis radius of the earth
  • c is the semi-minor axis radius of the earth
  • lon represents longitude
  • lat represents latitude
  • hei represents height
  • t represents the photographed target
  • u represents the aircraft
  • cos() represents the cosine function
  • x_1, y_1, z_1 represent coordinate values in the reference coordinate system.
  • calculating, according to the current attitude information of the aircraft, a direction vector from the antenna position of the aircraft to the pan/tilt position in the reference coordinate system includes:
  • calculating, based on the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system.
  • calculating, according to the current attitude information of the aircraft, the direction vector from the antenna phase center of the aircraft to the motion center of the pan-tilt in the reference coordinate system includes:
  • Pitch_u represents the current pitch angle of the aircraft u
  • Roll_u represents the current roll angle of the aircraft u
  • Yaw_u represents the current yaw angle of the aircraft u
  • sin() represents the sine function
  • dA(dx, dy, dz) represents the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
  • calculating, according to the first position vector and the direction vector, a second position vector from the target to the pan-tilt position in the reference coordinate system, and converting the second position vector into target attitude information of the aircraft, includes:
  • the second position vector is converted into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system by using Equation 6 below: Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))
  • arcsin() is an inverse sine function.
  • the target control amount is obtained according to the current attitude information and the target attitude information of the aircraft, including:
  • the invention also provides a target tracking device, comprising:
  • a first position vector module configured to establish a reference coordinate system according to the position information of the object and the aircraft, to obtain a first position vector from the object to the aircraft in the reference coordinate system;
  • a direction vector module configured to calculate, according to current attitude information of the aircraft, a direction vector from an antenna position of the aircraft to a pan/tilt position in the reference coordinate system;
  • a second position vector module configured to calculate, according to the first position vector and the direction vector, a second position vector from the object to the pan-tilt position in the reference coordinate system, and convert the second position vector into target attitude information of the aircraft;
  • a control amount module configured to obtain a target control amount according to the current attitude information and the target attitude information of the aircraft, where the target control amount is used to indicate an adjustment amount for the pan/tilt motion control.
  • the first location vector module is further configured to acquire location information of the target in a geographic coordinate system from a GNSS beacon of the target;
  • acquire position information of the aircraft in the geographic coordinate system from the GNSS receiver and antenna of the aircraft; and establish the reference coordinate system according to the position information of the target and the aircraft in the geographic coordinate system to obtain the first position vector.
  • the first location vector module is further configured to establish the reference coordinate system using Equations 1 and 2 below, according to the position information P_t(lon_t, lat_t, hei_t) of the target in the geographic coordinate system and the position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, to obtain the first position vector A_1(x_1, y_1, hei_u - hei_t),
  • x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
  • y_1 = (lat_u - lat_t) * lat0    (Equation 2)
  • z_1 = hei_u - hei_t    (Equation 3)
  • lon0 = 2*π*a/360
  • lat0 = (2*π*c + 4*(a-c))/360
  • π is the circle constant (pi)
  • a is the semi-major axis radius of the earth
  • c is the semi-minor axis radius of the earth
  • lon represents longitude
  • lat represents latitude
  • hei represents height
  • t represents the photographed target
  • u represents the aircraft
  • cos() represents the cosine function
  • x_1, y_1, z_1 represent coordinate values in the reference coordinate system.
  • the direction vector module is further configured to obtain current attitude information of the aircraft from an attitude sensor, and to calculate, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the motion center of the gimbal in the reference coordinate system.
  • the direction vector module is further configured to calculate, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, a direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the motion center of the gimbal in the reference coordinate system,
  • dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
  • dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
  • dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
  • Pitch_u represents the current pitch angle of the aircraft u, Roll_u represents the current roll angle of the aircraft u, Yaw_u represents the current yaw angle of the aircraft u, and sin() represents the sine function
  • dA(dx, dy, dz) represents the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
  • arcsin() is an inverse sine function.
  • the control amount module is further configured to obtain the target control amount Output of the three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information: Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7)
  • the present invention also provides a target tracking system for an unmanned aerial vehicle, comprising: a pan-tilt controller disposed on the unmanned aerial vehicle and a positioning device disposed on the object to be photographed, wherein the pan-tilt controller adopts a target tracking device of any one of the embodiments of the present invention.
  • the positioning device of the target is a GNSS beacon, and the GNSS beacon is used to acquire location information of the target;
  • the target tracking system further includes:
  • the airborne GNSS receiver and the antenna obtain the position information of the unmanned aerial vehicle by receiving the GNSS signal;
  • An attitude sensor for detecting posture information of the unmanned aerial vehicle
  • a pan-tilt and a photographing device, the pan-tilt being used to control the shooting attitude of the photographing device mounted on the pan-tilt;
  • the pan/tilt controller is in communication with the GNSS receiver and antenna, the GNSS beacon, and the attitude sensor for receiving location information of the unmanned aerial vehicle from the GNSS receiver and antenna, Receiving, by the GNSS beacon, position information of the object to be photographed, and receiving posture information of the unmanned aerial vehicle from an attitude sensor;
  • the pan/tilt controller is further connected to the pan-tilt for transmitting a target control amount to the pan-tilt to control the shooting attitude of the photographing device on the pan-tilt.
  • the invention calculates the relative spatial pointing relationship according to the acquired position information and, combined with the aircraft attitude obtained by the drone's onboard sensors, determines the pan-tilt output of the shooting device to achieve tracking of the target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high, and all-weather autonomous tracking can be realized.
  • the present embodiment uses the high-precision GNSS positioning result to obtain accurate position information of the subject and the drone, which can ensure further improvement of the calculation accuracy.
  • FIG. 1 is a block diagram showing the structure of a target tracking system of an unmanned aerial vehicle according to an embodiment of the present invention
  • FIG. 2a is a schematic flow chart of a target tracking method according to an embodiment of the invention.
  • FIG. 2b is a schematic diagram showing the principle of a target tracking method according to an embodiment of the invention.
  • FIG. 3 is a schematic flowchart diagram of a target tracking method according to another embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a target tracking device according to an embodiment of the present invention.
  • FIG. 5 is a block diagram showing the structure of a target tracking device according to an embodiment of the present invention.
  • GNSS stands for Global Navigation Satellite System.
  • PPP stands for Precise Point Positioning.
  • the invention is based on GNSS technology and is advantageous for achieving precise positioning.
  • the target tracking system of the unmanned aerial vehicle mainly includes: a pan-tilt controller 114 disposed on the unmanned aerial vehicle 11 (abbreviated as a drone) and a positioning device disposed on the object 13 to be photographed.
  • the pan/tilt controller 114 can be configured independently of the controller of the drone, or can be implemented by the controller of the drone, which is not limited by the present invention.
  • the UAV target tracking system of the embodiment includes: a GNSS beacon 131, communication devices 133 and 113, an onboard GNSS receiver and antenna 111, a pan-tilt 112, a pan-tilt controller 114, an attitude sensor 115, and a shooting device 116.
  • the GNSS beacon 131 is equipped on the subject 13 and the GNSS can be used to obtain the precise position of the subject.
  • the onboard GNSS receiver and antenna 111, the pan/tilt head 112, the pan/tilt controller 114, the attitude sensor 115, and the photographing device 116 are provided on the target tracking device 11 based on the drone.
  • the GNSS beacon 131 communicates with the communication device 113 of the target tracking device 11 through the communication device 133, thereby providing the pan/tilt controller 114 with real-time accurate position information of the subject 13.
  • Communication devices 113 and 133 can be regarded as the two ends of one communication link, used to implement communication between the GNSS beacon 131 and the pan-tilt controller 114, communicating over the air by means of, for example, a 433 MHz radio signal.
  • the functions of the components installed in the drone are as follows:
  • the onboard GNSS receiver and antenna 111 is coupled to the pan/tilt controller 114 to obtain the precise position of the drone by receiving the GNSS signal, which can actually correspond to the spatial position of the phase center of the receiving antenna.
  • the pan/tilt head 112 is preferably constituted by a motor or a steering gear having a plurality of degrees of freedom, and can control the mounting device mounted thereon to perform multi-degree of freedom swinging.
  • the pan/tilt controller 114 is preferably composed of a microcontroller, its peripheral circuits, and a pan/tilt motor drive circuit, and receives the information transmitted from each part for calculation processing, and then controls the pan/tilt to swing.
  • the attitude sensor 115 is preferably constituted by a MEMS sensor, and provides the pan/tilt controller 114 with posture information of three directions of a Pitch (pitch angle), a Roll (rolling angle), and a Yaw (yaw angle) of the drone.
  • the photographing device 116 is preferably composed of a camera or a video camera.
  • the attitude sensor 115 is coupled to the pan/tilt controller 114 for detecting and providing the current attitude information of the drone.
  • the photographing device 116 for example, a camera is fixed on the pan/tilt head 112, and the pan-tilt controller 114 controls the wobble of the pan-tilt head 112 to control the photographing posture of the photographing device, thereby realizing tracking of the object to be photographed.
  • FIG. 2a is a flow chart showing a target tracking method according to an embodiment of the invention. As shown in FIG. 2a, the target tracking method may specifically include the following steps:
  • Step 201 Establish a reference coordinate system according to the position information of the target and the aircraft, and obtain a first position vector from the target to the aircraft in the reference coordinate system;
  • Step 202 Calculate, according to current posture information of the aircraft, a direction vector from an antenna position of the aircraft to a PTZ position in the reference coordinate system;
  • Step 203 Calculate, according to the first position vector and the direction vector, a second position vector from the target to the pan-tilt position in the reference coordinate system, and convert the second position vector into target attitude information of the aircraft;
  • Step 204 Obtain a target control amount according to the current posture information and the target posture information of the aircraft, where the target control amount is used to indicate an adjustment amount for the pan/tilt motion control.
  • an airborne GNSS receiver and an antenna 43 are disposed above the body of the drone 41, and a pan/tilt head 45 and a photographing device (not shown) are disposed under the body of the drone 41.
  • the photographing device is usually loaded on the pan/tilt 45.
  • the first position vector from the target 47 to the antenna phase center (for example, the geometric center of the antenna) is A_1;
  • the direction vector from the antenna phase center to the pan-tilt motion center (for example, the rotation center of the pan-tilt) is dA';
  • the second position vector from the target 47 to the pan-tilt motion center is A_2.
  • the second position vector, obtained by correcting the first position vector with the direction vector, is the vector representation between the actual pan-tilt motion center and the target, and it eliminates the positional offset that actually exists, as installed, between the GNSS antenna and the pan-tilt motion center.
  • step 201 may include:
  • Step 2011 Obtain location information of the target in a geographic coordinate system from a GNSS beacon of the target;
  • Step 2012 Obtain location information of the aircraft in a geographic coordinate system from a GNSS receiver and an antenna of the aircraft;
  • Step 2013 Establish a reference coordinate system according to the position information of the target and the aircraft in the geographic coordinate system, to obtain the first position vector.
  • the step 2013 may include: establishing a reference coordinate system, for example a Cartesian coordinate system, using Equations 1 and 2 below according to the position information P_t(lon_t, lat_t, hei_t) of the target in the geographic coordinate system and the position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, and obtaining in that reference coordinate system the first position vector A_1(x_1, y_1, hei_u - hei_t) from the target to the aircraft,
  • x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
  • y_1 = (lat_u - lat_t) * lat0    (Equation 2)
  • z_1 = hei_u - hei_t    (Equation 3)
  • lon0 = 2*π*a/360
  • lat0 = (2*π*c + 4*(a-c))/360
  • π is the circle constant (pi)
  • a is the semi-major axis radius of the earth
  • c is the semi-minor axis radius of the earth
  • lon represents longitude
  • lat represents latitude
  • hei represents height
  • t represents the photographed target
  • u represents the aircraft
  • cos() represents the cosine function
  • x_1, y_1, z_1 represent coordinate values in the reference coordinate system.
  • step 202 may include:
  • Step 2021 Obtain current attitude information of the aircraft from an attitude sensor.
  • Step 2022 Calculate a direction vector from the antenna phase center of the aircraft to the motion center of the pan-tilt in the reference coordinate system according to the current posture information of the aircraft.
  • the step 2022 may include: calculating, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, the direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the motion center of the gimbal in the reference coordinate system,
  • dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
  • dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
  • dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
  • Pitch_u represents the current pitch angle of the aircraft u, Roll_u represents the current roll angle of the aircraft u, Yaw_u represents the current yaw angle of the aircraft u, and sin() represents the sine function
  • dA(dx, dy, dz) represents the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
  • step 203 may include:
  • Step 2031 Calculate, according to the first position vector A_1(x_1, y_1, hei_u - hei_t) and the direction vector dA'(dx', dy', dz'), the second position vector A_2 = (x_2, y_2, z_2) = (A_1 + dA') from the target to the pan-tilt motion center;
  • Step 2032 Convert the second position vector into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system by using Equation 6 below:
  • Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))    (Equation 6)
  • arcsin() is the arcsine function.
  • the step 204 may include: obtaining the target control amount Output of the three-axis pan-tilt according to Equation 7 below, from the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) and the target attitude information of the aircraft,
  • Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7)
  • the target tracking method of the embodiment calculates the relative spatial pointing relationship according to the acquired position information and, combined with the aircraft attitude obtained by the drone's onboard sensors, determines the camera pan-tilt output to achieve tracking of the target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high, enabling all-weather autonomous tracking.
  • the present embodiment uses high-precision GNSS positioning results to obtain accurate position information of the target and the drone, obtains an accurate relative positional relationship between the target and the optical axis of the camera through a simple calculation, and tightly couples it with the sensor information of the drone to control the camera pan-tilt, which ensures further improvement of the calculation accuracy and achieves tracking of the target.
  • FIG. 3 is a flow chart showing a target tracking method according to another embodiment of the present invention.
  • the GNSS technology is used to implement the photographic tracking method.
  • formulas in this embodiment that are the same as in Embodiment 2 have the same meanings and are not repeated here.
  • the target tracking method may specifically include the following steps:
  • Step 301 The GNSS beacon receives the GNSS signal for precise positioning, and obtains the position P t (lon t , lat t , hei t ) of the target (usually lat t , lon t is in degrees, and hei t is in meters).
  • Step 302 The communication device encodes the position of the target and modulates it onto a 433 MHz radio signal for transmission;
  • Step 303 The pan-tilt controller receives the position P_t of the target transmitted from the communication device, and at the same time obtains the drone position P_u(lon_u, lat_u, hei_u) from the onboard GNSS receiver and antenna.
  • P_t and P_u are coordinate values in the geographic coordinate system. Then a right-handed coordinate system is established with P_t as the origin and due east as the x-axis (at this point the target and the drone's movement area can be assumed to be a plane rather than an ellipsoid); this coordinate system may also be called a local Cartesian coordinate system.
  • the parameters in the above formula may also adopt other standards similar to the WGS-84 standard such as Xi'an 54, Beijing 84, CGCS2000, etc., and the values calculated by different standards may be slightly different, and the present invention does not limit the specific calculation standard.
  • Step 304 The pan-tilt controller obtains the aircraft attitude Atti_u(Pitch_u, Roll_u, Yaw_u) measured by the attitude sensor. The direction vector from the antenna phase center of the onboard GNSS receiver and antenna module to the pan-tilt motion center can be expressed in the aircraft body coordinate system as dA(dx, dy, dz) (determined by the installation positions on the aircraft);
  • this direction vector can then be expressed in the coordinate system established in step 303 as dA'(dx', dy', dz'), where dA'(dx', dy', dz') can be calculated with reference to Equations 8, 4 and 5 of Embodiment 2.
  • the target control amount is output to the pan-tilt motor by, for example, a second-order control loop by the pan-tilt controller to achieve tracking of the target.
  • the target tracking method of the embodiment can use a high-precision GNSS positioning technology, such as PPP technology, to acquire the precise position of the target and the drone, and then calculate the relative spatial pointing relationship, and combine the aircraft attitude obtained by the unmanned aerial vehicle sensor.
  • the camera pan-tilt output is determined to achieve tracking of the target; the tracking accuracy is high and all-weather tracking can be achieved.
  • compared with existing visual tracking technology, it has the characteristics of simple calculation, high real-time performance and a wide range of application environments.
  • compared with existing GPS-based target tracking methods, it has the characteristics of high tracking precision and stable tracking.
  • FIG. 4 is a block diagram showing the structure of a target tracking device according to an embodiment of the invention. As shown in FIG. 4, the target tracking device may include:
  • a first position vector module 51 configured to establish a reference coordinate system according to the position information of the object and the aircraft, to obtain a first position vector from the object to the aircraft in the reference coordinate system;
  • a direction vector module 53 configured to calculate, according to current attitude information of the aircraft, a direction vector from an antenna position of the aircraft to a pan/tilt position in the reference coordinate system;
  • a second position vector module 55 configured to calculate, according to the first position vector and the direction vector, a second position vector from the object to the pan-tilt position in the reference coordinate system, and convert the second position vector into target attitude information of the aircraft;
  • the control quantity module 57 is configured to obtain a target control amount according to the current attitude information and the target attitude information of the aircraft, and the target control quantity is used to indicate an adjustment amount for the pan/tilt motion control.
  • the first location vector module 51 is further configured to acquire location information of the target in a geographic coordinate system from a GNSS beacon of the target; acquire the position information of the aircraft in the geographic coordinate system from the GNSS receiver and antenna of the aircraft; and establish the reference coordinate system according to the position information of the target and the aircraft in the geographic coordinate system to obtain the first position vector.
  • the first location vector module 51 is further configured to establish the reference coordinate system using Equations 1 and 2 below, according to the position information P_t(lon_t, lat_t, hei_t) of the target in the geographic coordinate system and the position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, to obtain the first position vector A_1(x_1, y_1, hei_u - hei_t),
  • x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
  • y_1 = (lat_u - lat_t) * lat0    (Equation 2)
  • z_1 = hei_u - hei_t    (Equation 3)
  • with lon0, lat0 and the other symbols as defined above.
  • the direction vector module 53 is further configured to obtain current attitude information of the aircraft from the attitude sensor, and to calculate, according to the current attitude information of the aircraft, the direction vector from the antenna phase center of the aircraft to the motion center of the gimbal in the reference coordinate system.
  • the direction vector module 53 is further configured to calculate, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, a direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the motion center of the gimbal in the reference coordinate system,
  • dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
  • dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
  • dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
  • Pitch_u represents the current pitch angle of the aircraft u, Roll_u represents the current roll angle of the aircraft u, Yaw_u represents the current yaw angle of the aircraft u, and sin() represents the sine function
  • dA(dx, dy, dz) represents the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
  • the second position vector is converted into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system using Equation 6: Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))
  • the control amount module 57 is further configured to obtain the target control amount Output of the three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information: Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7)
  • Each module of the target tracking device of the embodiment can be implemented by the pan-tilt controller of the UAV. The relative spatial pointing relationship is calculated according to the acquired position information and, combined with the aircraft attitude obtained by the UAV's onboard sensors, the camera pan-tilt output is determined to realize tracking of the target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high, and all-weather autonomous tracking can be realized.
  • FIG. 5 is a block diagram showing the structure of a target tracking device according to an embodiment of the present invention.
  • the target tracking apparatus 1100 can be a host server with computing capability, a personal computer (PC), or a portable computer or terminal that can be carried.
  • the specific embodiments of the present invention do not limit the specific implementation of the computing node.
  • the target tracking device 1100 includes a processor 1110, a communications interface 1120, a memory 1130, and a bus 1140.
  • the processor 1110, the communication interface 1120, and the memory 1130 complete communication with each other through the bus 1140.
  • Communication interface 1120 is for communicating with network devices, including, for example, a virtual machine management center, shared storage, and the like.
  • the processor 1110 is configured to execute a program.
  • the processor 1110 may be a central processing unit CPU, or an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
  • the memory 1130 is used to store files.
  • the memory 1130 may include a high speed RAM memory and may also include a non-volatile memory such as at least one disk memory.
  • Memory 1130 can also be a memory array.
  • the memory 1130 may also be partitioned, and the blocks may be combined into a virtual volume according to certain rules.
  • the above program may be program code including computer operating instructions.
  • the program can be specifically used to: perform the operations of the steps in Embodiment 2 or 3.
  • if the functions are implemented in the form of computer software and sold or used as a stand-alone product, then to a certain extent all or part of the technical solution of the present invention (for example, the part contributing to the prior art) can be considered to be embodied in the form of a computer software product.
  • such a computer software product is typically stored in a computer-readable non-volatile storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the various embodiments of the present invention.
  • the foregoing storage medium includes various media that can store program codes, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • the invention calculates the relative spatial pointing relationship according to the acquired position information and, combined with the aircraft attitude obtained by the drone's onboard sensors, determines the pan-tilt output of the shooting device so as to track the target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high and all-weather autonomous tracking can be achieved.
  • the present embodiment uses the high-precision GNSS positioning result to obtain accurate position information of the subject and the drone, which can ensure further improvement of the calculation accuracy.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Variable-Direction Aerials And Aerial Arrays (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

A target tracking method, device and system. The target tracking method includes: establishing a reference coordinate system according to position information of a photographed target and an aircraft, and obtaining a first position vector from the photographed target to the aircraft in the reference coordinate system (201); calculating, according to current attitude information of the aircraft, a direction vector from the antenna position of the aircraft to the pan-tilt position in the reference coordinate system (202); calculating, according to the first position vector and the direction vector, a second position vector from the photographed target to the pan-tilt position in the reference coordinate system, and converting the second position vector into target attitude information of the aircraft (203); and obtaining a target control amount according to the current attitude information and the target attitude information of the aircraft, the target control amount being used to indicate an adjustment amount for pan-tilt motion control (204). The method has high tracking accuracy and enables all-weather autonomous tracking.

Description

Target tracking method, device and system
Cross-reference
This application claims priority to Chinese patent application No. 201610225109.4, filed on April 12, 2016, the entire contents of which are incorporated herein by reference.
Technical field
The present invention relates to the technical field of unmanned aerial vehicles, and in particular to a target tracking method, device and system.
Background
An unmanned aerial vehicle (also called a drone or UAV) is an unmanned aircraft that is radio-controlled or flown under autonomous or semi-autonomous program control. Because of its low cost, absence of casualty risk and good maneuverability, it is widely used in aerial photography, geological survey, line inspection, emergency rescue and other fields. Among them, drones are widely used in aerial photography owing to their unique maneuvering advantages. With the development of intelligent technology, higher requirements have been placed on the intelligent functions of unmanned aerial vehicles, such as requiring the drone to automatically track a photographed target.
Generally speaking, tracking of a target by an aircraft is implemented with vision sensors or with standard GPS positioning. For example, the solutions of patent application CN105100728A and patent CN103149939B both use a vision sensor as the target tracking sensor, and the solution of patent application CN104965522A is implemented based on standard GPS positioning.
However, existing video recognition systems are complicated in structure, and their algorithms lose the target when the line of sight is briefly occluded or the target moves at high speed, which imposes certain limitations. Using a vision sensor as the target tracking sensor also requires a lengthy tracking algorithm, which is not conducive to integration into the drone's onboard systems. Target tracking systems based on GPS positioning have limited positioning and tracking accuracy, so it is difficult to keep the image of the photographed target stably locked in the shooting device.
Summary of the invention
Technical problem
In view of this, the technical problem to be solved by the present invention is to provide a new target tracking method, device and system that can improve the positioning and tracking accuracy of a drone during shooting.
Solution to the problem
In order to solve the above technical problem, the present invention provides a target tracking method, comprising:
establishing a reference coordinate system according to position information of a photographed target and an aircraft, and obtaining a first position vector from the photographed target to the aircraft in the reference coordinate system;
calculating, according to current attitude information of the aircraft, a direction vector from an antenna position of the aircraft to a pan-tilt position in the reference coordinate system;
calculating, according to the first position vector and the direction vector, a second position vector from the photographed target to the pan-tilt position in the reference coordinate system, and converting the second position vector into target attitude information of the aircraft;
obtaining a target control amount according to the current attitude information and the target attitude information of the aircraft, the target control amount being used to indicate an adjustment amount for pan-tilt motion control.
For the above method, in one possible implementation, establishing the reference coordinate system according to the position information of the photographed target and the aircraft, and obtaining the first position vector from the photographed target to the aircraft in the reference coordinate system, includes:
acquiring position information of the photographed target in a geographic coordinate system from a GNSS beacon of the photographed target;
acquiring position information of the aircraft in the geographic coordinate system from a GNSS receiver and antenna of the aircraft;
establishing the reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system to obtain the first position vector.
For the above method, in one possible implementation, establishing the reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system to obtain the first position vector includes:
establishing the reference coordinate system using Equations 1 and 2 below, according to position information P_t(lon_t, lat_t, hei_t) of the photographed target in the geographic coordinate system and position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, to obtain the first position vector A_1(x_1, y_1, z_1),
x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
y_1 = (lat_u - lat_t) * lat0    (Equation 2)
z_1 = hei_u - hei_t    (Equation 3)
where lon0 = 2*π*a/360, lat0 = (2*π*c + 4*(a-c))/360, π is the circle constant pi, a is the semi-major axis radius of the earth, c is the semi-minor axis radius of the earth, lon denotes longitude, lat denotes latitude, hei denotes height, t denotes the photographed target, u denotes the aircraft, cos() is the cosine function, and x_1, y_1, z_1 are coordinate values in the reference coordinate system.
For the above method, in one possible implementation, calculating, according to the current attitude information of the aircraft, the direction vector from the antenna position of the aircraft to the pan-tilt position in the reference coordinate system includes:
obtaining the current attitude information of the aircraft from an attitude sensor;
calculating, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system.
For the above method, in one possible implementation, calculating, according to the current attitude information of the aircraft, the direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system includes:
calculating, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, the direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system,
dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
where Pitch_u denotes the current pitch angle of the aircraft u, Roll_u denotes the current roll angle of the aircraft u, Yaw_u denotes the current yaw angle of the aircraft u, and sin() denotes the sine function;
dA(dx, dy, dz) denotes the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
For the above method, in one possible implementation, calculating, according to the first position vector and the direction vector, the second position vector from the photographed target to the pan-tilt position in the reference coordinate system, and converting the second position vector into the target attitude information of the aircraft, includes:
calculating, according to the first position vector A_1(x_1, y_1, hei_u - hei_t) and the direction vector dA'(dx', dy', dz'), the second position vector A_2 = (x_2, y_2, z_2) = (A_1 + dA') from the photographed target to the pan-tilt motion center;
converting the second position vector into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system using Equation 6 below:
Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))    (Equation 6)
where arcsin() is the arcsine function.
For the above method, in one possible implementation, obtaining the target control amount according to the current attitude information and the target attitude information of the aircraft includes:
obtaining the target control amount Output of a three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information,
Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7).
The present invention further provides a target tracking device, comprising:
a first position vector module, configured to establish a reference coordinate system according to position information of a photographed target and an aircraft, and obtain a first position vector from the photographed target to the aircraft in the reference coordinate system;
a direction vector module, configured to calculate, according to current attitude information of the aircraft, a direction vector from an antenna position of the aircraft to a pan-tilt position in the reference coordinate system;
a second position vector module, configured to calculate, in the reference coordinate system and according to the first position vector and the direction vector, a second position vector from the photographed target to the pan-tilt position, and convert the second position vector into target attitude information of the aircraft;
a control amount module, configured to obtain a target control amount according to the current attitude information and the target attitude information of the aircraft, the target control amount being used to indicate an adjustment amount for pan-tilt motion control.
For the above device, in one possible implementation, the first position vector module is further configured to acquire position information of the photographed target in a geographic coordinate system from a GNSS beacon of the photographed target; acquire position information of the aircraft in the geographic coordinate system from a GNSS receiver and antenna of the aircraft; and establish the reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system to obtain the first position vector.
For the above device, in one possible implementation, the first position vector module is further configured to establish the reference coordinate system using Equations 1 and 2 below, according to position information P_t(lon_t, lat_t, hei_t) of the photographed target in the geographic coordinate system and position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, to obtain the first position vector A_1(x_1, y_1, hei_u - hei_t),
x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
y_1 = (lat_u - lat_t) * lat0    (Equation 2)
z_1 = hei_u - hei_t    (Equation 3)
where lon0 = 2*π*a/360, lat0 = (2*π*c + 4*(a-c))/360, π is the circle constant pi, a is the semi-major axis radius of the earth, c is the semi-minor axis radius of the earth, lon denotes longitude, lat denotes latitude, hei denotes height, t denotes the photographed target, u denotes the aircraft, cos() is the cosine function, and x_1, y_1, z_1 are coordinate values in the reference coordinate system.
For the above device, in one possible implementation, the direction vector module is further configured to obtain the current attitude information of the aircraft from an attitude sensor, and calculate, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system.
For the above device, in one possible implementation, the direction vector module is further configured to calculate, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, the direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system,
dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
where Pitch_u denotes the current pitch angle of the aircraft u, Roll_u denotes the current roll angle of the aircraft u, Yaw_u denotes the current yaw angle of the aircraft u, and sin() denotes the sine function;
dA(dx, dy, dz) denotes the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
For the above device, in one possible implementation, the second position vector module is further configured to calculate, according to the first position vector A_1(x_1, y_1, hei_u - hei_t) and the direction vector dA'(dx', dy', dz'), the second position vector A_2 = (x_2, y_2, z_2) = (A_1 + dA') from the photographed target to the pan-tilt motion center, and convert the second position vector into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system using Equation 6 below:
Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))    (Equation 6)
where arcsin() is the arcsine function.
For the above device, in one possible implementation, the control amount module is further configured to obtain the target control amount Output of a three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information,
Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7).
The present invention further provides a target tracking system for an unmanned aerial vehicle, comprising a pan-tilt controller provided on the unmanned aerial vehicle and a positioning device provided on a photographed target, wherein the pan-tilt controller employs a target tracking device of any one of the structures in the embodiments of the present invention.
For the above system, in one possible implementation, the positioning device of the photographed target is a GNSS beacon, and the GNSS beacon is used to acquire position information of the photographed target;
the target tracking system further includes:
an onboard GNSS receiver and antenna, which obtain position information of the unmanned aerial vehicle by receiving GNSS signals;
an attitude sensor, used to detect attitude information of the unmanned aerial vehicle;
a pan-tilt and a shooting device, the pan-tilt being used to control the shooting attitude of the shooting device mounted on the pan-tilt;
the pan-tilt controller is in communication with the GNSS receiver and antenna, the GNSS beacon and the attitude sensor, and is used to receive the position information of the unmanned aerial vehicle from the GNSS receiver and antenna, receive the position information of the photographed target from the GNSS beacon, and receive the attitude information of the unmanned aerial vehicle from the attitude sensor;
the pan-tilt controller is further connected to the pan-tilt and is used to send a target control amount to the pan-tilt so as to control the shooting attitude of the shooting device on the pan-tilt.
Beneficial effects
The present invention calculates the relative spatial pointing relationship from the acquired position information and, combined with the aircraft attitude obtained by the drone's onboard sensors, determines the pan-tilt output for the shooting device so as to track the photographed target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high and all-weather autonomous tracking can be achieved.
In addition, this embodiment uses high-precision GNSS positioning results to obtain accurate position information of the photographed target and the drone, which ensures that the accuracy of the calculation is further improved.
Other features and aspects of the present invention will become clear from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features and aspects of the present invention together with the specification and serve to explain the principles of the present invention.
FIG. 1 is a schematic structural diagram of a target tracking system for an unmanned aerial vehicle according to an embodiment of the present invention;
FIG. 2a is a schematic flowchart of a target tracking method according to an embodiment of the present invention;
FIG. 2b is a schematic diagram of the principle of a target tracking method according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of a target tracking method according to another embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a target tracking device according to an embodiment of the present invention;
FIG. 5 is a structural block diagram of a target tracking apparatus according to an embodiment of the present invention.
Detailed description of the embodiments
Various exemplary embodiments, features and aspects of the present invention are described in detail below with reference to the accompanying drawings. The same reference numerals in the drawings denote elements with the same or similar functions. Although various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless otherwise indicated.
The word "exemplary" as used herein means "serving as an example, embodiment or illustration". Any embodiment described herein as "exemplary" is not necessarily to be construed as superior to or better than other embodiments.
In addition, numerous specific details are given in the following detailed description in order to better explain the present invention. Those skilled in the art will understand that the present invention can likewise be practiced without certain specific details. In some instances, methods, means, elements and circuits well known to those skilled in the art are not described in detail, so as to highlight the gist of the present invention.
Embodiment 1
In order to improve positioning accuracy, GNSS (Global Navigation Satellite System) technology may be used in the target tracking system to acquire position information. Preferably, PPP (Precise Point Positioning) technology may be used, with which the acquired position information can reach decimetre-level or even centimetre-level accuracy. Being based on GNSS technology, the present invention is advantageous for achieving precise positioning.
FIG. 1 shows a schematic structural diagram of a target tracking system for an unmanned aerial vehicle according to an embodiment of the present invention. As shown in FIG. 1, the target tracking system of the unmanned aerial vehicle mainly includes a pan-tilt controller 114 provided on the unmanned aerial vehicle 11 (drone for short) and a positioning device provided on the photographed target 13. For the specific functions of the pan-tilt controller 114, reference may be made to the description of the target tracking methods in Embodiments 2 and 3. The pan-tilt controller 114 may be provided independently of the controller of the drone, or its functions may be implemented by the controller of the drone; the present invention does not limit this.
Specifically, the drone target tracking system of this embodiment includes: a GNSS beacon 131, communication devices 133 and 113, an onboard GNSS receiver and antenna 111, a pan-tilt 112, a pan-tilt controller 114, an attitude sensor 115 and a shooting device 116.
The GNSS beacon 131 is mounted on the photographed target 13 and can use GNSS to obtain the precise position of the target. The onboard GNSS receiver and antenna 111, the pan-tilt 112, the pan-tilt controller 114, the attitude sensor 115 and the shooting device 116 are provided on the drone-based target tracking device 11. The GNSS beacon 131 communicates with the communication device 113 of the target tracking device 11 through the communication device 133, thereby providing the pan-tilt controller 114 with real-time, accurate position information of the photographed target 13. The communication devices 113 and 133 can be regarded as the two ends of one communication link, used to implement communication between the GNSS beacon 131 and the pan-tilt controller 114, communicating over the air by means of, for example, a 433 MHz radio signal. Specifically, the functions of the components provided on the drone are as follows:
The onboard GNSS receiver and antenna 111 is connected to the pan-tilt controller 114 and obtains the precise position of the drone by receiving GNSS signals; this precise position may in practice correspond to the spatial position of the phase center of the receiving antenna.
The pan-tilt 112 preferably consists of motors or servos with several degrees of freedom and can drive the shooting device mounted on it to swing with multiple degrees of freedom.
The pan-tilt controller 114 preferably consists of a microcontroller, its peripheral circuits and a pan-tilt motor drive circuit; it receives the information transmitted from each part, performs calculation and processing, and then controls the pan-tilt to swing.
The attitude sensor 115 preferably consists of MEMS sensors and provides the pan-tilt controller 114 with attitude information of the drone in the three directions of Pitch (pitch angle), Roll (roll angle) and Yaw (yaw angle).
The shooting device 116 preferably consists of a camera or video camera. The attitude sensor 115 is connected to the pan-tilt controller 114 and is used to detect and provide the current attitude information of the drone. The shooting device 116, for example a camera, is fixed on the pan-tilt 112; the pan-tilt controller 114 controls the swing of the pan-tilt 112 so as to control the shooting attitude of the shooting device and thus track the photographed target.
Embodiment 2
FIG. 2a shows a schematic flowchart of a target tracking method according to an embodiment of the present invention. As shown in FIG. 2a, the target tracking method may specifically include the following steps:
Step 201: establishing a reference coordinate system according to position information of a photographed target and an aircraft, and obtaining a first position vector from the photographed target to the aircraft in the reference coordinate system;
Step 202: calculating, according to current attitude information of the aircraft, a direction vector from the antenna position of the aircraft to the pan-tilt position in the reference coordinate system;
Step 203: calculating, according to the first position vector and the direction vector, a second position vector from the photographed target to the pan-tilt position in the reference coordinate system, and converting the second position vector into target attitude information of the aircraft;
Step 204: obtaining a target control amount according to the current attitude information and the target attitude information of the aircraft, the target control amount being used to indicate an adjustment amount for pan-tilt motion control.
Specifically, as shown in FIG. 2b, an onboard GNSS receiver and antenna 43 is provided above the body of the drone 41, and a pan-tilt 45 and a shooting device (not shown) are provided below the body of the drone 41; the shooting device is usually mounted on the pan-tilt 45. The first position vector from the photographed target 47 to the antenna phase center (for example, the geometric center of the antenna) is A_1, the direction vector from the antenna phase center to the pan-tilt motion center (for example, the rotation center of the pan-tilt) is dA', and the second position vector from the photographed target 47 to the pan-tilt motion center is A_2. The second position vector, obtained by correcting the first position vector with the direction vector, is the vector representation between the actual pan-tilt motion center and the photographed target, and it eliminates the positional offset that actually exists, as installed, between the GNSS antenna and the pan-tilt motion center.
In one possible implementation, step 201 may include:
Step 2011: acquiring position information of the photographed target in a geographic coordinate system from a GNSS beacon of the photographed target;
Step 2012: acquiring position information of the aircraft in the geographic coordinate system from a GNSS receiver and antenna of the aircraft;
Step 2013: establishing a reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system, to obtain the first position vector.
In one possible implementation, step 2013 may include: establishing a reference coordinate system, for example a Cartesian coordinate system, using Equations 1 and 2 below, according to position information P_t(lon_t, lat_t, hei_t) of the photographed target in the geographic coordinate system and position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, and obtaining in the reference coordinate system the first position vector A_1(x_1, y_1, hei_u - hei_t) from the photographed target to the aircraft,
x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
y_1 = (lat_u - lat_t) * lat0    (Equation 2)
z_1 = hei_u - hei_t    (Equation 3)
where lon0 = 2*π*a/360, lat0 = (2*π*c + 4*(a-c))/360, π is the circle constant pi, a is the semi-major axis radius of the earth, c is the semi-minor axis radius of the earth, lon denotes longitude, lat denotes latitude, hei denotes height, t denotes the photographed target, u denotes the aircraft, cos() is the cosine function, and x_1, y_1, z_1 are coordinate values in the reference coordinate system.
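For illustration, the following is a minimal Python sketch of Equations 1-3, assuming the WGS-84 values of a and c quoted in Embodiment 3; the function and variable names are illustrative and not taken from the patent.

```python
import math

# WGS-84 semi-major and semi-minor axis radii in metres, as quoted in Embodiment 3.
A_EARTH = 6378137.0
C_EARTH = 6356752.3

# lon0 / lat0 from the definitions above (metres per degree).
LON0 = 2 * math.pi * A_EARTH / 360
LAT0 = (2 * math.pi * C_EARTH + 4 * (A_EARTH - C_EARTH)) / 360


def first_position_vector(p_t, p_u):
    """Equations 1-3: first position vector A1 from the photographed target t to the aircraft u.

    p_t and p_u are (lon_deg, lat_deg, hei_m) tuples in the geographic coordinate system.
    The patent writes cos(lat_t) with latitude in degrees; here the angle is converted to
    radians before calling the cosine.
    """
    lon_t, lat_t, hei_t = p_t
    lon_u, lat_u, hei_u = p_u
    x1 = (lon_u - lon_t) * math.cos(math.radians(lat_t)) * LON0  # Equation 1
    y1 = (lat_u - lat_t) * LAT0                                  # Equation 2
    z1 = hei_u - hei_t                                           # Equation 3
    return (x1, y1, z1)
```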
In one possible implementation, step 202 may include:
Step 2021: obtaining the current attitude information of the aircraft from an attitude sensor;
Step 2022: calculating, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system.
In one possible implementation, step 2022 may include: calculating, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, the direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system,
dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
where Pitch_u denotes the current pitch angle of the aircraft u, Roll_u denotes the current roll angle of the aircraft u, Yaw_u denotes the current yaw angle of the aircraft u, and sin() denotes the sine function; dA(dx, dy, dz) denotes the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
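A minimal Python sketch of Equations 8, 4 and 5 follows, assuming the attitude angles are supplied in radians and reading Equation 5 with balanced parentheses; the helper name is illustrative, not from the patent.

```python
import math


def antenna_to_pan_tilt_in_reference_frame(atti_u, dA):
    """Equations 8, 4 and 5: express the body-frame offset dA = (dx, dy, dz) from the
    antenna phase center to the pan-tilt motion center in the reference coordinate system,
    given the current attitude Atti_u = (Pitch_u, Roll_u, Yaw_u) in radians."""
    pitch, roll, yaw = atti_u
    dx, dy, dz = dA
    sp, cp = math.sin(pitch), math.cos(pitch)
    sr, cr = math.sin(roll), math.cos(roll)
    sy, cy = math.sin(yaw), math.cos(yaw)
    dx_p = dx * (cr * cy - sp * sr * sy) - dy * (cp * sy) + dz * (sr * cy + sp * cr * sy)  # Equation 8
    dy_p = dx * (cr * sy - sp * sr * cy) - dy * (cp * cy) + dz * (sr * sy + sp * cr * cy)  # Equation 4
    dz_p = dx * (-cp * sr) - dy * sp + dz * (cp * cr)                                      # Equation 5
    return (dx_p, dy_p, dz_p)
```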
In one possible implementation, step 203 may include:
Step 2031: calculating, according to the first position vector A_1(x_1, y_1, hei_u - hei_t) and the direction vector dA'(dx', dy', dz'), the second position vector A_2 = (x_2, y_2, z_2) = (A_1 + dA') from the photographed target to the pan-tilt motion center;
Step 2032: converting the second position vector into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system using Equation 6 below:
Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))    (Equation 6)
where arcsin() is the arcsine function.
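The following Python sketch combines steps 2031 and 2032 under the same assumptions as the helpers above; it simply adds the two vectors and applies the arcsine conversion of Equation 6.

```python
import math


def target_attitude(a1, dA_ref):
    """Steps 2031/2032: second position vector A2 = A1 + dA' and the Equation 6 conversion
    into Atti_c = (Pitch_c, Roll_c, Yaw_c) via arcsin of the components normalised by |A2|."""
    x2, y2, z2 = [a + d for a, d in zip(a1, dA_ref)]
    norm_a2 = math.sqrt(x2 * x2 + y2 * y2 + z2 * z2)  # |A2|
    atti_c = (math.asin(x2 / norm_a2),   # Pitch_c
              math.asin(y2 / norm_a2),   # Roll_c
              math.asin(z2 / norm_a2))   # Yaw_c
    return (x2, y2, z2), atti_c
```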
In one possible implementation, step 204 may include: obtaining the target control amount Output of a three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information,
Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7).
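Equation 7 is a componentwise difference; a one-line Python sketch is shown below with an illustrative function name.

```python
def pan_tilt_control_amount(atti_c, atti_u):
    """Equation 7: Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)."""
    return tuple(c - u for c, u in zip(atti_c, atti_u))
```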
The target tracking method of this embodiment calculates the relative spatial pointing relationship from the acquired position information and, combined with the aircraft attitude obtained by the drone's onboard sensors, determines the camera pan-tilt output so as to track the photographed target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high and all-weather autonomous tracking can be achieved.
In addition, this embodiment uses high-precision GNSS positioning results to obtain accurate position information of the photographed target and the drone, obtains the accurate relative positional relationship between the target and the optical axis of the camera through a simple calculation, and tightly couples it with the sensor information of the drone to control the camera pan-tilt, which ensures that the calculation accuracy is further improved and tracking of the target is achieved.
Embodiment 3
FIG. 3 shows a schematic flowchart of a target tracking method according to another embodiment of the present invention. As shown in FIG. 3, this embodiment is described by taking a shooting and tracking method implemented with GNSS technology as an example; formulas in this embodiment that are the same as in Embodiment 2 have the same meanings and are not repeated here. The target tracking method may specifically include the following steps:
Step 301: The GNSS beacon receives GNSS signals for precise positioning and obtains the position P_t(lon_t, lat_t, hei_t) of the photographed target (usually lat_t and lon_t are in degrees and hei_t is in metres).
Step 302: The communication device encodes the position of the photographed target and modulates it onto a 433 MHz radio signal for transmission.
Step 303: The pan-tilt controller receives the position P_t of the photographed target transmitted from the communication device, and at the same time obtains the drone position P_u(lon_u, lat_u, hei_u) transmitted from the onboard GNSS receiver and antenna; P_t and P_u are coordinate values in the geographic coordinate system. Then a right-handed coordinate system is established with P_t as the origin and due east as the x-axis (at this point the photographed target and the drone's movement area can be assumed to be a plane rather than an ellipsoid); this coordinate system may also be called a local Cartesian coordinate system.
Referring to Equations 1 and 2 above, in the WGS-84 coordinate system a = 6378137.0 and c = 6356752.3, which gives lat0 = 111183.865 and lon0 = 111319.491, so the position vector of the drone is A_1(x_1, y_1, hei_u - hei_t), where:
x_1 = (lon_u - lon_t) * cos(lat_t) * 111319.491    (Equation 1-1)
y_1 = (lat_u - lat_t) * 111183.865    (Equation 2-1)
The parameters in the above equations may also use other standards similar to the WGS-84 standard, such as Xi'an 54, Beijing 84 or CGCS2000; the values calculated with different standards may differ slightly, and the present invention does not limit the specific calculation standard.
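A short Python check of the lat0 and lon0 values quoted above for WGS-84 (the rounding to three decimals is only for display):

```python
import math

a, c = 6378137.0, 6356752.3                   # WGS-84 semi-major / semi-minor radii (m)
lon0 = 2 * math.pi * a / 360                  # metres per degree of longitude
lat0 = (2 * math.pi * c + 4 * (a - c)) / 360  # metres per degree of latitude
print(round(lon0, 3), round(lat0, 3))         # 111319.491 111183.865
```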
Step 304: The pan-tilt controller obtains the aircraft attitude Atti_u(Pitch_u, Roll_u, Yaw_u) measured by the attitude sensor. The direction vector from the antenna phase center of the onboard GNSS receiver and antenna module to the pan-tilt motion center can be expressed in the aircraft coordinate system as dA(dx, dy, dz) (determined by the installation positions on the aircraft); this direction vector can then be expressed in the coordinate system established in step 303 as dA'(dx', dy', dz'), where dA'(dx', dy', dz') can be calculated with reference to Equations 8, 4 and 5 of Embodiment 2.
Step 305: From A_1 obtained in step 303 and dA' obtained in step 304, the direction vector from the photographed target to the pan-tilt motion center can be calculated, that is, the position vector of the pan-tilt motion center is A_2 = (x_2, y_2, z_2) = (A_1 + dA'); the target attitude information represented by this direction vector in the body coordinate system can further be calculated as Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|)).
Step 306: From Atti_c obtained in step 305 and Atti_u in step 304, the three-axis pan-tilt target control amount Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u) can be obtained.
The pan-tilt controller outputs this target control amount to the pan-tilt motors through, for example, a second-order control loop, so as to track the target.
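The patent does not specify the structure of the second-order control loop. Purely as an illustration, one possible per-axis proportional-derivative loop could look like the following sketch; the class name, the gains kp and kd, and the discrete-time form are assumptions, not values from the patent.

```python
class SecondOrderAxisLoop:
    """Illustrative proportional-derivative loop for a single pan-tilt axis."""

    def __init__(self, kp=2.0, kd=0.5):
        self.kp = kp
        self.kd = kd
        self.prev_error = 0.0

    def step(self, error, dt):
        """error: one component of Output (target angle minus current angle); dt: control period (s)."""
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.kd * derivative  # command sent to the axis motor
```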
The target tracking method of this embodiment can use high-precision GNSS positioning technology, for example PPP technology, to acquire the precise positions of the photographed target and the drone, then calculate the relative spatial pointing relationship and, combined with the aircraft attitude obtained by the drone's onboard sensors, determine the camera pan-tilt output so as to track the photographed target; it has high tracking accuracy and enables all-weather autonomous tracking. Compared with existing visual tracking technology, it has the characteristics of simple calculation, high real-time performance and a wide range of application environments; compared with existing GPS-based target tracking methods, it has the characteristics of high tracking precision and stable tracking.
Embodiment 4
FIG. 4 shows a schematic structural diagram of a target tracking device according to an embodiment of the present invention. As shown in FIG. 4, the target tracking device may include:
a first position vector module 51, configured to establish a reference coordinate system according to position information of a photographed target and an aircraft, and obtain a first position vector from the photographed target to the aircraft in the reference coordinate system;
a direction vector module 53, configured to calculate, according to current attitude information of the aircraft, a direction vector from an antenna position of the aircraft to a pan-tilt position in the reference coordinate system;
a second position vector module 55, configured to calculate, in the reference coordinate system and according to the first position vector and the direction vector, a second position vector from the photographed target to the pan-tilt position, and convert the second position vector into target attitude information of the aircraft;
a control amount module 57, configured to obtain a target control amount according to the current attitude information and the target attitude information of the aircraft, the target control amount being used to indicate an adjustment amount for pan-tilt motion control.
In one possible implementation, the first position vector module 51 is further configured to acquire position information of the photographed target in a geographic coordinate system from a GNSS beacon of the photographed target; acquire position information of the aircraft in the geographic coordinate system from a GNSS receiver and antenna of the aircraft; and establish the reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system to obtain the first position vector.
In one possible implementation, the first position vector module 51 is further configured to establish the reference coordinate system using Equations 1 and 2 below, according to position information P_t(lon_t, lat_t, hei_t) of the photographed target in the geographic coordinate system and position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, to obtain the first position vector A_1(x_1, y_1, hei_u - hei_t),
x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
y_1 = (lat_u - lat_t) * lat0    (Equation 2)
z_1 = hei_u - hei_t    (Equation 3)
where lon0 = 2*π*a/360, lat0 = (2*π*c + 4*(a-c))/360, π is the circle constant pi, a is the semi-major axis radius of the earth, c is the semi-minor axis radius of the earth, lon denotes longitude, lat denotes latitude, hei denotes height, t denotes the photographed target, u denotes the aircraft, cos() is the cosine function, and x_1, y_1, z_1 are coordinate values in the reference coordinate system. For a specific example in the WGS-84 coordinate system, see Equations 1-1 and 2-1 above. Of course, other standards may also be used for the calculation, and the present invention does not limit the specific calculation standard.
In one possible implementation, the direction vector module 53 is further configured to obtain the current attitude information of the aircraft from an attitude sensor, and calculate, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system.
In one possible implementation, the direction vector module 53 is further configured to calculate, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, the direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system,
dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
where Pitch_u denotes the current pitch angle of the aircraft u, Roll_u denotes the current roll angle of the aircraft u, Yaw_u denotes the current yaw angle of the aircraft u, and sin() denotes the sine function;
dA(dx, dy, dz) denotes the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
In one possible implementation, the second position vector module 55 is further configured to calculate, according to the first position vector A_1(x_1, y_1, hei_u - hei_t) and the direction vector dA'(dx', dy', dz'), the second position vector A_2 = (x_2, y_2, z_2) = (A_1 + dA') from the photographed target to the pan-tilt motion center, and convert the second position vector into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system using Equation 6 below:
Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))    (Equation 6).
In one possible implementation, the control amount module 57 is further configured to obtain the target control amount Output of the three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information,
Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7).
Each module of the target tracking device of this embodiment can be implemented by the pan-tilt controller of the drone. The relative spatial pointing relationship is calculated from the acquired position information and, combined with the aircraft attitude obtained by the drone's onboard sensors, the camera pan-tilt output is determined so as to track the photographed target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high and all-weather autonomous tracking can be achieved.
Using high-precision GNSS positioning results to obtain accurate position information of the photographed target and the drone ensures that the calculation accuracy is further improved.
Embodiment 5
FIG. 5 shows a structural block diagram of a target tracking apparatus according to an embodiment of the present invention. The target tracking apparatus 1100 may be a host server with computing capability, a personal computer (PC), or a portable computer or terminal that can be carried. The specific embodiments of the present invention do not limit the specific implementation of the computing node.
The target tracking apparatus 1100 includes a processor 1110, a communications interface 1120, a memory 1130 and a bus 1140. The processor 1110, the communications interface 1120 and the memory 1130 communicate with one another through the bus 1140.
The communications interface 1120 is used to communicate with network devices, where the network devices include, for example, a virtual machine management center and shared storage.
The processor 1110 is used to execute a program. The processor 1110 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
The memory 1130 is used to store files. The memory 1130 may include high-speed RAM memory and may also include non-volatile memory, for example at least one disk memory. The memory 1130 may also be a memory array. The memory 1130 may also be partitioned into blocks, and the blocks may be combined into a virtual volume according to certain rules.
In one possible implementation, the above program may be program code including computer operating instructions. The program may specifically be used to implement the operations of the steps in Embodiment 2 or 3.
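As a rough illustration of such a program, the sketch below strings together the illustrative helpers defined in the code examples of Embodiment 2 (first_position_vector, antenna_to_pan_tilt_in_reference_frame, target_attitude, pan_tilt_control_amount); it assumes those definitions are in scope and is not the patent's implementation.

```python
def tracking_step(p_t, p_u, atti_u, dA_body):
    """One pass of the Embodiment 2/3 pipeline.

    p_t, p_u: target / drone positions (lon_deg, lat_deg, hei_m) from the GNSS beacon and
    the onboard GNSS receiver; atti_u: current (Pitch_u, Roll_u, Yaw_u) in radians from
    the attitude sensor; dA_body: antenna-to-pan-tilt offset in the body frame.
    """
    a1 = first_position_vector(p_t, p_u)                              # Equations 1-3
    dA_ref = antenna_to_pan_tilt_in_reference_frame(atti_u, dA_body)  # Equations 8, 4, 5
    _, atti_c = target_attitude(a1, dA_ref)                           # Equation 6
    return pan_tilt_control_amount(atti_c, atti_u)                    # Equation 7
```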
Those of ordinary skill in the art will appreciate that the exemplary units and algorithm steps of the embodiments described herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are implemented in hardware or in software depends on the specific application of the technical solution and on the design constraints. A person skilled in the art may choose different methods to implement the described functions for each particular application, but such implementation should not be considered as going beyond the scope of the present invention.
If the functions are implemented in the form of computer software and sold or used as an independent product, then to a certain extent all or part of the technical solution of the present invention (for example, the part that contributes to the prior art) can be considered to be embodied in the form of a computer software product. This computer software product is usually stored in a computer-readable non-volatile storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above are only specific embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any person skilled in the art can easily conceive of changes or substitutions within the technical scope disclosed by the present invention, and such changes or substitutions should all be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention should be subject to the scope of protection of the claims.
Industrial applicability
The present invention calculates the relative spatial pointing relationship from the acquired position information and, combined with the aircraft attitude obtained by the drone's onboard sensors, determines the pan-tilt output of the shooting device so as to track the photographed target. Because the influence of the fuselage height on the tracking algorithm is taken into account, the tracking accuracy is high and all-weather autonomous tracking can be achieved.
In addition, this embodiment uses high-precision GNSS positioning results to obtain accurate position information of the photographed target and the drone, which ensures that the calculation accuracy is further improved.

Claims (16)

  1. A target tracking method, characterized by comprising:
    establishing a reference coordinate system according to position information of a photographed target and an aircraft, and obtaining a first position vector from the photographed target to the aircraft in the reference coordinate system;
    calculating, according to current attitude information of the aircraft, a direction vector from an antenna position of the aircraft to a pan-tilt position in the reference coordinate system;
    calculating, according to the first position vector and the direction vector, a second position vector from the photographed target to the pan-tilt position in the reference coordinate system, and converting the second position vector into target attitude information of the aircraft;
    obtaining a target control amount according to the current attitude information and the target attitude information of the aircraft, the target control amount being used to indicate an adjustment amount for pan-tilt motion control.
  2. The method according to claim 1, characterized in that establishing a reference coordinate system according to position information of the photographed target and the aircraft, and obtaining a first position vector from the photographed target to the aircraft in the reference coordinate system, comprises:
    acquiring position information of the photographed target in a geographic coordinate system from a GNSS beacon of the photographed target;
    acquiring position information of the aircraft in the geographic coordinate system from a GNSS receiver and antenna of the aircraft;
    establishing the reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system to obtain the first position vector.
  3. The method according to claim 2, characterized in that establishing the reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system to obtain the first position vector comprises:
    establishing the reference coordinate system using Equations 1 and 2 below, according to position information P_t(lon_t, lat_t, hei_t) of the photographed target in the geographic coordinate system and position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, to obtain the first position vector A_1(x_1, y_1, z_1),
    x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
    y_1 = (lat_u - lat_t) * lat0    (Equation 2)
    z_1 = hei_u - hei_t    (Equation 3)
    where lon0 = 2*π*a/360, lat0 = (2*π*c + 4*(a-c))/360, π is the circle constant pi, a is the semi-major axis radius of the earth, c is the semi-minor axis radius of the earth, lon denotes longitude, lat denotes latitude, hei denotes height, t denotes the photographed target, u denotes the aircraft, cos() is the cosine function, and x_1, y_1, z_1 are coordinate values in the reference coordinate system.
  4. The method according to any one of claims 1 to 3, characterized in that calculating, according to the current attitude information of the aircraft, a direction vector from the antenna position of the aircraft to the pan-tilt position in the reference coordinate system comprises:
    obtaining the current attitude information of the aircraft from an attitude sensor;
    calculating, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system.
  5. The method according to claim 4, characterized in that calculating, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system comprises:
    calculating, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, the direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system,
    dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
    dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
    dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
    where Pitch_u denotes the current pitch angle of the aircraft u, Roll_u denotes the current roll angle of the aircraft u, Yaw_u denotes the current yaw angle of the aircraft u, and sin() denotes the sine function;
    dA(dx, dy, dz) denotes the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
  6. The method according to claim 5, characterized in that calculating, according to the first position vector and the direction vector, the second position vector from the photographed target to the pan-tilt position in the reference coordinate system, and converting the second position vector into the target attitude information of the aircraft, comprises:
    calculating, according to the first position vector A_1(x_1, y_1, hei_u - hei_t) and the direction vector dA'(dx', dy', dz'), the second position vector A_2 = (x_2, y_2, z_2) = (A_1 + dA') from the photographed target to the pan-tilt motion center;
    converting the second position vector into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system using Equation 6 below:
    Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))    (Equation 6)
    where arcsin() is the arcsine function.
  7. The method according to claim 6, characterized in that obtaining the target control amount according to the current attitude information and the target attitude information of the aircraft comprises:
    obtaining the target control amount Output of a three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information,
    Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7).
  8. A target tracking device, characterized by comprising:
    a first position vector module, configured to establish a reference coordinate system according to position information of a photographed target and an aircraft, and obtain a first position vector from the photographed target to the aircraft in the reference coordinate system;
    a direction vector module, configured to calculate, according to current attitude information of the aircraft, a direction vector from an antenna position of the aircraft to a pan-tilt position in the reference coordinate system;
    a second position vector module, configured to calculate, in the reference coordinate system and according to the first position vector and the direction vector, a second position vector from the photographed target to the pan-tilt position, and convert the second position vector into target attitude information of the aircraft;
    a control amount module, configured to obtain a target control amount according to the current attitude information and the target attitude information of the aircraft, the target control amount being used to indicate an adjustment amount for pan-tilt motion control.
  9. The device according to claim 8, characterized in that the first position vector module is further configured to acquire position information of the photographed target in a geographic coordinate system from a GNSS beacon of the photographed target; acquire position information of the aircraft in the geographic coordinate system from a GNSS receiver and antenna of the aircraft; and establish the reference coordinate system according to the position information of the photographed target and the aircraft in the geographic coordinate system to obtain the first position vector.
  10. The device according to claim 9, characterized in that the first position vector module is further configured to establish the reference coordinate system using Equations 1 and 2 below, according to position information P_t(lon_t, lat_t, hei_t) of the photographed target in the geographic coordinate system and position information P_u(lon_u, lat_u, hei_u) of the aircraft in the geographic coordinate system, to obtain the first position vector A_1(x_1, y_1, hei_u - hei_t),
    x_1 = (lon_u - lon_t) * cos(lat_t) * lon0    (Equation 1)
    y_1 = (lat_u - lat_t) * lat0    (Equation 2)
    z_1 = hei_u - hei_t    (Equation 3)
    where lon0 = 2*π*a/360, lat0 = (2*π*c + 4*(a-c))/360, π is the circle constant pi, a is the semi-major axis radius of the earth, c is the semi-minor axis radius of the earth, lon denotes longitude, lat denotes latitude, hei denotes height, t denotes the photographed target, u denotes the aircraft, cos() is the cosine function, and x_1, y_1, z_1 are coordinate values in the reference coordinate system.
  11. The device according to any one of claims 8 to 10, characterized in that the direction vector module is further configured to obtain the current attitude information of the aircraft from an attitude sensor, and calculate, according to the current attitude information of the aircraft, a direction vector from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system.
  12. The device according to claim 11, characterized in that the direction vector module is further configured to calculate, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and using Equations 8, 4 and 5 below, the direction vector dA'(dx', dy', dz') from the antenna phase center of the aircraft to the pan-tilt motion center in the reference coordinate system,
    dx' = dx(cos(Roll_u)cos(Yaw_u) - sin(Pitch_u)sin(Roll_u)sin(Yaw_u)) - dy(cos(Pitch_u)sin(Yaw_u)) + dz(sin(Roll_u)cos(Yaw_u) + sin(Pitch_u)cos(Roll_u)sin(Yaw_u))    (Equation 8)
    dy' = dx(cos(Roll_u)sin(Yaw_u) - sin(Pitch_u)sin(Roll_u)cos(Yaw_u)) - dy(cos(Pitch_u)cos(Yaw_u)) + dz(sin(Roll_u)sin(Yaw_u) + sin(Pitch_u)cos(Roll_u)cos(Yaw_u))    (Equation 4)
    dz' = dx(-cos(Pitch_u)sin(Roll_u)) - dy(sin(Pitch_u)) + dz(cos(Pitch_u)cos(Roll_u))    (Equation 5)
    where Pitch_u denotes the current pitch angle of the aircraft u, Roll_u denotes the current roll angle of the aircraft u, Yaw_u denotes the current yaw angle of the aircraft u, and sin() denotes the sine function;
    dA(dx, dy, dz) denotes the direction vector from the antenna phase center to the pan-tilt motion center expressed in the body coordinate system of the aircraft.
  13. The device according to claim 12, characterized in that the second position vector module is further configured to calculate, according to the first position vector A_1(x_1, y_1, hei_u - hei_t) and the direction vector dA'(dx', dy', dz'), the second position vector A_2 = (x_2, y_2, z_2) = (A_1 + dA') from the photographed target to the pan-tilt motion center, and convert the second position vector into the target attitude information Atti_c(Pitch_c, Roll_c, Yaw_c) of the aircraft in the body coordinate system using Equation 6 below:
    Atti_c(Pitch_c, Roll_c, Yaw_c) = (arcsin(x_2/|A_2|), arcsin(y_2/|A_2|), arcsin(z_2/|A_2|))    (Equation 6)
    where arcsin() is the arcsine function.
  14. The device according to claim 13, characterized in that the control amount module is further configured to obtain the target control amount Output of a three-axis pan-tilt using Equation 7 below, according to the current attitude information Atti_u(Pitch_u, Roll_u, Yaw_u) of the aircraft and the target attitude information,
    Output = (Pitch_c - Pitch_u, Roll_c - Roll_u, Yaw_c - Yaw_u)    (Equation 7).
  15. A target tracking system for an unmanned aerial vehicle, characterized by comprising a pan-tilt controller provided on the unmanned aerial vehicle and a positioning device provided on a photographed target,
    wherein the pan-tilt controller employs the target tracking device according to any one of claims 8 to 14.
  16. The system according to claim 15, characterized in that the positioning device of the photographed target is a GNSS beacon, the GNSS beacon being used to acquire position information of the photographed target;
    the target tracking system further comprises:
    an onboard GNSS receiver and antenna, which obtain position information of the unmanned aerial vehicle by receiving GNSS signals;
    an attitude sensor, configured to detect attitude information of the unmanned aerial vehicle;
    a pan-tilt and a shooting device, the pan-tilt being used to control the shooting attitude of the shooting device mounted on the pan-tilt;
    the pan-tilt controller is in communication with the GNSS receiver and antenna, the GNSS beacon and the attitude sensor, and is configured to receive the position information of the unmanned aerial vehicle from the GNSS receiver and antenna, receive the position information of the photographed target from the GNSS beacon, and receive the attitude information of the unmanned aerial vehicle from the attitude sensor;
    the pan-tilt controller is further connected to the pan-tilt and is configured to send a target control amount to the pan-tilt so as to control the shooting attitude of the shooting device on the pan-tilt.
PCT/CN2016/086312 2016-04-12 2016-06-17 Target tracking method, device and system WO2017177542A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610225109.4 2016-04-12
CN201610225109.4A CN105676865B (zh) 2016-04-12 2016-04-12 Target tracking method, device and system

Publications (1)

Publication Number Publication Date
WO2017177542A1 true WO2017177542A1 (zh) 2017-10-19

Family

ID=56310229

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/086312 WO2017177542A1 (zh) 2016-04-12 2017-10-19 Target tracking method, device and system

Country Status (2)

Country Link
CN (1) CN105676865B (zh)
WO (1) WO2017177542A1 (zh)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105676865B (zh) * 2016-04-12 2018-11-16 北京博瑞云飞科技发展有限公司 目标跟踪方法、装置和系统
CN109564434B (zh) 2016-08-05 2023-07-25 深圳市大疆创新科技有限公司 用于定位可移动物体的系统和方法
CN106370184A (zh) * 2016-08-29 2017-02-01 北京奇虎科技有限公司 无人机自动跟踪拍摄的方法、无人机和移动终端设备
CN106454069B (zh) * 2016-08-31 2019-11-08 歌尔股份有限公司 一种控制无人机拍摄的方法、装置和可穿戴设备
WO2018053877A1 (zh) * 2016-09-26 2018-03-29 深圳市大疆创新科技有限公司 控制方法、控制设备和运载系统
KR20180080892A (ko) * 2017-01-05 2018-07-13 삼성전자주식회사 전자 장치 및 그 제어 방법
CN106878613B (zh) * 2017-01-13 2021-04-20 河北雄安远度科技有限公司 数据通信装置、方法及无人机
CN106814753B (zh) * 2017-03-20 2020-11-06 成都通甲优博科技有限责任公司 一种目标位置矫正方法、装置及系统
WO2018205133A1 (en) * 2017-05-09 2018-11-15 Zepp Labs, Inc. Direction finding of wireless communication devices
CN107908195B (zh) * 2017-11-06 2021-09-21 深圳市道通智能航空技术股份有限公司 目标追踪方法、装置、追踪器及计算机可读存储介质
CN107992068A (zh) * 2017-11-29 2018-05-04 天津聚飞创新科技有限公司 目标跟踪方法、装置及飞行器
CN109032166B (zh) * 2018-03-08 2020-01-21 深圳中琛源科技股份有限公司 基于无人机即时跟踪行驶车辆的方法
CN108573498B (zh) * 2018-03-08 2019-04-26 上海申雪供应链管理有限公司 基于无人机的行驶车辆即时跟踪系统
CN110262540A (zh) * 2018-03-12 2019-09-20 杭州海康机器人技术有限公司 对飞行器进行飞行控制的方法和装置
CN108645403B (zh) * 2018-05-15 2021-03-23 中国人民解放军海军工程大学 一种基于卫星导航及姿态测量的无人机自动跟拍系统
CN108650494B (zh) * 2018-05-29 2020-08-04 青岛一舍科技有限公司 基于语音控制的可即时获取高清照片的直播系统
CN110225249B (zh) * 2019-05-30 2021-04-06 深圳市道通智能航空技术有限公司 一种对焦方法、装置、航拍相机以及无人飞行器
CN110427050A (zh) * 2019-09-09 2019-11-08 深圳市科卫泰实业发展有限公司 一种基于差分定位定向的云台自动跟随系统
CN110716579B (zh) * 2019-11-20 2022-07-29 深圳市道通智能航空技术股份有限公司 目标跟踪方法及无人飞行器
CN114126964A (zh) * 2020-03-31 2022-03-01 深圳市大疆创新科技有限公司 可移动平台的控制方法、装置、可移动平台及存储介质
CN111510624A (zh) * 2020-04-10 2020-08-07 瞬联软件科技(北京)有限公司 目标跟踪系统及目标跟踪方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6650287B1 (en) * 1999-07-29 2003-11-18 Anatoly Stepanovich Karpov Method for determining the position of reference axes in an inertial navigation system of an object in respect with the basic coordinates and embodiments thereof
EP2144038A2 (en) * 2008-07-10 2010-01-13 Lockheed Martin Corporation Inertial measurement using an imaging sensor and a digitized map
CN103604427A (zh) * 2013-12-10 2014-02-26 中国航天空气动力技术研究院 对地面移动目标动态定位的无人机系统和方法
CN104820434A (zh) * 2015-03-24 2015-08-05 南京航空航天大学 一种无人机对地面运动目标的测速方法
CN105184776A (zh) * 2015-08-17 2015-12-23 中国测绘科学研究院 目标跟踪方法
CN105676865A (zh) * 2016-04-12 2016-06-15 北京博瑞爱飞科技发展有限公司 目标跟踪方法、装置和系统

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101344589A (zh) * 2008-08-29 2009-01-14 北京航空航天大学 基于gnss反射信号的空间飞行器探测装置
IT1401374B1 (it) * 2010-08-09 2013-07-18 Selex Sistemi Integrati Spa Tracciamento tridimensionale multisensore basato su tracce bidimensionali acquisite da tracciatori di sensori di localizzazione di bersagli
CN102033222B (zh) * 2010-11-17 2013-02-13 吉林大学 大范围多目标超声跟踪定位系统和方法
CN102522631B (zh) * 2011-12-12 2014-01-29 中国航空无线电电子研究所 基于扩频和数字导引的双制式天线跟踪系统
US20140218242A1 (en) * 2013-02-01 2014-08-07 NanoSatisfi Inc. Computerized nano-satellite platform for large ocean vessel tracking

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110637268A (zh) * 2018-01-23 2019-12-31 深圳市大疆创新科技有限公司 目标检测方法、装置和可移动平台
CN110622091A (zh) * 2018-03-28 2019-12-27 深圳市大疆创新科技有限公司 云台的控制方法、装置、系统、计算机存储介质及无人机
CN110337624A (zh) * 2018-05-31 2019-10-15 深圳市大疆创新科技有限公司 姿态转换方法、姿态显示方法及云台系统
CN108510833A (zh) * 2018-05-31 2018-09-07 中国人民解放军第四军医大学 模拟夜视镜下飞行的训练器材和提高夜间作训能力的方法
CN111596693B (zh) * 2020-06-17 2023-05-26 中国人民解放军国防科技大学 基于云台相机的无人机对地面目标跟踪控制方法及系统
CN111596693A (zh) * 2020-06-17 2020-08-28 中国人民解放军国防科技大学 基于云台相机的无人机对地面目标跟踪控制方法及系统
CN111798514A (zh) * 2020-06-29 2020-10-20 山东大学日照智能制造研究院 一种海洋牧场智能运动目标跟踪监测方法及系统
CN112486198A (zh) * 2020-12-11 2021-03-12 西安电子科技大学 一种具备自主性的模块化飞行阵列控制方法
CN112486198B (zh) * 2020-12-11 2022-03-04 西安电子科技大学 一种具备自主性的模块化飞行阵列控制方法
CN113506340A (zh) * 2021-06-15 2021-10-15 浙江大华技术股份有限公司 一种云台位姿预测的方法、设备和计算机可读存储介质
CN113658225A (zh) * 2021-08-19 2021-11-16 天之翼(苏州)科技有限公司 基于航拍监控的运动对象识别方法及系统
CN113938610A (zh) * 2021-11-16 2022-01-14 云南电网有限责任公司电力科学研究院 一种无人机监管方法及系统
CN114285459A (zh) * 2021-12-27 2022-04-05 北京微纳星空科技有限公司 一种卫星信号收发系统及其数据处理方法
CN114285459B (zh) * 2021-12-27 2024-01-19 北京微纳星空科技有限公司 一种卫星信号收发系统及其数据处理方法
CN114489102A (zh) * 2022-01-19 2022-05-13 上海复亚智能科技有限公司 一种电力杆塔自巡检方法、装置、无人机及存储介质

Also Published As

Publication number Publication date
CN105676865B (zh) 2018-11-16
CN105676865A (zh) 2016-06-15

Similar Documents

Publication Publication Date Title
WO2017177542A1 (zh) 目标跟踪方法、装置和系统
US11218689B2 (en) Methods and systems for selective sensor fusion
US20210293977A1 (en) Systems and methods for positioning of uav
US11822353B2 (en) Simple multi-sensor calibration
US20200191556A1 (en) Distance mesurement method by an unmanned aerial vehicle (uav) and uav
JP6138326B1 (ja) 移動体、移動体の制御方法、移動体を制御するプログラム、制御システム、及び情報処理装置
US20190385322A1 (en) Three-dimensional shape identification method, aerial vehicle, program and recording medium
EP3734394A1 (en) Sensor fusion using inertial and image sensors
CN111966133A (zh) 一种云台视觉伺服控制系统
US11875519B2 (en) Method and system for positioning using optical sensor and motion sensors
US20080118104A1 (en) High fidelity target identification and acquisition through image stabilization and image size regulation
WO2020103049A1 (zh) 旋转微波雷达的地形预测方法、装置、系统和无人机
CN107192377B (zh) 远程测量物体坐标的方法、装置及飞行器
WO2021087701A1 (zh) 起伏地面的地形预测方法、装置、雷达、无人机和作业控制方法
JPWO2018193574A1 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
CN107192330A (zh) 远程测量物体坐标的方法、装置及飞行器
JPWO2021090352A1 (ja) 航空機の飛行制御を行う制御装置、及び制御方法
WO2018094576A1 (zh) 无人飞行器的控制方法、飞行控制器及无人飞行器
CN113340272B (zh) 一种基于无人机微群的地面目标实时定位方法
US20210229810A1 (en) Information processing device, flight control method, and flight control system
WO2020042159A1 (zh) 一种云台的转动控制方法、装置及控制设备、移动平台
JP2019191888A (ja) 無人飛行体、無人飛行方法及び無人飛行プログラム
JP2019188965A (ja) 無人飛行体、学習結果情報、無人飛行方法及び無人飛行プログラム
CN114494423A (zh) 一种无人平台载荷非中心目标经纬度定位方法及系统

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16898357

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16898357

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08/08/2019)
