WO2018014420A1 - Light-emitting target recognition-based unmanned aerial vehicle tracking control system and method - Google Patents

Light-emitting target recognition-based unmanned aerial vehicle tracking control system and method Download PDF

Info

Publication number
WO2018014420A1
WO2018014420A1 PCT/CN2016/097249 CN2016097249W
Authority
WO
WIPO (PCT)
Prior art keywords
illuminating target
target
camera
illuminating
drone
Prior art date
Application number
PCT/CN2016/097249
Other languages
French (fr)
Chinese (zh)
Inventor
王军
Original Assignee
深圳曼塔智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳曼塔智能科技有限公司 filed Critical 深圳曼塔智能科技有限公司
Publication of WO2018014420A1 publication Critical patent/WO2018014420A1/en

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12Target-seeking control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to drone technology, and more particularly to a drone tracking control system and method based on illuminating target recognition.
  • The automatic tracking function has become a very popular feature in drones over the past two years, especially in entertainment aerial-photography drones. With this function the drone becomes more intelligent: it can automatically track a specified target and perform shooting, interaction, and similar functions, making it an indispensable intelligent feature of entertainment drones.
  • In the first existing method, the target object to be tracked must carry a communication module with GPS that can communicate with the drone, such as a GPS-equipped drone remote controller, a smart phone, or a smart bracelet with GPS. The communication module transmits the GPS information of the target's location to the drone, and the drone relies on this location information to implement the tracking function. The advantage is that the technology is mature and simple and the price is moderate.
  • The second method is target tracking based on binocular visual recognition. The core of this technology is to capture the tracked target with a dual camera on the drone and then use a complex algorithm to determine the distance between the target and the drone, as well as the amount and direction of the target's movement; the drone's control system obtains this information and adjusts the flight direction and flight speed to achieve the tracking function.
  • The advantage of this technology is that it is practical both indoors and outdoors: it does not require the tracked target to carry any equipment, and it can also be used for obstacle avoidance.
  • Both of the above existing technologies have their own advantages, but also many shortcomings. The disadvantage of the GPS tracking method is that it can only be used outdoors where the GPS signal is unobstructed, and it is susceptible to environmental interference. Limited by GPS positioning accuracy, its tracking precision is not high, with errors usually exceeding 1 meter. The shortcoming of tracking based on binocular recognition is that its computational load is very large, requiring a powerful processor with correspondingly high power consumption. The current technology is also not mature enough: it is easily disturbed by the environment and lighting, its tracking accuracy is not very high, the target is easily lost, and the cost is very high, making it unsuitable for low-cost entertainment aerial drones.
  • The object of the present invention is to provide a drone tracking control system and method based on illuminating target recognition, aiming to solve the problem that the tracking accuracy of current UAV tracking control systems is neither high nor stable.
  • The present invention provides a drone tracking control system based on illuminating target recognition, including an illuminating target and a drone, the drone including:
  • at least two cameras, the two cameras performing real-time imaging of the illuminating target;
  • an operation processing unit connected to the camera, identifying the illuminating target in the image captured by the camera, and calculating motion information of the illuminating target;
  • the flight control unit is connected to the operation processing unit, and controls the drone to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target.
  • the present invention also provides a drone tracking control method based on illuminating target recognition, the method is based on at least one illuminating target recognition tracking control, and the method comprises the following steps:
  • The above drone tracking control system and method based on illuminating target recognition uses two ordinary cameras to image the illuminating target in real time, calculates the distance and displacement of the illuminating target in the images, analyzes and judges the direction of motion, and outputs the motion amount and direction information to the flight control system, which performs the corresponding motion according to the amount of change, thereby implementing the tracking function. Since the position coordinates of the illuminating target can be calculated, its trajectory can be determined and compared with the preset trajectories in the flight control system to find the run command for the corresponding trajectory, achieving control of the drone's flight through the motion of the illuminating target: for example, the illuminating target moving vertically upward indicates that the aircraft should rise, the illuminating target moving vertically downward indicates that the aircraft should descend, and so on.
  • The invention solves the problems of instability, complicated systems, high cost, and low precision. In addition to more accurate tracking, it can also serve as a brand-new control method, so that the drone only needs one added device to realize a variety of intelligent functions, making it very suitable for popular aerial-photography entertainment drones.
  • FIG. 1 is a schematic module diagram of a drone tracking control system based on illuminating target recognition in accordance with a preferred embodiment of the present invention;
  • FIG. 2 is a schematic module diagram of the operation processing unit in the UAV tracking control system based on illuminating target recognition shown in FIG. 1;
  • FIG. 3 is a schematic diagram of the coordinate system of an illuminating target;
  • FIG. 4 is a schematic module diagram of the flight control unit in the UAV tracking control system based on illuminating target recognition shown in FIG. 1;
  • FIG. 5 is a flowchart of a drone tracking control method based on illuminating target recognition according to a preferred embodiment of the present invention;
  • FIG. 6 is a flowchart of a method for calculating motion information of an illuminating target in a preferred embodiment of the present invention;
  • FIG. 7 is a flowchart of a method for controlling a drone to track the trajectory of an illuminating target in accordance with a preferred embodiment of the present invention.
  • a drone tracking control system based on illuminating target recognition in a preferred embodiment of the present invention includes at least one illuminating target 10 and a drone 20.
  • The drone 20 includes a first camera 21, a second camera 22, an operation processing unit 23, and a flight control unit 24.
  • The two cameras 21, 22 are used to perform real-time imaging of the illuminating target 10; the operation processing unit 23 is connected to the first camera 21 and the second camera 22, identifies the illuminating target 10 in the images captured by the first camera 21 and the second camera 22, and calculates the motion information of the illuminating target 10; the flight control unit 24 is connected to the operation processing unit 23 and, according to the motion information of the illuminating target 10, controls the drone 20 to follow the movement trajectory of the illuminating target.
  • The illuminating target 10 is an illuminant that emits light uniformly over at least one side (the visible cross-section), such as a light ball. The reason a light ball is used is that its light emission is very uniform, and such uniformity is hard to find in nature, so the light source is not easily disturbed by ambient light. Its characteristics are very distinct and easy to capture during image analysis and processing, which greatly reduces the computational load and difficulty of the software. Because the light ball is itself an illuminant, it can be captured by the camera even in poor light at night, so this tracking method can be used both day and night, an advantage over binocular-recognition tracking.
  • The operation processing unit 23 employs an ordinary-performance DSP (Digital Signal Processor) running a simple image algorithm, which is sufficient to accurately calculate the direction and amount of motion of the light ball.
  • The flight control unit 24 is the central processing unit of the drone 20 and may use chips such as ARM, Intel, or AMD.
  • The flight control unit 24 searches for a pre-stored predetermined flight trajectory corresponding to the motion information of the illuminating target 10 and controls the drone 20 to fly along that predetermined trajectory.
  • The predetermined flight trajectory found from the motion information of the current illuminating target 10 is a flight path that imitates the movement trajectory of the current illuminating target; the motion information of the flight path has a correspondence with the motion information of the current illuminating target. For example, the movement speed, acceleration, and distance of the drone may be n times those of the illuminating target, and the movement direction may be the same as that of the illuminating target, opposite to it, or offset by a preset angle.
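The n-times correspondence just described can be sketched as a tiny mapping from the target's velocity vector to a drone velocity. This is an illustrative sketch, not from the patent; the function name and parameters are assumptions:

```python
def mirror_motion(target_velocity, n=1.0, invert=False):
    """Map the illuminating target's velocity vector onto a drone
    velocity: scaled by a factor n and optionally reversed, following
    the correspondence described above (names are illustrative)."""
    sign = -1.0 if invert else 1.0
    return tuple(sign * n * v for v in target_velocity)

print(mirror_motion((0.4, 0.0, 0.1), n=2.0))        # → (0.8, 0.0, 0.2)
print(mirror_motion((0.4, 0.2, 0.1), invert=True))  # → (-0.4, -0.2, -0.1)
```

An offset by a preset angle, also mentioned above, would additionally rotate the vector before scaling.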
  • the motion information of the illuminating target 10 includes: one or more of the moving speed, the moving distance, the moving direction, and the motion acceleration of the illuminating target 10.
  • the drone 20 activates the tracking or illuminating target (light ball) control mode
  • the flight control unit 24 notifies the arithmetic processing unit 23 to start the operation
  • the operation processing unit 23 activates the left and right cameras.
  • The recommended initial distance between the light ball and the drone 20 is 5 meters; of course, the user can change the distance, height, orientation, etc., as long as both cameras can capture the light ball at the same time.
  • the operation processing unit 23 includes an acquisition module 231, an identification module 232, and a calculation module 233.
  • The acquiring module 231 is configured to continuously acquire the images captured by the first camera 21 and the second camera 22. For continuous acquisition, images may be acquired at a fixed time interval, or the camera's own shooting interval may be used directly, i.e., every frame is acquired; it is also possible to take one frame every few frames as the required sample.
  • The identification module 232 is configured to identify the illuminating target 10 in each of the images: image recognition technology is used to recognize the illuminating target 10 in the captured images and mark the coordinates of the illuminating target 10 to be calculated.
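Because the light ball is far brighter and more uniform than its surroundings, the identification step can be as simple as thresholding and taking a centroid. A minimal sketch (not the patent's algorithm; the threshold value and function name are illustrative assumptions):

```python
import numpy as np

def find_light_ball(gray, threshold=240):
    """Locate the centroid of the brightest blob in a grayscale frame.

    A uniformly emitting ball nearly saturates the sensor, so a plain
    intensity threshold usually separates it from the background.
    Returns (x, y) pixel coordinates, or None if nothing is bright enough.
    """
    ys, xs = np.nonzero(gray >= threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# A synthetic 100x100 frame with a bright 5x5 "ball" centred at (52, 30)
frame = np.zeros((100, 100), dtype=np.uint8)
frame[28:33, 50:55] = 255
print(find_light_ball(frame))  # → (52.0, 30.0)
```

In a real pipeline, connected-component filtering would be added to reject other bright spots, but the uniformity of the ball is what keeps this step cheap, as the text emphasizes.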
  • The calculation module 233 is configured to calculate the coordinate information of the illuminating target 10 relative to the corresponding camera in each of the images, and to calculate, from the coordinate information of the illuminating target 10 in each of the images, the motion information of the illuminating target 10 within a preset time.
  • The operation processing unit 23 needs to continuously calculate the movement trace (motion information) of the illuminating target 10 and output the cumulative displacement of the illuminating target 10 over a period of time, to which the flight control unit 24 of the drone 20 then responds. For tracking control, a preset time (the same period) is used as the limit: if the parameters of a predetermined flight trajectory match the motion information of the illuminating target 10 within the preset time, tracking control is performed; if not, tracking control is abandoned, and the drone can instead fly along a default flight path, such as hovering or landing. The preset time can be 0.2 seconds, 0.5 seconds, etc.
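The motion information derived over the preset time can be sketched from two triangulated positions. This is an illustrative helper, not from the patent; the dictionary keys and function name are assumptions:

```python
import math

def motion_info(p_start, p_end, dt):
    """Derive motion information for the flight control unit from two
    (X, Y, Z) positions sampled dt seconds apart (e.g. the 0.2 s or
    0.5 s preset time mentioned in the text)."""
    dx, dy, dz = (e - s for s, e in zip(p_start, p_end))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    speed = distance / dt
    # Unit vector giving the movement direction (zero vector if static)
    direction = (dx / distance, dy / distance, dz / distance) if distance else (0.0, 0.0, 0.0)
    return {"distance": distance, "speed": speed, "direction": direction}

# Ball rises 0.2 m in 0.5 s → speed 0.4 m/s, direction straight up (+Y)
print(motion_info((0.0, 0.0, 5.0), (0.0, 0.2, 5.0), 0.5))
```

Acceleration, also listed among the motion information, would follow the same pattern from two successive speed samples.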
  • The coordinate information (X, Y, Z) of the illuminating target 10 satisfies the following formula:
  • The coordinate system of the illuminating target 10 takes the first camera 21 as its origin; in other embodiments, the second camera 22 or another position may be selected as the origin of the coordinate system.
  • The first camera 21 and the second camera 22 lie on the X axis, that is, the X axis of the coordinate system passes through the first camera 21 and the second camera 22; the Z axis of the coordinate system is the optical axis, the direction in which the camera points (perpendicular to the camera lens); the Y axis of the coordinate system is perpendicular to the plane in which the X and Z axes lie.
  • x_camL and y_camL are respectively the X-axis and Y-axis coordinates of the imaging point of the illuminating target 10 (point P) in the first camera 21; x_camR is the X-axis coordinate of its imaging point in the second camera 22. Since point P is constantly moving, the values of x_camL, x_camR, and y_camL also change constantly; the values of x_camL, x_camR, and y_camL are expressed in pixel coordinates.
  • To collect the value of x_camL, the image captured by the first camera 21 is processed in OpenCV (Open Source Computer Vision Library, a cross-platform computer vision library released under the BSD license).
  • Three values XL, YL, and ZL can be obtained with the function cvFindStereoCorrespondenceBM; XL, YL, and ZL are respectively the X-axis, Y-axis, and Z-axis coordinates of the imaging point of the illuminating target 10 in the first camera 21.
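The formula itself is not reproduced in this text; as an illustration only, the standard rectified-stereo triangulation relation that dual-camera systems of this kind typically rely on is sketched below (the function and parameter names are assumptions, with pixel coordinates taken relative to the principal point):

```python
def triangulate(x_camL, y_camL, x_camR, f_px, baseline_m):
    """Standard rectified-stereo triangulation (illustrative).

    x_camL, y_camL: pixel coordinates of point P in the first (left)
    camera; x_camR: its X coordinate in the second (right) camera;
    f_px: focal length in pixels (both cameras share the same focal
    length, as stated later in the text); baseline_m: camera spacing.
    """
    disparity = x_camL - x_camR
    if disparity == 0:
        raise ValueError("point at infinity: zero disparity")
    Z = f_px * baseline_m / disparity  # depth along the optical axis
    X = x_camL * Z / f_px              # lateral offset
    Y = y_camL * Z / f_px              # vertical offset
    return X, Y, Z

# Example: f = 700 px, baseline 0.1 m, disparity 14 px → Z = 5 m
print(triangulate(21.0, 7.0, 7.0, 700.0, 0.1))  # → (0.15, 0.05, 5.0)
```

The OpenCV block-matching function named above computes the disparity map from which such coordinates are derived.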
  • With the above formula, the operation processing unit 23 can calculate the coordinates of the light ball in each image, output the light-ball motion information for a given interval, and transmit this information to the flight control unit 24; the flight control system can then accurately follow the light ball, and likewise, by judging the trajectory of the light ball, the drone 20 can be controlled through the motion trajectory.
  • the flight control unit 24 includes a storage module 241, a lookup module 242, and a control module 243.
  • The storage module 241 is configured to pre-store table data associating the motion information of the illuminating target with the corresponding predetermined flight trajectory of the drone;
  • the searching module 242 is configured to look up the table according to the motion information of the illuminating target 10 and obtain the predetermined flight trajectory of the drone 20 corresponding to that motion information;
  • the control module 243 is configured to output the corresponding flight control command according to the found predetermined flight trajectory, to control the drone 20 to fly with the preset flight information.
  • For example, the flight control unit 24 pre-stores the following predetermined flight trajectories of the drone: the light ball moves upward 10 to 30 cm, correspondingly the drone 20 rises 0.5 m; the light ball moves downward 10 to 30 cm, correspondingly the drone 20 descends 0.5 m; the light ball moves 10 to 30 cm to the left, correspondingly the drone 20 flies 1 m to the left; the light ball moves 10 to 30 cm to the right, correspondingly the drone 20 flies 1 m to the right; the light ball circles in place, correspondingly the drone 20 shakes about 5 times. The above parameters are default values and can be set through the drone 20's app, the same below. It should be noted that this example contains these control commands but is not limited to them.
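The gesture-to-command lookup described above might be sketched as a simple table; all names, thresholds, and command encodings below are illustrative assumptions matching the quoted defaults:

```python
# Illustrative gesture table; thresholds and commands follow the
# defaults quoted above and would normally be editable via the app.
GESTURE_TABLE = {
    "up":     ("ascend", 0.5),     # ball up 10-30 cm → rise 0.5 m
    "down":   ("descend", 0.5),    # ball down 10-30 cm → descend 0.5 m
    "left":   ("fly_left", 1.0),   # ball left 10-30 cm → fly 1 m left
    "right":  ("fly_right", 1.0),  # ball right 10-30 cm → fly 1 m right
    "circle": ("shake", 5),        # ball circles in place → shake 5 times
}

def lookup_command(direction, displacement_cm):
    """Return the flight command for a recognised light-ball gesture,
    or None when the motion is outside the 10-30 cm trigger band."""
    if direction == "circle":
        return GESTURE_TABLE["circle"]
    if not 10 <= displacement_cm <= 30:
        return None  # ignore small jitter and over-large jumps
    return GESTURE_TABLE.get(direction)

print(lookup_command("up", 20))   # → ('ascend', 0.5)
print(lookup_command("left", 5))  # → None
```

Returning None for out-of-band motion mirrors the text's rule that non-matching motion data is discarded rather than acted on.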
  • The operation processing unit 23 calculates the position of the light ball in real time, and if it determines that there is motion, information such as the direction and speed of the motion is immediately transmitted to the flight control unit 24.
  • An example of pre-storing the motion information of each illuminating target and the table data corresponding to the predetermined flight trajectory of the UAV 20 is as follows:
  • The flight control unit 24 receives this information and controls the aircraft to move at the same speed and in the same direction, so that the distance between the light ball and the drone 20 remains relatively fixed, thereby achieving more accurate tracking. When the speed of the light ball is less than 1 m/s, the following accuracy can reach 10 cm.
  • The user can also operate with two light balls, so that more actions and control methods can be defined; the two light balls serve as reference objects for each other, making the DSP computation more accurate and the recognition rate higher, enabling recognition of more difficult movements and bringing more fun to the user.
  • The UAV tracking control system realizes accurate tracking by binocularly recognizing a uniform, specific illuminant, and realizes light-ball control of the drone 20 by presetting different motion trajectories in the flight control system to represent different control commands, combined with light-ball positioning technology, thereby solving the problems of instability, complicated systems, high cost, and low precision.
  • a drone tracking control method based on illuminating target recognition is disclosed, which method is based on at least one illuminating target recognition tracking control, and the illuminating target is preferably a uniformly illuminating light sphere.
  • The UAV starts the tracking or illuminating-target (light ball) control mode, the flight control system notifies the DSP operation processing system to start working, and the DSP starts the left camera module L and the right camera module R; the two camera modules start searching for the light ball.
  • This process requires the flight control unit and the arithmetic processing unit to communicate with each other.
  • The flight control unit continuously adjusts the height and direction of the drone until the light ball is captured at the optimal angle (it is usually best to place the light ball in the middle of both cameras' views).
  • The default distance between the light ball and the drone is 5 meters; of course, the distance, height, orientation, etc. can be changed, as long as both cameras can capture the light ball at the same time.
  • the method includes the following steps:
  • Step S110: two cameras perform real-time imaging of the illuminating target; the two cameras include a first camera and a second camera.
  • the focal lengths of the two cameras are the same.
  • Step S120: Identify the illuminating target in the images captured by the cameras, and calculate the motion information of the illuminating target.
  • the motion information of the illuminating target includes: one or more of a moving speed, a moving path, a moving direction, a moving time, and a moving acceleration of the illuminating target.
  • Step S130: Control the drone to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target.
  • The tracking and simple control system formed by the two cameras, the DSP computing system, and a specific illuminant can greatly enhance the entertainment value and intelligence of the drone. There are two core elements. The first is the light ball: the reason a light ball is used is that its light emission is very uniform, and such a uniform light source is hard to find in nature, so it is not easily interfered with by ambient light; in addition, its characteristics are very distinct and easy to capture during image analysis and processing, greatly reducing the computational load and difficulty of the software. Using an ordinary-performance DSP and a simple image algorithm, the direction and amount of movement of the light ball can be accurately calculated.
  • The flight control system controls the drone to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target by searching for a pre-stored predetermined flight trajectory corresponding to the motion information of the illuminating target and controlling the drone to fly along that predetermined trajectory. The predetermined flight trajectory found from the motion information of the current illuminating target is a flight path that imitates the movement trajectory of the current illuminating target; the motion information of the flight path has a correspondence with the motion information of the current illuminating target. For example, the movement speed, acceleration, and distance of the drone may be n times those of the illuminating target, and the movement direction may be the same as that of the illuminating target, opposite to it, or offset by a preset angle.
  • step S120 specifically includes:
  • Step S121 the images captured by the first camera 21 and the second camera 22 are continuously acquired.
  • Step S122: Identify the illuminating target in each of the images: image recognition technology is used to recognize the illuminating target in the captured images and mark the coordinates of the illuminating target to be calculated.
  • Step S123 Calculate coordinate information of the illuminating target relative to the corresponding camera in each of the images.
  • The coordinate information (X, Y, Z) of the illuminating target satisfies the following formula:
  • The coordinate system of the illuminating target takes the first camera 21 as its origin; in other embodiments, the other camera or another position may be selected as the origin of the coordinate system.
  • The first camera 21 and the second camera 22 are on the X axis, that is, the X axis of the coordinate system passes through the first camera 21 and the second camera 22; the Z axis of the coordinate system is the optical axis, the direction in which the camera points (perpendicular to the camera lens); the Y axis of the coordinate system is perpendicular to the plane of the X and Z axes.
  • x_camL and y_camL are respectively the X-axis and Y-axis coordinates of the imaging point of the illuminating target (point P) in the first camera 21; x_camR is the X-axis coordinate of its imaging point in the second camera 22. Since point P is constantly moving, the values of x_camL, x_camR, and y_camL also change constantly; the values of x_camL, x_camR, and y_camL are expressed in pixel coordinates.
  • Step S124: Calculate the motion information of the illuminating target within the preset time according to the coordinate information of the illuminating target in each of the images.
  • In this way, the coordinates of the light ball in each image can be calculated and the light-ball motion information for a given interval can be output and transmitted to the flight control system, which can then accurately follow the light ball, realizing the purpose of controlling the drone through the motion trajectory. For tracking control, a preset time (the same interval) is used as the limit: if the parameters of a predetermined flight trajectory match the motion information of the illuminating target within the preset time, tracking control is performed; if they do not match, the data is discarded. Instead of tracking control, the drone can also fly along a default flight path, such as hovering or landing. The preset time can be 0.2 seconds, 0.5 seconds, etc.
  • Searching for a pre-stored predetermined flight trajectory corresponding to the motion information of the illuminating target, and controlling the drone to fly along the predetermined flight trajectory, specifically includes:
  • Step S131: Pre-store table data associating the motion information of the illuminating target with the corresponding predetermined flight trajectory of the drone.
  • Step S132: Obtain the predetermined flight trajectory of the drone corresponding to the motion information according to the motion information of the illuminating target.
  • Step S133: Output the corresponding flight control command according to the found predetermined flight trajectory, to control the drone to fly with the preset flight information.
  • The flight control system receives this information and controls the aircraft to move at the same speed and in the same direction, keeping the distance between the light ball and the drone relatively fixed and thus achieving more accurate tracking. When the speed of the light ball is less than 1 m/s, the following accuracy can reach 10 cm.
  • For example, the following motion trajectories are defined in the flight control system: the light ball moving upward 10 to 30 cm indicates that the drone rises 0.5 m; the light ball moving downward 10 to 30 cm indicates that the drone descends 0.5 m; the light ball moving 10 to 30 cm to the left indicates flying 1 m to the left; the light ball moving 10 to 30 cm to the right indicates flying 1 m to the right; the light ball circling in place indicates that the drone shakes 5 times. The above parameters are default values and can be set through the drone's app, the same below. It should be noted that this example contains these control commands but is not limited to them.
  • The user can also operate with two light balls, so that more actions and control methods can be defined; the two light balls serve as reference objects for each other, making the DSP computation more accurate and the recognition rate higher, enabling recognition of more difficult movements and bringing more fun to the user.
  • The DSP system continuously calculates the position of the light ball, and if it determines that there is motion, information such as the direction and speed of the motion is immediately transmitted to the flight control system. Accurate tracking is achieved by binocular recognition of a specific illuminant; precise light-ball control of the drone is achieved by presetting different motion trajectories in the flight control system to represent different control commands, combined with light-ball positioning technology. This solves the problems of instability, complicated systems, high cost, and low accuracy.
  • Each functional unit in the embodiment may be integrated into one processing unit, each unit may exist physically separately, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of software functional units.
  • the specific names of the functional units are only for convenience of distinguishing from each other, and are not intended to limit the scope of protection of the present application.
  • The control system and method of the embodiments of the present invention use two ordinary cameras to image the illuminating target in real time, calculate the distance and displacement of the illuminating target in the images, analyze and judge the direction of motion, and output the motion amount and direction information to the flight control system, which performs the corresponding motion according to the amount of change, thereby implementing the tracking function. Since the position coordinates of the illuminating target can be calculated, its trajectory can be determined and compared with the preset trajectories in the flight control system to find the run command for the corresponding trajectory, achieving control of the drone's flight through the motion of the illuminating target: for example, the illuminating target moving vertically upward indicates that the aircraft should rise, the illuminating target moving vertically downward indicates that the aircraft should descend, and so on.
  • The invention solves the problems of instability, complicated systems, high cost, and low precision. In addition to more accurate tracking, it can also serve as a brand-new control method, so that the drone only needs one added device to realize a variety of intelligent functions, making it very suitable for popular aerial-photography entertainment drones.
  • the disclosed apparatus and method may be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • The division of the modules or units is only a logical functional division; in actual implementation there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in electrical, mechanical or other form.
  • A unit described as a separate component may or may not be physically separate, and a component displayed as a unit may or may not be a physical unit; that is, it may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • the medium includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods of the various embodiments of the present invention.
  • the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An unmanned aerial vehicle (20) tracking control system and method based on recognition of a light-emitting target (10), the system comprising a light-emitting target (10) and an unmanned aerial vehicle (20), the unmanned aerial vehicle (20) comprising: at least two cameras (21, 22) used to image the light-emitting target (10) in real time; an operation processing unit (23) connected to the cameras (21, 22), which recognizes the light-emitting target (10) in images shot by the cameras (21, 22) and calculates motion information of the light-emitting target (10); and a flight control unit (24) connected to the operation processing unit (23), which controls the unmanned aerial vehicle (20) to fly following the light-emitting target (10) according to this motion information. The system solves well the problems of instability, complicated systems, high cost and low accuracy; in addition to relatively accurate tracking, it can also serve as a completely new control method, so that an unmanned aerial vehicle (20) needs only one added device to realize multiple kinds of smart functionality, making the system very suitable for mass-market aerial photography entertainment unmanned aerial vehicles.

Description

UAV tracking control system and method based on illuminating target recognition

Technical field

[0001] The present invention relates to drone technology, and in particular to a drone tracking control system and method based on illuminating target recognition.

Background
[0002] The automatic tracking function has been very widely applied and especially popular on drones over the past two years, particularly on entertainment aerial-photography drones. With this function a drone becomes more intelligent: it can automatically track a specified target and support follow-shooting, interaction and similar features, making it an indispensable intelligent function of today's entertainment drones.

[0003] At present there are two main ways to implement this function. The first is based on GPS positioning technology: the tracked target must carry a communication module with GPS that can communicate with the drone, such as a drone remote controller with GPS, a smartphone, or a smart wristband with GPS. The communication module sends the GPS information of the target's location to the drone in real time, and the drone relies on this position information to implement the tracking function. Its advantages are that the technology is mature and simple and the price is moderate.

[0004] The second way is target tracking based on binocular visual recognition. Its core is to capture the tracked target in real time through a dual camera on the drone, and then determine, through a complex set of algorithms, the distance between the target and the drone as well as the amount and direction of the target's motion. After obtaining this information, the drone control system adjusts the flight direction and flight speed in real time to implement the tracking function. The advantage of this technology is that it is practical both indoors and outdoors, the tracked target does not need to wear any device, and it can also be used for obstacle avoidance.

[0005] Each of the two existing technologies above has its advantages but also many shortcomings. The drawback of the GPS tracking method is that it can only be used outdoors in open areas with a good GPS signal and is easily disturbed by the environment; moreover, limited by GPS positioning accuracy, its tracking accuracy is not high, usually with an error of more than one metre. The shortcoming of the tracking mode based on binocular recognition is that its computational load is very large, requiring a powerful processor and relatively high power consumption. In addition, the technology is not yet mature, is easily disturbed by the environment and lighting, offers tracking accuracy that is not very high, is prone to losing the target, and is very costly, making it unsuitable for low-cost entertainment aerial-photography drones.

[0006] For the increasingly popular aerial-photography entertainment drone, there is therefore an urgent need for a stable, simple, low-cost and high-precision tracking method to meet users' higher demands.
Technical problem

[0007] The object of the present invention is to provide a drone tracking control system and method based on illuminating target recognition, aimed at solving the problems of low tracking accuracy and poor stability in current UAV tracking control systems.

Solution to problem

Technical solution
[0008] The present invention provides a drone tracking control system based on illuminating target recognition, including an illuminating target and a drone, the drone including:

[0009] at least two cameras, the two cameras being used to image the illuminating target in real time;

[0010] an operation processing unit, connected to the cameras, which identifies the illuminating target in the images captured by the cameras and calculates motion information of the illuminating target;

[0011] a flight control unit, connected to the operation processing unit, which controls the drone to follow the movement trajectory of the illuminating target in flight according to the motion information of the illuminating target.

[0012] The present invention also provides a drone tracking control method based on illuminating target recognition, the method being based on tracking control through recognition of at least one illuminating target and including the following steps:

[0013] using at least two cameras to image the illuminating target in real time;

[0014] identifying the illuminating target in the images captured by the cameras and calculating motion information of the illuminating target;

[0015] controlling the drone, according to the motion information of the illuminating target, to follow the movement trajectory of the illuminating target in flight.
Advantageous effects of the invention

Beneficial effects

[0016] The above drone tracking control system and method based on illuminating target recognition use two ordinary cameras to image the illuminating target in real time, while computing the distance and displacement of the illuminating target in the images and analyzing its direction of motion; the motion quantity and orientation information are output to the flight control system, which performs the corresponding motion according to the change, thereby implementing the tracking function. Since the real-time position coordinates of the illuminating target can be calculated, its motion trajectory can be determined and compared with trajectories preset in the flight control system, so that the running instruction corresponding to the trajectory is found and the drone's flight is controlled through the motion of the illuminating target: for example, vertical upward motion of the illuminating target indicates that the aircraft should rise, vertical downward motion indicates that it should descend, and so on. The invention well solves the problems of instability, system complexity, high cost and low precision; besides fairly precise tracking, it can also serve as a completely new control method, so that the drone needs only one added device to realize a variety of intelligent functions, which makes it very suitable for mass-market aerial-photography entertainment drones.
Brief description of the drawings

DRAWINGS

[0017] FIG. 1 is a block diagram of a drone tracking control system based on illuminating target recognition in a preferred embodiment of the present invention;

[0018] FIG. 2 is a block diagram of the operation processing unit in the drone tracking control system based on illuminating target recognition shown in FIG. 1;

[0019] FIG. 3 is a schematic diagram of the coordinate system of the illuminating target;

[0020] FIG. 4 is a block diagram of the flight control unit in the drone tracking control system based on illuminating target recognition shown in FIG. 1;

[0021] FIG. 5 is a flowchart of a drone tracking control method based on illuminating target recognition in a preferred embodiment of the present invention;

[0022] FIG. 6 is a flowchart of a method for calculating the motion information of the illuminating target in a preferred embodiment of the present invention;

[0023] FIG. 7 is a flowchart of a method for controlling the drone to follow the motion trajectory of the illuminating target in a preferred embodiment of the present invention.
Embodiments of the invention

[0024] In order to make the technical problems to be solved, the technical solutions and the advantageous effects of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are merely illustrative of the invention and are not intended to limit it.
[0025] Referring to FIG. 1, a drone tracking control system based on illuminating target recognition in a preferred embodiment of the present invention includes at least one illuminating target 10 and a drone 20.

[0026] The drone 20 includes a first camera 21, a second camera 22, an operation processing unit 23 and a flight control unit 24.

[0027] The two cameras 21, 22 are used to image the illuminating target 10 in real time. The operation processing unit 23 is connected to the first camera 21 and the second camera 22, identifies the illuminating target 10 in the images captured by the first camera 21 and the second camera 22, and calculates motion information of the illuminating target 10. The flight control unit 24 is connected to the operation processing unit 23 and controls the drone 20 to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target 10.
[0028] In this embodiment, the illuminating target 10 is a luminous body that emits light uniformly on at least one face (the visible cross-section), such as a light ball. A light ball is used because it emits light very uniformly; such a uniform light source is hard to find in nature, so it is not easily disturbed by ambient light. In addition, its features are very distinctive and easy to capture during image analysis, which greatly reduces the computational load and difficulty of the software. Moreover, since the light ball is itself a luminous body, it can still be captured by the cameras in poor light at night, so this tracking method works both day and night, an advantage over binocular-recognition tracking. Accordingly, the operation processing unit 23 can use a general-performance DSP (Digital Signal Processor) running a simple image algorithm and still calculate the direction and amount of motion of the light ball accurately enough. The flight control unit 24 is the central processor of the drone 20, using chips such as ARM, Intel or AMD.
[0029] In a more specific implementation, the flight control unit 24 looks up a pre-stored, corresponding predetermined flight trajectory according to the motion information of the illuminating target 10 and controls the drone 20 to fly along that trajectory. The predetermined flight trajectory found for the current motion information of the illuminating target 10 is a flight path that simulates the current movement trajectory of the illuminating target; the motion information of this flight path corresponds to that of the current illuminating target, for example the drone's speed, acceleration or travelled distance is n times that of the illuminating target, and its direction of motion is the same as, opposite to, or offset by a preset angle from, that of the illuminating target. Accordingly, the motion information of the illuminating target 10 includes one or more of the speed, travelled distance, direction of motion and acceleration of the illuminating target 10.
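The proportional mapping described in [0029] (the drone moving at n times the target's speed, in the same, opposite or angle-offset direction) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the function and parameter names are assumptions:

```python
import math

def map_target_motion(vx, vy, speed_scale=1.0, angle_offset_deg=0.0):
    """Map the illuminating target's horizontal velocity (vx, vy) to a drone
    velocity command: scale the speed by n (= speed_scale) and rotate the
    direction by a preset offset angle (0 deg = same direction,
    180 deg = opposite direction)."""
    theta = math.radians(angle_offset_deg)
    cx = speed_scale * (vx * math.cos(theta) - vy * math.sin(theta))
    cy = speed_scale * (vx * math.sin(theta) + vy * math.cos(theta))
    return cx, cy
```

With the defaults (n = 1, offset 0) the drone simply mirrors the target's motion, which is the plain tracking case described in the text.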
[0030] In a more specific embodiment, the drone 20 first enters the tracking or illuminating-target (light-ball) control mode, and the flight control unit 24 notifies the operation processing unit 23 to start working. The operation processing unit 23 starts the left camera module L (first camera 21) and the right camera module R (second camera 22), and the two camera modules begin searching for the light ball. This process requires the flight control unit 24 and the operation processing unit 23 to communicate with each other: the flight control unit 24 continuously adjusts the height and direction of the drone 20 until the light ball is captured at the optimal angle (usually best when the light ball lies midway between the two cameras). The default distance between the light ball and the drone 20 is 5 metres; of course the user may change this distance, as well as the height and orientation, provided that both cameras can capture the light ball at the same time.
[0031] In a specific embodiment, referring to FIG. 2, the operation processing unit 23 includes an acquisition module 231, an identification module 232 and a calculation module 233.

[0032] The acquisition module 231 continuously acquires the images captured by the first camera 21 and the second camera 22. Continuous acquisition may take place at a preset time interval, may simply follow the cameras' own frame interval (that is, every frame is acquired), or may take one frame out of every several as the required samples.

[0033] The identification module 232 identifies the illuminating target 10 in each of the images: image recognition techniques locate the illuminating target 10 in the captured image and mark it, ready for its coordinates to be calculated.

[0034] The calculation module 233 calculates the coordinate information of the illuminating target 10 relative to the corresponding camera in each image, and from the coordinate information in the successive images calculates the motion information of the illuminating target 10 within a preset interval. In particular, for recognition, the operation processing unit 23 must compute the movement trace (motion information) of the illuminating target 10 continuously, outputting the accumulated displacement of the illuminating target 10 once per interval, to which the flight control unit 24 of the drone 20 then responds. For tracking control, the check is confined to one preset interval (the same period): if the parameters of a predetermined flight trajectory match the motion information of the illuminating target 10 within the preset interval, the tracking control is executed; if they do not match, it is discarded, and the drone may instead fly a default trajectory such as hovering or landing. The preset interval may be, for example, 0.2 or 0.5 seconds.
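The per-interval accumulation described in [0034] can be sketched as follows; the function name, the sample format and the return values are illustrative assumptions, not the patent's code:

```python
def accumulate_motion(samples, interval_s):
    """Given time-ordered (t, x, y, z) samples of the target's coordinates
    within one preset interval of length interval_s seconds, return the
    accumulated displacement vector, the path length travelled, and the
    mean speed over that interval."""
    if len(samples) < 2:
        return (0.0, 0.0, 0.0), 0.0, 0.0
    path = 0.0
    for (t0, x0, y0, z0), (t1, x1, y1, z1) in zip(samples, samples[1:]):
        path += ((x1 - x0)**2 + (y1 - y0)**2 + (z1 - z0)**2) ** 0.5
    _, xs, ys, zs = samples[0]
    _, xe, ye, ze = samples[-1]
    displacement = (xe - xs, ye - ys, ze - zs)
    return displacement, path, path / interval_s
```

The displacement and speed computed once per interval are exactly the quantities the operation processing unit hands to the flight control unit for matching against the stored trajectories.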
[0035] In this embodiment, referring to FIG. 3, the coordinate information (X, Y, Z) of the illuminating target 10 satisfies the following formulas:

[0036] Z = (b * focal_length) / (x_camL - x_camR);

[0037] X = x_camL * Z / focal_length;

[0038] Y = y_camL * Z / focal_length;
[0039] Here the coordinate system of the illuminating target 10 takes the first camera 21 as its origin; in other embodiments the other camera 22, or some other position, could be chosen as the origin. The first camera 21 and the second camera 22 lie on the X axis, i.e. the line through the first camera 21 and the second camera 22 is the X axis of the coordinate system. The Z axis of the coordinate system is the optical axis, the direction in which the cameras point (perpendicular to the camera lens plane). The Y axis is perpendicular to the plane containing the X and Z axes.
[0040] More specifically, the values x_camL and y_camL are respectively the X-axis and Y-axis coordinates of the image point of the illuminating target 10 (point P) in the first camera 21, and x_camR is the X-axis coordinate of its image point in the second camera 22. Since point P is constantly moving, the values of x_camL, x_camR and y_camL change constantly as well; they are expressed in pixel coordinates.
[0041] Obtaining the value of x_camL: the image captured by the first camera 21 is processed under OpenCV (Open Source Computer Vision Library, a cross-platform computer vision library released under the BSD open-source licence), and the function cvFindStereoCorrespondenceBM yields three values XL, YL and ZL, which are the distances of the image point of the illuminating target 10 in the first camera 21 from the first camera 21 (the origin) along the X, Y and Z axes respectively. Then x_camL = XL and y_camL = YL.
[0042] Obtaining the value of x_camR: the image captured by the second camera 22 is processed under OpenCV, and the cvFindStereoCorrespondenceBM function yields a value XR, the distance along the X axis of the image point of the illuminating target 10 in the second camera 22 from the second camera 22. Therefore x_camR = b + XR, where b is the distance between the two cameras. In this embodiment the first camera 21 and the second camera 22 have the same focal length, and focal_length is that focal length.
[0043] With the formulas above, the operation processing unit 23 can calculate the coordinates of the light ball in each image and output the light ball's motion information over a given interval, passing this information to the flight control unit 24 in real time; the flight control system can then follow the light ball accurately, and at the same time, by evaluating the light ball's motion trajectory, achieve control of the drone 20 through that trajectory.
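The formulas in [0036] to [0038] translate directly into code. A minimal sketch of the triangulation (the function name and the zero-disparity guard are assumptions; as in the text, the pixel coordinates and the focal length must be in consistent units, so that Z comes out in the units of the baseline b):

```python
def triangulate(x_camL, y_camL, x_camR, b, focal_length):
    """Recover the illuminating target's (X, Y, Z) from its image
    positions in the left and right cameras:
        Z = b * f / (x_camL - x_camR)   (disparity -> depth)
        X = x_camL * Z / f
        Y = y_camL * Z / f
    b is the baseline between the two cameras, f the shared focal length."""
    disparity = x_camL - x_camR
    if disparity == 0:
        # Zero disparity means the target is effectively at infinity
        # (or the two image points are mismatched); no depth is recoverable.
        raise ValueError("zero disparity: depth cannot be computed")
    Z = (b * focal_length) / disparity
    X = x_camL * Z / focal_length
    Y = y_camL * Z / focal_length
    return X, Y, Z
```

For example, with a 0.1 m baseline, a 700-pixel focal length and a 14-pixel disparity, the target lies 5 m away, matching the default tracking distance mentioned in the text.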
[0044] In a specific embodiment, referring to FIG. 4, the flight control unit 24 includes a storage module 241, a lookup module 242 and a control module 243. The storage module 241 pre-stores table data associating motion information of the illuminating target with corresponding predetermined flight trajectories of the drone. The lookup module 242 looks up the table according to the motion information of the illuminating target 10 to obtain the predetermined flight trajectory of the drone 20 corresponding to that motion information. The control module 243 outputs the corresponding flight control instruction according to the predetermined flight trajectory found, controlling the drone 20 to fly accordingly. For example, the flight control unit 24 may pre-store the following predetermined flight trajectories: the light ball moving upward 10 to 30 cm means the drone 20 rises 0.5 m; the light ball moving downward 10 to 30 cm means the drone 20 descends 0.5 m; the light ball moving left 10 to 30 cm means the drone 20 flies 1 m to the left; the light ball moving right 10 to 30 cm means it flies 1 m to the right; the light ball drawing a circle in place means the drone 20 wiggles left and right 5 times. The above parameters are default values that can be set through the drone 20's app, and the same applies below. It should be noted that this example contains these control commands but is not limited to them.
[0045] The operation processing unit 23 calculates the light ball's position in real time; if it determines that there is motion, it sends information such as the amount, direction and speed of the motion to the flight control unit 24. The pre-stored table data associating each kind of motion information of the illuminating target with a predetermined flight trajectory of the drone 20 works, for example, as follows.

[0046] Suppose the light ball is detected moving ahead of the drone at 0.5 m/s. The flight control unit 24 receives this information and controls the aircraft to move at the same speed and in the same direction, so that the distance between the light ball and the drone 20 stays relatively constant, achieving fairly precise tracking. Typically, if the light ball moves at less than 1 m/s, the following accuracy can reach 10 cm.

[0047] Further, if the user holds the light ball in hand and moves it up 10 to 30 cm, then down 10 to 30 cm, then left 10 to 30 cm, then right 10 to 30 cm, and then draws a circle in place, the operation processing unit 23 computes these displacements and notifies the flight control unit 24, which finds the corresponding control mode by comparison and executes the control commands: the drone 20 first moves up 0.5 m, then down 0.5 m, then 1 m to the left, then 1 m to the right, and then wiggles left and right 5 times. This achieves control of the drone 20's motion through the light ball, giving the user a completely new experience.
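The gesture table in [0044] and the sequence in [0047] can be sketched as a lookup from classified displacements to flight commands. The 10 to 30 cm thresholds and the command payloads mirror the patent's examples; the gesture names, function names and command format are otherwise illustrative assumptions:

```python
# Hypothetical gesture-to-command table mirroring the patent's examples:
# a 10-30 cm ball movement in a direction maps to a fixed drone maneuver.
GESTURE_COMMANDS = {
    "up":     ("ascend", 0.5),    # ball up 10-30 cm      -> rise 0.5 m
    "down":   ("descend", 0.5),   # ball down 10-30 cm    -> descend 0.5 m
    "left":   ("fly_left", 1.0),  # ball left 10-30 cm    -> fly 1 m left
    "right":  ("fly_right", 1.0), # ball right 10-30 cm   -> fly 1 m right
    "circle": ("wiggle", 5),      # ball circled in place -> wiggle 5 times
}

def classify_gesture(dx, dy):
    """Classify a single displacement (metres, x right / y up) as one of
    the four linear gestures if its dominant component falls inside the
    10-30 cm window; return None for anything else."""
    if 0.10 <= abs(dx) <= 0.30 and abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    if 0.10 <= abs(dy) <= 0.30 and abs(dy) >= abs(dx):
        return "up" if dy > 0 else "down"
    return None

def command_for(gesture):
    """Look up the flight command for a gesture; None if unrecognized."""
    return GESTURE_COMMANDS.get(gesture)
```

A displacement of 20 cm to the right therefore resolves to the `("fly_right", 1.0)` command, matching the table in [0044].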
[0048] Further, to improve both entertainment value and precision, the user may also operate with two light balls. More actions and control modes can then be defined, and the two light balls serve as reference objects for each other, making the DSP computation more accurate; the recognition rate is therefore higher, more difficult actions can be recognized, and the user has more fun.
[0049] The drone tracking control system achieves precise tracking through binocular recognition of a uniform, specific luminous body; by presetting, in the flight control system, different motion trajectories that represent different control commands, combined with the light-ball positioning technique, it achieves light-ball control of the drone 20 and thus solves well the problems of instability, system complexity, high cost and low precision.
[0050] In addition, a drone tracking control method based on illuminating target recognition is also disclosed, the method being based on tracking control through recognition of at least one illuminating target, the illuminating target preferably being a uniformly luminous light ball.

[0051] In a preferred embodiment, the drone first enters the tracking or illuminating-target (light-ball) control mode, and the flight control system notifies the DSP processing system to start working. The DSP starts the left camera module L and the right camera module R, and the two camera modules begin searching for the light ball. This process requires the flight control unit and the operation processing unit to communicate with each other: the flight control unit continuously adjusts the height and direction of the drone until the light ball is captured at the optimal angle (usually best when the light ball lies midway between the two cameras). The default distance between the light ball and the drone is 5 metres; of course the user may change this distance, as well as the height and orientation, provided that both cameras can capture the light ball at the same time.
[0052] Referring to FIG. 5, in a preferred embodiment, the method includes the following steps:
[0053] Step S110: two cameras are used to capture real-time images of the illuminating target; the two cameras include a first camera and a second camera. The two cameras have the same focal length.
[0054] Step S120: recognize the illuminating target in the images captured by the cameras and compute the motion information of the illuminating target. The motion information of the illuminating target includes one or more of: the target's speed, distance traveled, direction, duration of motion, and acceleration.
[0055] Step S130: control the drone to follow the movement trajectory of the illuminating target according to the target's motion information.
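Steps S110 to S130 can be summarized as a minimal follow loop. The sketch below is illustrative only: the data types and function names (Motion, track_step, follow_command) are assumptions, not part of the disclosure, and in the real system these steps run on the DSP and the flight control unit respectively.

```python
from dataclasses import dataclass

@dataclass
class Motion:
    dx: float  # displacement along X (metres)
    dy: float  # displacement along Y (metres)
    dz: float  # displacement along Z (metres)

def track_step(prev_pos, cur_pos):
    """S120: derive the target's motion information from two successive
    3-D positions computed by the stereo cameras."""
    return Motion(*(c - p for p, c in zip(prev_pos, cur_pos)))

def follow_command(motion):
    """S130: mirror the target's displacement so the drone keeps a
    roughly fixed offset from the light ball."""
    return {"move_x": motion.dx, "move_y": motion.dy, "move_z": motion.dz}

# The light ball moved 0.5 m further away (along Z) between two samples:
cmd = follow_command(track_step((0.0, 0.0, 5.0), (0.0, 0.0, 5.5)))
```

In the disclosed system the position pairs would come from step S120's stereo computation, and the command dictionary would instead be a flight-control message.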
[0056] This tracking and simple control system, built from two cameras and a DSP processing system plus a specific illuminant, can greatly enhance the entertainment value and intelligence of the drone. Its core lies in two points. The first is the light ball: it emits light very uniformly, and a comparably uniform source is hard to find in nature, so it is not easily disturbed by ambient light. In addition, its features are very distinctive and easy to capture during image analysis, which greatly reduces the software's computational load and difficulty; with an ordinary DSP and simple image algorithms, the direction and amount of the light ball's motion can be computed accurately.
[0057] In a more specific embodiment, the flight control system controls the drone to follow the movement trajectory of the illuminating target as follows: according to the target's motion information, it looks up a pre-stored, corresponding predetermined flight trajectory and controls the drone to fly along that trajectory. The predetermined flight trajectory found from the current target's motion information is a flight path that imitates the current target's movement trajectory; the path's motion information corresponds to the current target's motion information. For example, the drone's speed, acceleration, and distance traveled may be n times those of the illuminating target, and its direction may be the same as, opposite to, or offset by a preset angle from the target's direction.

[0058] In a more detailed embodiment, referring to FIG. 6 and FIG. 3, step S120 specifically includes:
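The correspondence described in [0057], where the drone's motion is n times the target's and its direction may be identical, opposite, or offset by a preset angle, amounts to a scale-and-rotate transform. A 2-D sketch follows; the parameter names are assumptions, not from the disclosure.

```python
import math

def map_motion(dx, dy, n=1.0, angle_offset_deg=0.0, opposite=False):
    """Scale the target's displacement by n, optionally reverse it,
    and rotate its direction by a preset angle offset."""
    if opposite:
        dx, dy = -dx, -dy
    a = math.radians(angle_offset_deg)
    return (n * (dx * math.cos(a) - dy * math.sin(a)),
            n * (dx * math.sin(a) + dy * math.cos(a)))

# Drone covers twice the target's distance, same direction:
move = map_motion(1.0, 0.0, n=2.0)
```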
[0059] Step S121: continuously acquire the images captured by the first camera 21 and the second camera 22. "Continuously" may mean acquiring at a preset time interval, acquiring every frame at the camera's own frame interval, or taking one image out of every several frames as the required samples.
[0060] Step S122: recognize the illuminating target in each of the images. Image recognition techniques are used to identify and mark the illuminating target in each captured image so that the target's coordinates can subsequently be computed.
[0061] Step S123: compute the coordinate information of the illuminating target relative to the corresponding camera in each of the images.
In this embodiment, the coordinate information (X, Y, Z) of the illuminating target satisfies the following formulas:
[0062] Z = (b * focal_length) / (x_camL - x_camR);

[0063] X = x_camL * Z / focal_length;

[0064] Y = y_camL * Z / focal_length;
[0065] Here the coordinate system of the illuminating target takes the first camera 21 as its origin; in other embodiments the other camera, or another position, may be chosen as the origin. The first camera 21 and the second camera 22 lie on the X axis, i.e., the line through the first camera 21 and the second camera 22 is the X axis of the coordinate system. The Z axis of the coordinate system is the optical axis, the direction in which the cameras point (perpendicular to the camera's lens plane). The Y axis of the coordinate system is perpendicular to the plane containing the X and Z axes.
[0066] More specifically, the values of x_camL and y_camL are the X-axis and Y-axis coordinates of the illuminating target's imaging point (point P) in the first camera 21, and the value of x_camR is the X-axis coordinate of the target's imaging point in the second camera 22. Since point P changes continuously, the values of x_camL, x_camR, and y_camL also change continuously; they are expressed in pixel coordinates.
[0067] Method of obtaining the value of x_camL: the image captured by the first camera 21 is processed under OpenCV (Open Source Computer Vision Library, a cross-platform computer vision library released under the BSD open-source license); the cvFindStereoCorrespondenceBM function yields three values XL, YL, and ZL, which are the distances of the illuminating target's imaging point in the first camera 21 from the first camera 21 (the origin) along the X, Y, and Z axes, respectively. Then x_camL = XL and y_camL = YL.
[0068] Method of obtaining the value of x_camR: the image captured by the second camera 22 is processed under OpenCV, and the cvFindStereoCorrespondenceBM function yields a value XR, the distance along the X axis between the target's imaging point in the second camera 22 and the second camera 22. Therefore x_camR = b + XR, where the value of b is the distance between the two cameras. In this embodiment the first camera 21 and the second camera 22 have the same focal length, and the value of focal_length is that focal length.
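The coordinate formulas in [0062] to [0068] translate directly into code. The sketch below is a simplified illustration that takes the matched image coordinates as given (in the described system they come from the OpenCV block-matching step) and uses consistent but arbitrary units:

```python
def triangulate(x_camL, y_camL, x_camR, b, focal_length):
    """Recover the light ball's (X, Y, Z) from its imaging points in the
    left and right cameras, per Z = b*f/(x_camL - x_camR),
    X = x_camL*Z/f, Y = y_camL*Z/f. b is the camera baseline."""
    disparity = x_camL - x_camR
    if disparity == 0:
        raise ValueError("zero disparity: target at infinity or bad match")
    Z = (b * focal_length) / disparity
    X = x_camL * Z / focal_length
    Y = y_camL * Z / focal_length
    return X, Y, Z

# Baseline 1.0, focal length 500, disparity 10 -> the ball is 50 units away:
X, Y, Z = triangulate(100.0, 50.0, 90.0, b=1.0, focal_length=500.0)
```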
[0069] Step S124: compute the motion information of the illuminating target within a preset interval from the target's coordinate information in each image. With the formulas above, the light ball's coordinates in each image can be computed and the light-ball motion information over a given interval can be output. This information is passed to the flight control system in real time, so the flight control system can follow the light ball precisely; at the same time, by judging the light ball's motion trajectory, the drone can be controlled through motion trajectories. In particular, for recognition, the target's movement trace (motion information) must be computed continuously, and the accumulated displacement of the target is output at regular intervals, to which the drone's flight control system then responds. Tracking control is confined to one preset interval (the same regular interval): if the parameters of a predetermined flight trajectory match the target's motion information within the preset interval, the tracking control is executed; if not, tracking control is abandoned and the drone may fly a default trajectory, such as hovering or landing. The preset interval may be, for example, 0.2 s or 0.5 s.
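Step S124's accumulation of displacement over a preset interval can be sketched as follows. This is illustrative only; the sampling layout and the average-speed formula are assumptions consistent with the text, not the disclosed DSP implementation.

```python
def accumulate_motion(positions, dt):
    """Sum per-frame displacements of the light ball over one preset
    interval. positions: (X, Y, Z) samples taken dt seconds apart.
    Returns the accumulated displacement vector and the average speed."""
    total = [0.0, 0.0, 0.0]
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        total[0] += x1 - x0
        total[1] += y1 - y0
        total[2] += z1 - z0
    elapsed = dt * (len(positions) - 1)
    speed = (total[0] ** 2 + total[1] ** 2 + total[2] ** 2) ** 0.5 / elapsed
    return tuple(total), speed

# The ball rises 0.5 m over a 0.5 s preset interval -> average speed 1 m/s:
disp, speed = accumulate_motion(
    [(0.0, 0.0, 5.0), (0.0, 0.25, 5.0), (0.0, 0.5, 5.0)], dt=0.25)
```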
[0070] In a more detailed embodiment, referring to FIG. 7, looking up the pre-stored, corresponding predetermined flight trajectory according to the illuminating target's motion information and controlling the drone to fly along that trajectory specifically includes:
[0071] Step S131: pre-store a data table of the illuminating target's motion information and the corresponding predetermined drone flight trajectories.
[0072] Step S132: look up the table with the illuminating target's motion information to obtain the predetermined drone flight trajectory corresponding to that motion information.
[0073] Step S133: output the flight control command corresponding to the found predetermined flight trajectory to control the drone to fly according to that preset flight information.
[0074] An example of the pre-stored table data mapping each light-target motion to a predetermined drone flight trajectory is as follows:
[0075] For example, when the light ball is detected moving ahead of the drone at 0.5 m/s, the flight control system receives this information and controls the aircraft to move at the same speed and heading, so the distance between the light ball and the drone stays relatively fixed, achieving fairly precise tracking. Typically, if the light ball moves at less than 1 m/s, the following accuracy can reach 10 cm.
[0076] Further, the following motion trajectories are defined in the flight control system: the light ball moving up 10 to 30 cm means the drone ascends 0.5 m; the light ball moving down 10 to 30 cm means the drone descends 0.5 m; the light ball moving left 10 to 30 cm means flying 1 m to the left; the light ball moving right 10 to 30 cm means flying 1 m to the right; and drawing a circle in place with the light ball means the drone rocks side to side 5 times. All of the above parameters are defaults that can be set through the drone's app; the same applies below. This example includes these control commands but is not limited to them.
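The trajectory-to-command table in [0076] can be sketched as a displacement classifier. This is illustrative only: the 10 to 30 cm window mirrors the stated defaults, but the axis conventions, function name, and command strings are assumptions, and the in-place circle gesture is omitted because it needs trajectory-shape analysis rather than a single displacement.

```python
def classify_gesture(dx_cm, dy_cm):
    """Map an accumulated light-ball displacement (cm; +x right, +y up)
    to a drone command using the default 10-30 cm window."""
    def in_window(v):
        return 10 <= abs(v) <= 30
    if in_window(dy_cm) and abs(dy_cm) > abs(dx_cm):
        return "ascend 0.5 m" if dy_cm > 0 else "descend 0.5 m"
    if in_window(dx_cm) and abs(dx_cm) > abs(dy_cm):
        return "fly right 1 m" if dx_cm > 0 else "fly left 1 m"
    return None  # no match: keep the default behaviour (e.g. hover)

cmd = classify_gesture(0, 20)  # light ball moved up 20 cm
```

As the text notes, these windows and responses are app-configurable defaults, so a real implementation would read the table from settings rather than hard-code it.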
[0077] Further, if the user holds the light ball and moves it up 10 to 30 cm, then down 10 to 30 cm, then left 10 to 30 cm, then right 10 to 30 cm, and then draws a circle in place, the DSP computes these displacements and notifies the flight control system, which looks up the corresponding control actions and executes the commands: the drone first moves up 0.5 m, then down 0.5 m, then left 1 m, then right 1 m, and then rocks side to side 5 times. This achieves the goal of controlling the drone's motion through the light ball, giving the user a brand-new experience.
[0078] Further, to improve entertainment value and precision, the user may operate with two light balls. More actions and control modes can then be defined, and the two light balls serve as references for each other, making the DSP computation more accurate. This yields a higher recognition rate, allows more difficult actions to be recognized, and brings the user more fun.
[0079] The DSP system computes the light ball's position in real time; if motion is detected, it immediately sends information such as the direction, amount, and speed of the motion to the flight control system. Accurate tracking is achieved by binocular recognition of a uniform, specific illuminant. By presetting different motion trajectories in the flight control system to represent different control commands, combined with light-ball positioning technology, light-ball control of the drone is realized, effectively solving the problems of instability, system complexity, high cost, and low precision.
[0080] Those skilled in the art will clearly understand that, for convenience and brevity of description, only the division into the above functional units is illustrated. In practical applications, the above functions may be assigned to different functional units as needed; that is, the internal structure of the device may be divided into different functional units or modules to perform all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, may each exist physically on their own, or two or more units may be integrated into one unit; the integrated unit may be implemented in hardware or as a software functional unit. In addition, the specific names of the functional units are only for ease of distinguishing them from one another and are not intended to limit the scope of protection of this application. For the specific working process of the units in the above device, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
[0081] In summary, the control system and method of the embodiments of the present invention use two ordinary cameras to capture real-time images of the illuminating target, compute the distance and displacement of the illuminating target in the images, analyze and judge the direction of motion, and output the amount of motion and bearing information to the flight control system, which makes the corresponding motion according to the change, thereby implementing the tracking function. Because the real-time position coordinates of the illuminating target can be computed, the target's motion trajectory can be determined and compared with the trajectories preset in the flight control system, finding the operating command corresponding to the trajectory and thus controlling the drone's flight through the illuminating target's motion: for example, the target moving vertically upward means the aircraft ascends, and the target moving vertically downward means the aircraft descends, and so on. The invention effectively solves the problems of instability, system complexity, high cost, and low precision. Besides fairly precise tracking, it can also serve as an entirely new control mode, so the drone needs only one added device to implement multiple intelligent functions, making it well suited to mass-market aerial-photography entertainment drones.
[0082] Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the particular application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present invention.
[0083] In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
[0084] The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.

[0085] In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, may each exist physically on their own, or two or more units may be integrated into one unit. The integrated unit may be implemented in hardware or as a software functional unit.
[0086] If the integrated unit is implemented as a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the embodiments of the present invention, in essence (or the part contributing to the prior art, or all or part of the technical solution), may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, removable hard disk, read-only memory (ROM), random access memory (RAM), magnetic disk, or optical disc.
The above are only preferred embodiments of the present invention and are not intended to limit it; any modifications, equivalent substitutions, and improvements made within the spirit and principles of the present invention shall be included within its scope of protection.

Claims

[Claim 1] A UAV tracking control system based on illuminating-target recognition, characterized in that it comprises a drone and at least one illuminating target, the drone comprising:
a first camera and a second camera, configured to capture real-time images of the illuminating target;
an arithmetic processing unit, connected to the first camera and the second camera, configured to recognize the illuminating target in the images captured by the first camera and the second camera and to compute motion information of the illuminating target; and
a flight control unit, connected to the arithmetic processing unit, configured to control the drone to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target.
[Claim 2] The UAV tracking control system based on illuminating-target recognition according to claim 1, characterized in that the illuminating target is an illuminant at least one face of which emits light uniformly.
[Claim 3] The UAV tracking control system based on illuminating-target recognition according to claim 1, characterized in that the arithmetic processing unit comprises:
an acquisition module, configured to continuously acquire the images captured by the first camera and the second camera;
a recognition module, configured to recognize the illuminating target in each of the images; and
a computation module, configured to compute coordinate information of the illuminating target relative to the corresponding camera in each of the images, and to compute motion information of the illuminating target within a preset interval according to that coordinate information.
[Claim 4] The UAV tracking control system based on illuminating-target recognition according to claim 3, characterized in that the coordinate information (X, Y, Z) of the illuminating target satisfies the following formulas:
Z = (b * focal_length) / (x_camL - x_camR);
X = x_camL * Z / focal_length;
Y = y_camL * Z / focal_length;
wherein the coordinate system of the illuminating target takes one of the cameras as its origin; the line on which the two cameras lie is the X axis of the coordinate system; the Z axis of the coordinate system is the direction in which the two cameras point; the Y axis of the coordinate system is perpendicular to the plane containing the X axis and the Z axis; the values of x_camL and y_camL are respectively the X-axis coordinate and the Y-axis coordinate of the imaging point of the illuminating target in the first camera; the value of x_camR is the X-axis coordinate of the imaging point of the illuminating target in the second camera; and the value of focal_length is the focal length of the two cameras.
[Claim 5] The UAV tracking control system based on illuminating-target recognition according to claim 1, characterized in that the flight control unit controlling the drone to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target specifically comprises: looking up a pre-stored, corresponding predetermined flight trajectory according to the motion information of the illuminating target, and controlling the drone to fly along the predetermined flight trajectory.
[Claim 6] The UAV tracking control system based on illuminating-target recognition according to claim 5, characterized in that the flight control unit comprises:
a storage module, configured to pre-store a data table of the motion information of the illuminating target and the corresponding predetermined drone flight trajectories;
a lookup module, configured to look up the table with the motion information of the illuminating target to obtain the predetermined drone flight trajectory corresponding to that motion information; and
a control module, configured to output, according to the found predetermined flight trajectory, the corresponding flight control command to control the drone to fly according to that preset flight information.
[Claim 7] The UAV tracking control system based on illuminating-target recognition according to any one of claims 1 to 6, characterized in that the motion information of the illuminating target comprises one or more of: the speed, distance traveled, direction, duration, and acceleration of the illuminating target's motion.
[Claim 8] A UAV tracking control method based on illuminating-target recognition, characterized in that the method performs tracking control based on at least one illuminating target and comprises the following steps:
using two cameras to capture real-time images of the illuminating target, the two cameras including a first camera and a second camera;
recognizing the illuminating target in the images captured by the cameras and computing motion information of the illuminating target; and
controlling the drone to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target.
[Claim 9] The UAV tracking control method based on illuminating-target recognition according to claim 8, characterized in that the illuminating target is an illuminant at least one face of which emits light uniformly.
[Claim 10] The UAV tracking control method based on illuminating-target recognition according to claim 8, characterized in that recognizing the illuminating target in the images captured by the cameras and computing the motion information of the illuminating target specifically comprises:
continuously acquiring the images captured by the first camera and the second camera;
recognizing the illuminating target in each of the images;
computing coordinate information of the illuminating target relative to the corresponding camera in each of the images; and
computing the motion information of the illuminating target within a preset interval according to the coordinate information of the illuminating target in each of the images.
[Claim 11] The UAV tracking control method based on illuminating-target recognition according to claim 10, characterized in that the coordinate information (X, Y, Z) of the illuminating target satisfies the following formulas:
Z = (b * focal_length) / (x_camL - x_camR);
X = x_camL * Z / focal_length;
Y = y_camL * Z / focal_length;
wherein the coordinate system of the illuminating target takes one of the cameras as its origin; the line on which the two cameras lie is the X axis of the coordinate system; the Z axis of the coordinate system is the direction in which the two cameras point; the Y axis of the coordinate system is perpendicular to the plane containing the X axis and the Z axis; the values of x_camL and y_camL are respectively the X-axis coordinate and the Y-axis coordinate of the imaging point of the illuminating target in the first camera; the value of x_camR is the X-axis coordinate of the imaging point of the illuminating target in the second camera; and the value of focal_length is the focal length of the two cameras.
[Claim 12] The UAV tracking control method based on illuminating-target recognition according to claim 8, characterized in that controlling the drone to follow the movement trajectory of the illuminating target according to the motion information of the illuminating target specifically comprises:
looking up a pre-stored, corresponding predetermined flight trajectory according to the motion information of the illuminating target, and controlling the drone to fly along the predetermined flight trajectory.
The illuminating target recognition-based drone tracking control method according to claim 12, wherein the step of searching for a pre-stored, corresponding predetermined flight trajectory according to the motion information of the illuminating target, and controlling the drone to fly along that predetermined flight trajectory, specifically comprises: pre-storing a data table of motion information of the illuminating target and the predetermined flight trajectories of the drone corresponding thereto;

looking up the table according to the motion information of the illuminating target to obtain the predetermined flight trajectory of the drone corresponding to that motion information;

and outputting corresponding flight control commands according to the found predetermined flight trajectory to control the drone to fly along that trajectory.
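A minimal sketch of the table lookup described above, assuming the motion information has already been classified into discrete categories. The specific motion categories and trajectory names here are hypothetical placeholders, not part of the application.

```python
# Hypothetical pre-stored data table: each key classifies the target's
# motion (direction, speed class), and the value names the predetermined
# flight trajectory the drone should fly in response.
TRAJECTORY_TABLE = {
    ("forward", "slow"): "follow_behind",
    ("forward", "fast"): "follow_behind_high",
    ("circle", "slow"): "orbit",
    ("stopped", "slow"): "hover",
}

def select_trajectory(direction: str, speed_class: str) -> str:
    """Look up the predetermined flight trajectory for the observed
    motion; fall back to hovering if the pattern is not in the table."""
    return TRAJECTORY_TABLE.get((direction, speed_class), "hover")

print(select_trajectory("circle", "slow"))   # -> orbit
print(select_trajectory("unknown", "fast"))  # -> hover
```

In practice the selected trajectory name would then be translated into the flight control commands mentioned in the claim; that translation layer is hardware-specific and is not sketched here.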
[Claim 14] The illuminating target recognition-based drone tracking control method according to any one of claims 8 to 13, wherein the motion information of the illuminating target comprises one or more of: the speed, distance, direction, duration, and acceleration of the illuminating target's motion.
PCT/CN2016/097249 2016-07-21 2016-08-30 Light-emitting target recognition-based unmanned aerial vehicle tracking control system and method WO2018014420A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610578483.2A CN106598075A (en) 2016-07-21 2016-07-21 System and method for tracking control of unmanned aerial vehicle based on luminescence object identification
CN201610578483.2 2016-07-21

Publications (1)

Publication Number Publication Date
WO2018014420A1 true WO2018014420A1 (en) 2018-01-25

Family

ID=58556015

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/097249 WO2018014420A1 (en) 2016-07-21 2016-08-30 Light-emitting target recognition-based unmanned aerial vehicle tracking control system and method

Country Status (2)

Country Link
CN (1) CN106598075A (en)
WO (1) WO2018014420A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113759986A (en) * 2021-09-27 2021-12-07 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle monitoring and tracking method, device, equipment and storage medium

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108496138B (en) 2017-05-25 2022-04-22 深圳市大疆创新科技有限公司 Tracking method and device
CN109388151A (en) * 2017-08-04 2019-02-26 深圳曼塔智能科技有限公司 Method, apparatus, system and the terminal device of unmanned plane target tracking
CN108168522A (en) * 2017-12-11 2018-06-15 宁波亿拍客网络科技有限公司 A kind of unmanned plane observed object method for searching and correlation technique again
JP6652979B2 (en) * 2018-02-20 2020-02-26 ソフトバンク株式会社 Image processing device, flying object and program
CN110262565B (en) * 2019-05-28 2023-03-21 深圳市吉影科技有限公司 Target tracking motion control method and device applied to underwater six-push unmanned aerial vehicle
CN110956642A (en) * 2019-12-03 2020-04-03 深圳市未来感知科技有限公司 Multi-target tracking identification method, terminal and readable storage medium
CN113721661B (en) * 2021-09-03 2022-02-25 中国人民解放军32802部队 Cooperative unmanned aerial vehicle cluster observation device
CN114979611A (en) * 2022-05-19 2022-08-30 国网智能科技股份有限公司 Binocular sensing system and method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102902271A (en) * 2012-10-23 2013-01-30 上海大学 Binocular vision-based robot target identifying and gripping system and method
CN105517664A (en) * 2014-05-30 2016-04-20 深圳市大疆创新科技有限公司 Systems and methods for uav docking
CN105550670A (en) * 2016-01-27 2016-05-04 兰州理工大学 Target object dynamic tracking and measurement positioning method
US9342746B1 (en) * 2011-03-17 2016-05-17 UtopiaCompression Corporation Maneuverless passive range estimation using monocular image sequences
CN105678289A (en) * 2016-03-07 2016-06-15 谭圆圆 Control method and device of unmanned aerial vehicle
CN105739520A (en) * 2016-01-29 2016-07-06 余江 Unmanned aerial vehicle identification system and identification method thereof

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2939325B1 (en) * 2008-12-04 2015-10-16 Parrot DRONES SYSTEM WITH RECONNAISSANCE BEACONS
CN102012706B (en) * 2010-10-01 2015-06-24 苏州佳世达电通有限公司 Electronic device capable of automatically positioning and moving and method for automatically returning moving element thereof
CN102871784B (en) * 2012-09-21 2015-04-08 中国科学院深圳先进技术研究院 Positioning controlling apparatus and method
US9932110B2 (en) * 2014-07-22 2018-04-03 Jonathan McNally Method for installing an object using an unmanned aerial vehicle
CN104820435A (en) * 2015-02-12 2015-08-05 武汉科技大学 Quadrotor moving target tracking system based on smart phone and method thereof
CN105000194A (en) * 2015-08-13 2015-10-28 史彩成 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark
CN105100728A (en) * 2015-08-18 2015-11-25 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle video tracking shooting system and method
CN105108757B (en) * 2015-09-09 2016-09-07 三峡大学 Wheeled Soccer Robot based on smart mobile phone and method of operating thereof
CN105677300A (en) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Gesture identification based unmanned aerial vehicle control method and system as well as unmanned aerial vehicle
CN105676860A (en) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 Wearable equipment, unmanned plane control device and control realization method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9342746B1 (en) * 2011-03-17 2016-05-17 UtopiaCompression Corporation Maneuverless passive range estimation using monocular image sequences
CN102902271A (en) * 2012-10-23 2013-01-30 上海大学 Binocular vision-based robot target identifying and gripping system and method
CN105517664A (en) * 2014-05-30 2016-04-20 深圳市大疆创新科技有限公司 Systems and methods for uav docking
CN105550670A (en) * 2016-01-27 2016-05-04 兰州理工大学 Target object dynamic tracking and measurement positioning method
CN105739520A (en) * 2016-01-29 2016-07-06 余江 Unmanned aerial vehicle identification system and identification method thereof
CN105678289A (en) * 2016-03-07 2016-06-15 谭圆圆 Control method and device of unmanned aerial vehicle

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113759986A (en) * 2021-09-27 2021-12-07 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle monitoring and tracking method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN106598075A (en) 2017-04-26

Similar Documents

Publication Publication Date Title
WO2018014420A1 (en) Light-emitting target recognition-based unmanned aerial vehicle tracking control system and method
CN108476288B (en) Shooting control method and device
US11797009B2 (en) Unmanned aerial image capture platform
US11644832B2 (en) User interaction paradigms for a flying digital assistant
CN109241820B (en) Unmanned aerial vehicle autonomous shooting method based on space exploration
CN102169366B (en) Multi-target tracking method in three-dimensional space
JP2021530814A (en) Methods and systems for resolving hemispherical ambiguities using position vectors
KR20180044279A (en) System and method for depth map sampling
US20210112194A1 (en) Method and device for taking group photo
US20200372715A1 (en) Real-world object recognition for computing device
CN113116224B (en) Robot and control method thereof
CN103619090A (en) System and method of automatic stage lighting positioning and tracking based on micro inertial sensor
EP3127586B1 (en) Interactive system, remote controller and operating method thereof
WO2021127888A1 (en) Control method, smart glasses, mobile platform, gimbal, control system, and computer-readable storage medium
CN108564657A (en) A kind of map constructing method, electronic equipment and readable storage medium storing program for executing based on high in the clouds
CN109002059A (en) A kind of multi-rotor unmanned aerial vehicle object real-time tracking camera system and method
CN108885487A (en) A kind of gestural control method of wearable system and wearable system
US20120002044A1 (en) Method and System for Implementing a Three-Dimension Positioning
WO2022082440A1 (en) Method, apparatus and system for determining target following strategy, and device and storage medium
US20190228583A1 (en) Systems and methods for tracking object location and orientation in virtual reality environments using ultra-wideband signals, inertia measurement units, and reflective markers
WO2022188151A1 (en) Image photographing method, control apparatus, movable platform, and computer storage medium
WO2022148419A1 (en) Quadrupedal robot positioning apparatus and quadrupedal robot formation
CN113221729A (en) Unmanned aerial vehicle cluster control method and system based on gesture human-computer interaction
CN111752386A (en) Space positioning method and system and head-mounted equipment
CN114326757A (en) Precise landing control method and system for unmanned aerial vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16909347

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02/07/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16909347

Country of ref document: EP

Kind code of ref document: A1