WO2019006769A1 - UAV follow-shooting method and device - Google Patents

UAV follow-shooting method and device

Info

Publication number
WO2019006769A1
WO2019006769A1 (PCT/CN2017/092296, CN2017092296W)
Authority
WO
WIPO (PCT)
Prior art keywords
target object
drone
video image
flight
uav
Prior art date
Application number
PCT/CN2017/092296
Other languages
English (en)
French (fr)
Inventor
杨顺伟
Original Assignee
杨顺伟
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杨顺伟 filed Critical 杨顺伟
Publication of WO2019006769A1 publication Critical patent/WO2019006769A1/zh

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Definitions

  • the present invention relates to the field of communications technologies, and in particular to a method and device for follow-shooting with a drone.
  • an unmanned aerial vehicle is referred to as a drone for short, and is generally operated using a radio remote-control device and the drone's own program control unit.
  • drones are widely used in film and television shooting, street-view shooting, remote-sensing mapping, express delivery, power-line inspection, crop monitoring, environmental monitoring, and post-disaster rescue.
  • the embodiments of the invention provide a drone follow-shooting method and device, which can solve the problem of an unstable drone follow-shooting path.
  • an embodiment of the present invention provides a drone follow-shooting method, including:
  • a drone flight line is generated based on the motion trajectory of the target object.
  • the performing target object detection on the video image includes:
  • the target object includes: any one or any combination of pedestrians, animals, and vehicles.
  • the generating a UAV flight line based on a motion trajectory of the target object includes:
  • a UAV flight line that is consistent with the motion trajectory of the target object is generated.
  • the generating a UAV flight line based on a motion trajectory of the target object includes:
  • after generating the UAV flight line, the method further includes:
  • the flight direction and the flight speed of the UAV are controlled according to the UAV flight line, where the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
  • an embodiment of the present invention provides a drone follow-shooting device, including:
  • a shooting module configured to capture a video image;
  • a detecting module configured to perform target object detection on the video image;
  • an acquiring module configured to acquire a motion trajectory of the target object when the target object is detected;
  • a generating module configured to generate a drone flight line based on the motion trajectory of the target object.
  • the detecting module includes:
  • a detecting submodule configured to perform target object detection on the video image based on a deep neural network;
  • the target object includes any one or any combination of pedestrians, animals, and vehicles.
  • the generating module includes:
  • a first generating submodule configured to generate a UAV flight line that is consistent with a motion trajectory of the target object based on a motion trajectory of the target object.
  • the generating module includes:
  • an obtaining submodule configured to acquire a real-time geographic location of the target object and a real-time geographic location of the drone;
  • a calculation submodule configured to calculate a relative direction and a relative distance of the target object relative to the drone according to a real-time geographic location of the target object and a real-time geographic location of the drone;
  • a prediction submodule configured to predict, based on the deep neural network, the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory;
  • a second generation submodule configured to generate the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
  • the device further includes:
  • a control module configured to control the flight direction and the flight speed of the drone according to the UAV flight line, where the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
  • with the drone follow-shooting method and device of the embodiments of the present invention, a video image is captured; target object detection is performed on the video image; when the target object is detected, the motion trajectory of the target object is acquired; and a drone flight line is generated based on the motion trajectory of the target object.
  • FIG. 1 is a schematic flowchart of a drone follow-shooting method according to an embodiment of the present invention.
  • FIG. 2 is another schematic flowchart of a drone follow-shooting method according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a drone follow-shooting device according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a detection module according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a generation module according to an embodiment of the present invention.
  • FIG. 6 is another schematic structural diagram of a generation module according to an embodiment of the present invention.
  • FIG. 7 is another schematic structural diagram of a drone follow-shooting device according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a drone follow-shooting device 800 according to an embodiment of the present invention.
  • An embodiment of the present invention provides a drone follow-shooting method; as shown in FIG. 1, the method includes:
  • the video image may be captured by a camera module or camera device integrated on the drone, or by a camera attached externally to the drone (for example, an action camera), which then transmits the video image to the drone; this is not limited in the embodiments of the present invention.
  • the video image may be analyzed by a processing chip with computing capability on the drone to determine whether the target object is present; or, after the drone sends the video image to a connected mobile terminal or server, that mobile terminal or server may analyze the video image to determine whether the target object is present; this is not limited in the embodiments of the present invention.
  • specifically, feature extraction may be performed on the video image through a deep learning network to acquire the key-point information of the target object, thereby detecting the target object in the video image.
  • in each video frame in which the target object is detected, the position information of the target object and the time information corresponding to that frame are acquired, and the motion trajectory of the target object is generated from these positions and times.
  • the position information may be the actual geographic location of the target object; or the position of the target object relative to the drone (that is, its position in a coordinate system whose origin is the center of the drone); or the position of the target object in each other frame relative to its position in the first frame in which it was detected (that is, its position in a coordinate system whose origin is the center of the target object in that first frame).
  • a UAV flight line consistent with the motion trajectory of the target object may be generated.
  • a UAV flight line consistent with the motion trajectory of the target object means that the two trajectories are exactly the same, that is, their motion direction, motion speed, and motion distance are all identical.
  • by generating a UAV flight line consistent with the target object's motion trajectory, the UAV can follow-shoot the target object stably; that is, while the drone shoots the target object, the target object stays at the same position in the video frame, which improves the stability of the video footage the drone captures of the target object.
  • upon detecting the first video frame containing the target object, the drone adjusts the range and angle of its video capture so that the target object is at the very center of the video image captured by the drone.
  • by centering the target object in the captured video image when it is first captured, and by keeping the drone moving in sync with the target object during subsequent shooting, the target object remains at the center of the video image captured by the drone, which further improves the quality of the follow-shot footage.
  • the motion trajectory of the target object within a preset time window may also be acquired, and the drone's flight line generated from it.
  • for example, the target object's motion trajectory over the preceding 2 seconds is acquired, and the drone's current flight line is generated from it.
  • the embodiments of the present invention capture a video image; perform target object detection on the video image; when the target object is detected, acquire the motion trajectory of the target object; and generate a drone flight line based on that trajectory.
  • when shooting during the drone's flight, the subject can be followed accurately without manual control, and the follow path is stable, thereby improving the quality of the captured footage.
  • Another embodiment of the present invention provides a drone shooting method; as shown in FIG. 2, the method includes:
  • the video image may be captured by a camera module or camera device integrated on the drone, or by a camera attached externally to the drone (for example, an action camera), which then transmits the video image to the drone; this is not limited in the embodiments of the present invention.
  • the target object includes any one or any combination of pedestrians, animals, and vehicles.
  • the video image may be analyzed by a processing chip with computing capability on the drone to determine whether the target object is present; or, after the drone sends the video image to a connected mobile terminal or server, that mobile terminal or server may analyze the video image to determine whether the target object is present; this is not limited in the embodiments of the present invention.
  • in each video frame in which the target object is detected, the position information of the target object and the time information corresponding to that frame are acquired, and the motion trajectory of the target object is generated from these positions and times.
  • the position information may be the actual geographic location of the target object; or the position of the target object relative to the drone (that is, its position in a coordinate system whose origin is the center of the drone); or the position of the target object in each other frame relative to its position in the first frame in which it was detected (that is, its position in a coordinate system whose origin is the center of the target object in that first frame).
  • specifically, feature extraction may be performed on the video image through a deep neural network to acquire the key-point information of the target object, thereby detecting the target object in the video image.
  • steps 204-205 may be replaced by: acquiring the relative direction and relative distance of the target object with respect to the drone directly, that is, the target object's position in a coordinate system whose origin is the center of the drone.
  • alternatively, steps 204-207 may be replaced by: generating, based on the motion trajectory of the target object, a drone flight line consistent with that trajectory.
  • a UAV flight line consistent with the motion trajectory of the target object means that the two trajectories are exactly the same, that is, their motion direction, motion speed, and motion distance are all identical.
  • by generating a UAV flight line consistent with the target object's motion trajectory, the UAV can follow-shoot the target object stably; that is, while the drone shoots the target object, the target object stays at the same position in the video frame, which improves the stability of the video footage the drone captures of the target object.
  • upon detecting the first video frame containing the target object, the drone adjusts the range and angle of its video capture so that the target object is at the very center of the video image captured by the drone.
  • by centering the target object in the captured video image when it is first captured, and by keeping the drone moving in sync with the target object during subsequent shooting, the target object remains at the center of the video image captured by the drone, which further improves the quality of the follow-shot footage.
  • the motion trajectory of the target object within a preset time window may also be acquired, and the drone's flight line generated from it.
  • for example, the target object's motion trajectory over the preceding 2 seconds is acquired, and the drone's current flight line is generated from it.
  • by controlling the flight direction through the relative direction and the flight speed through the relative distance, the drone can follow-shoot the target object accurately.
  • the embodiments of the present invention capture a video image; perform target object detection on the video image; when the target object is detected, acquire the motion trajectory of the target object; and generate a drone flight line based on that trajectory.
  • when shooting during the drone's flight, the subject can be followed accurately without manual control, and the follow path is stable, thereby improving the quality of the captured footage.
  • the device includes:
  • a shooting module 31 configured to capture a video image
  • a detecting module 32 configured to perform target object detection on the video image
  • the obtaining module 33 is configured to acquire a motion track of the target object when the target object is detected;
  • the generating module 34 is configured to generate a drone flight line based on a motion trajectory of the target object.
  • the detecting module 32 includes:
  • the detecting sub-module 321 is configured to perform target object detection on the video image based on a deep neural network; wherein the target object comprises: any one or any combination of pedestrians, animals, and vehicles.
  • the generating module 34 includes:
  • the first generation sub-module 341 is configured to generate a UAV flight line that is consistent with the motion trajectory of the target object based on the motion trajectory of the target object.
  • the generating module 34 includes:
  • the obtaining sub-module 342 is configured to acquire a real-time geographic location of the target object and a real-time geographic location of the drone;
  • a calculation sub-module 343, configured to calculate a relative direction and a relative distance of the target object relative to the UAV according to a real-time geographic location of the target object and a real-time geographic location of the UAV;
  • a prediction sub-module 344 configured to predict, based on the deep neural network, the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory;
  • the second generation sub-module 345 is configured to generate the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
  • the device further includes:
  • a control module 71 configured to control the flight direction and the flight speed of the drone according to the UAV flight line, where the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
  • the embodiments of the present invention capture a video image; perform target object detection on the video image; when the target object is detected, acquire the motion trajectory of the target object; and generate a drone flight line based on that trajectory.
  • when shooting during the drone's flight, the subject can be followed accurately without manual control, and the follow path is stable, thereby improving the quality of the captured footage.
  • the unmanned aerial vehicle follow-shooting method and device provided by the embodiments of the present invention can be applied to controlling a drone in flight, but are not limited thereto.
  • the drone tracking device 800 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a personal digital assistant, and the like.
  • the drone tracking device 800 can include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 818.
  • Processing component 802 typically controls the overall operation of the drone tracking device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • Processing component 802 can include one or more processors 820 to execute instructions.
  • In addition, processing component 802 can include one or more modules to facilitate interaction between processing component 802 and other components.
  • processing component 802 can include a multimedia module to facilitate interaction between multimedia component 808 and processing component 802.
  • the memory 804 is configured to store various types of data to support operation of the drone tracking device 800. Examples of such data include instructions for any application or method operating on the drone tracking device 800, contact data, phone book data, messages, pictures, videos, and the like.
  • the memory 804 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
  • Power component 806 provides power to the various components of the drone tracking device 800.
  • Power component 806 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the drone tracking device 800.
  • the multimedia component 808 includes a screen that provides an output interface between the drone tracking device 800 and the user.
  • the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the multimedia component 808 includes a front camera and/or a rear camera. When the drone tracking device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera can be a fixed optical lens system or have focal length and optical zoom capability.
  • the audio component 810 is configured to output and/or input an audio signal.
  • the audio component 810 includes a microphone (MIC) that is configured to receive an external audio signal when the drone tracking device 800 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode.
  • the received audio signal may be further stored in memory 804 or transmitted via communication component 818.
  • the audio component 810 also includes a speaker for outputting audio signals.
  • the I/O interface 812 provides an interface between the processing component 802 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for drone tracking device 800.
  • the sensor assembly 814 can detect the open/closed state of the drone tracking device 800 and the relative positioning of components, such as the display and keypad of the drone tracking device 800; the sensor assembly 814 can also detect a change in position of the drone tracking device 800 or one of its components, the presence or absence of user contact with the drone tracking device 800, the orientation or acceleration/deceleration of the drone tracking device 800, and temperature changes of the drone tracking device 800.
  • Sensor assembly 814 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor assembly 814 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 818 is configured to facilitate wired or wireless communication between drone tracking device 800 and other devices.
  • the drone tracking device 800 can access a wireless network based on communication standards, such as WiFi, 2G or 3G, or a combination thereof.
  • communication component 818 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
  • the communication component 818 also includes a near field communication (NFC) module to facilitate short-range communication.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • the drone tracking device 800 can be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

A UAV follow-shooting method and device, relating to the field of communications technologies. The method includes: capturing a video image (101); performing target-object detection on the video image (102); when the target object is detected, acquiring a motion trajectory of the target object (103); and generating a UAV flight line based on the motion trajectory of the target object (104). The method and device can solve the problem of an unstable UAV follow-shooting path.

Description

UAV follow-shooting method and device
Technical Field
The present invention relates to the field of communications technologies, and in particular to a UAV follow-shooting method and device.
Background
An unmanned aerial vehicle (UAV), or drone, is generally operated using a radio remote-control device together with the drone's own program control unit. UAVs are widely used in film and television shooting, street-view shooting, remote-sensing surveying and mapping, express delivery, power-line inspection, crop monitoring, environmental monitoring, post-disaster rescue, and other fields.
As technology develops, users place higher demands on the in-flight shooting capability of UAVs. In the prior art, the flight path of the UAV must be controlled manually to follow-shoot a target. However, manual control requires human participation and incurs labor cost, and the follow-shooting path is limited by the operator's skill, so the path is prone to instability, which degrades the quality of the captured footage.
Summary of the Invention
Embodiments of the present invention provide a UAV follow-shooting method and device, which can solve the problem of an unstable UAV follow-shooting path.
To achieve the above objective, the embodiments of the present invention adopt the following technical solutions:
In a first aspect, an embodiment of the present invention provides a UAV follow-shooting method, including:
capturing a video image;
performing target-object detection on the video image;
when the target object is detected, acquiring a motion trajectory of the target object;
generating a UAV flight line based on the motion trajectory of the target object.
With reference to the first aspect, in a first possible implementation of the first aspect, performing target-object detection on the video image includes:
performing target-object detection on the video image based on a deep neural network, where the target object includes any one or any combination of pedestrians, animals, and vehicles.
With reference to the first aspect, in a second possible implementation of the first aspect, generating the UAV flight line based on the motion trajectory of the target object includes:
generating, based on the motion trajectory of the target object, a UAV flight line consistent with that trajectory.
With reference to the first aspect, in a third possible implementation of the first aspect, generating the UAV flight line based on the motion trajectory of the target object includes:
acquiring a real-time geographic location of the target object and a real-time geographic location of the UAV;
calculating a relative direction and a relative distance of the target object with respect to the UAV according to the real-time geographic location of the target object and the real-time geographic location of the UAV;
predicting, based on a deep neural network, the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory;
generating the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
With reference to the third possible implementation of the first aspect, in a fourth possible implementation of the first aspect, after generating the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance, the method further includes:
controlling a flight direction and a flight speed of the UAV according to the UAV flight line, where the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
In a second aspect, an embodiment of the present invention provides a UAV follow-shooting device, including:
a shooting module, configured to capture a video image;
a detection module, configured to perform target-object detection on the video image;
an acquisition module, configured to acquire a motion trajectory of the target object when the target object is detected;
a generation module, configured to generate a UAV flight line based on the motion trajectory of the target object.
With reference to the second aspect, in a first possible implementation of the second aspect, the detection module includes:
a detection submodule, configured to perform target-object detection on the video image based on a deep neural network, where the target object includes any one or any combination of pedestrians, animals, and vehicles.
With reference to the second aspect, in a second possible implementation of the second aspect, the generation module includes:
a first generation submodule, configured to generate, based on the motion trajectory of the target object, a UAV flight line consistent with that trajectory.
With reference to the second aspect, in a third possible implementation of the second aspect, the generation module includes:
an acquisition submodule, configured to acquire a real-time geographic location of the target object and a real-time geographic location of the UAV;
a calculation submodule, configured to calculate a relative direction and a relative distance of the target object with respect to the UAV according to the real-time geographic location of the target object and the real-time geographic location of the UAV;
a prediction submodule, configured to predict, based on a deep neural network, the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory;
a second generation submodule, configured to generate the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
With reference to the third possible implementation of the second aspect, in a fourth possible implementation of the second aspect, the device further includes:
a control module, configured to control a flight direction and a flight speed of the UAV according to the UAV flight line, where the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
With the UAV follow-shooting method and device provided by the embodiments of the present invention, a video image is captured; target-object detection is performed on the video image; when the target object is detected, the motion trajectory of the target object is acquired; and a UAV flight line is generated based on that trajectory. When shooting during the UAV's flight, the subject can be followed accurately without manual control, and the follow-shooting path is stable, thereby improving the quality of the captured footage.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of a UAV follow-shooting method according to an embodiment of the present invention;
FIG. 2 is another schematic flowchart of a UAV follow-shooting method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a UAV follow-shooting device according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a detection module according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a generation module according to an embodiment of the present invention;
FIG. 6 is another schematic structural diagram of a generation module according to an embodiment of the present invention;
FIG. 7 is another schematic structural diagram of a UAV follow-shooting device according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a UAV follow-shooting device 800 according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a UAV follow-shooting method. As shown in FIG. 1, the method includes:
101. Capture a video image.
In this embodiment, the video image may be captured by a camera module or camera device integrated on the UAV, or by a camera attached externally to the UAV (for example, an action camera), which then transmits the video image to the UAV; this is not limited in the embodiments of the present invention.
102. Perform target-object detection on the video image.
In this embodiment, a processing chip with computing capability on the UAV may analyze the video image to determine whether the target object is present; alternatively, after the UAV sends the video image to a connected mobile terminal or server, that mobile terminal or server may analyze the video image to determine whether the target object is present. This is not limited in the embodiments of the present invention.
Specifically, feature extraction may be performed on the video image through a deep learning network to acquire the key-point information of the target object, thereby detecting the target object in the video image.
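As an illustration of this step (not part of the patent), the sketch below runs a generic pretrained detector over a single frame and keeps only the pedestrian, animal, and vehicle classes named above. The choice of Faster R-CNN from torchvision and the COCO class subset are assumptions; the patent does not name a specific network.

```python
# Hedged sketch: per-frame target detection with a pretrained deep network.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# COCO class indices standing in for the target types named in the patent.
TARGET_CLASSES = {1: "person", 3: "car", 18: "dog"}  # pedestrian, vehicle, animal

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_target(frame_rgb, score_threshold=0.8):
    """Return (label, box) for the best-scoring target object, or None."""
    with torch.no_grad():
        preds = model([to_tensor(frame_rgb)])[0]  # boxes sorted by score
    for label, box, score in zip(preds["labels"], preds["boxes"], preds["scores"]):
        if score >= score_threshold and int(label) in TARGET_CLASSES:
            return TARGET_CLASSES[int(label)], box.tolist()
    return None
```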
103. When the target object is detected, acquire the motion trajectory of the target object.
Optionally, in each video frame in which the target object is detected, the position information of the target object and the time information corresponding to that frame are acquired, and the motion trajectory of the target object is generated from these positions and times. The position information may be the actual geographic location of the target object; or the position of the target object relative to the UAV (that is, its position in a coordinate system whose origin is the center of the UAV); or the position of the target object in each other frame relative to its position in the first frame in which it was detected (that is, its position in a coordinate system whose origin is the center of the target object in that first frame).
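Under any of the three coordinate conventions above, the trajectory reduces to timestamped position samples. A minimal sketch (class and method names are illustrative, not from the patent):

```python
# Hedged sketch: accumulate (timestamp, x, y) samples from frames in which the
# target was detected; the position convention is the caller's choice.
from dataclasses import dataclass, field

@dataclass
class Trajectory:
    samples: list = field(default_factory=list)  # (t_seconds, x, y) tuples

    def add(self, t, x, y):
        self.samples.append((t, x, y))

    def velocity(self):
        """Average velocity over the recorded samples (assumes >= 2 samples)."""
        (t0, x0, y0), (t1, x1, y1) = self.samples[0], self.samples[-1]
        dt = t1 - t0
        return (x1 - x0) / dt, (y1 - y0) / dt
```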
104. Generate a UAV flight line based on the motion trajectory of the target object.
In this embodiment, this may specifically be: generating, based on the motion trajectory of the target object, a UAV flight line consistent with that trajectory.
In the embodiments of the present invention, a UAV flight line consistent with the motion trajectory of the target object means that the two trajectories are exactly the same, that is, their motion direction, motion speed, and motion distance are all identical.
By generating a UAV flight line consistent with the target object's motion trajectory, the UAV can follow-shoot the target object stably; that is, while the UAV shoots the target object, the target object stays at the same position in the video frame, which improves the stability of the video footage the UAV captures of the target object.
Optionally, upon the first video frame in which the target object is detected, the UAV adjusts the range and angle of its video capture so that the target object sits at the very center of the video image captured by the UAV.
In the embodiments of the present invention, by centering the target object in the captured video image when it is first captured, and by keeping the UAV moving in sync with the target object during subsequent shooting, the target object can remain at the center of the captured video image throughout, further improving the quality of the follow-shot footage.
In the embodiments of the present invention, the motion trajectory of the target object within a preset time window may also be acquired, and the UAV flight line generated from it; for example, the target object's trajectory over the preceding 2 seconds is acquired, and the UAV's current flight line is generated from it, as sketched below.
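A minimal sketch of this windowed route generation, building on the Trajectory sketch above (the fixed follow offset is an assumption; the patent only requires the flight line to match the trajectory's direction, speed, and distance):

```python
# Hedged sketch: replay the target's last `window_s` seconds of motion, shifted
# by an assumed follow offset so the drone trails the target.
def flight_line_from_window(trajectory, now, window_s=2.0, offset=(0.0, -5.0)):
    """Return (t, x, y) waypoints matching the target's recent motion."""
    recent = [(t, x, y) for (t, x, y) in trajectory.samples if now - t <= window_s]
    return [(t, x + offset[0], y + offset[1]) for (t, x, y) in recent]
```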
Compared with the prior art, the embodiments of the present invention capture a video image; perform target-object detection on the video image; when the target object is detected, acquire the motion trajectory of the target object; and generate a UAV flight line based on that trajectory. When shooting during the UAV's flight, the subject can be followed accurately without manual control, and the follow-shooting path is stable, thereby improving the quality of the captured footage.
Another embodiment of the present invention provides a UAV shooting method. As shown in FIG. 2, the method includes:
201. Capture a video image.
In this embodiment, the video image may be captured by a camera module or camera device integrated on the UAV, or by a camera attached externally to the UAV (for example, an action camera), which then transmits the video image to the UAV; this is not limited in the embodiments of the present invention.
202. Perform target-object detection on the video image based on a deep neural network.
The target object includes any one or any combination of pedestrians, animals, and vehicles.
In this embodiment, a processing chip with computing capability on the UAV may analyze the video image to determine whether the target object is present; alternatively, after the UAV sends the video image to a connected mobile terminal or server, that mobile terminal or server may analyze the video image to determine whether the target object is present. This is not limited in the embodiments of the present invention.
203. When the target object is detected, acquire the motion trajectory of the target object.
Optionally, in each video frame in which the target object is detected, the position information of the target object and the time information corresponding to that frame are acquired, and the motion trajectory of the target object is generated from these positions and times. The position information may be the actual geographic location of the target object; or the position of the target object relative to the UAV (that is, its position in a coordinate system whose origin is the center of the UAV); or the position of the target object in each other frame relative to its position in the first frame in which it was detected (that is, its position in a coordinate system whose origin is the center of the target object in that first frame).
204. Acquire a real-time geographic location of the target object and a real-time geographic location of the UAV.
In the embodiments of the present invention, feature extraction may specifically be performed on the video image through a deep neural network to acquire the key-point information of the target object, thereby detecting the target object in the video image.
205. Calculate a relative direction and a relative distance of the target object with respect to the UAV according to the real-time geographic location of the target object and the real-time geographic location of the UAV.
Optionally, steps 204-205 may also be replaced by: acquiring the relative direction and relative distance of the target object with respect to the UAV directly, that is, the target object's position in a coordinate system whose origin is the center of the UAV. A sketch of the geographic variant follows.
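A minimal sketch of step 205, using the haversine formula for distance and the standard initial-bearing formula for direction (both are textbook formulas, assumed here rather than specified by the patent):

```python
# Hedged sketch: relative bearing and distance of the target from the drone,
# computed from two GPS fixes.
import math

EARTH_RADIUS_M = 6_371_000.0

def relative_direction_distance(drone_lat, drone_lon, target_lat, target_lon):
    """Return (bearing_deg from true north, distance_m) of target from drone."""
    phi1, phi2 = math.radians(drone_lat), math.radians(target_lat)
    dphi = math.radians(target_lat - drone_lat)
    dlmb = math.radians(target_lon - drone_lon)
    # Haversine great-circle distance.
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    # Initial bearing from the drone toward the target.
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    bearing = math.degrees(math.atan2(y, x)) % 360.0
    return bearing, distance
```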
206. Based on a deep neural network, predict the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory.
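The patent specifies only "a deep neural network" for this prediction; one plausible realization (the recurrent architecture, sizes, and horizon are assumptions) is a small LSTM that maps recent positions to future ones:

```python
# Hedged sketch of step 206: an LSTM that reads the last k observed (x, y)
# positions and emits the next `horizon` positions.
import torch
import torch.nn as nn

class TrajectoryPredictor(nn.Module):
    def __init__(self, hidden=64, horizon=10):
        super().__init__()
        self.horizon = horizon
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2 * horizon)

    def forward(self, past_xy):            # past_xy: (batch, k, 2)
        _, (h, _) = self.lstm(past_xy)
        return self.head(h[-1]).view(-1, self.horizon, 2)  # future (x, y) steps
```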
207. Generate the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
Alternatively, steps 204-207 may be replaced by: generating, based on the motion trajectory of the target object, a UAV flight line consistent with that trajectory.
In the embodiments of the present invention, a UAV flight line consistent with the motion trajectory of the target object means that the two trajectories are exactly the same, that is, their motion direction, motion speed, and motion distance are all identical.
By generating a UAV flight line consistent with the target object's motion trajectory, the UAV can follow-shoot the target object stably; that is, while the UAV shoots the target object, the target object stays at the same position in the video frame, which improves the stability of the video footage the UAV captures of the target object.
Optionally, upon the first video frame in which the target object is detected, the UAV adjusts the range and angle of its video capture so that the target object sits at the very center of the video image captured by the UAV.
In the embodiments of the present invention, by centering the target object in the captured video image when it is first captured, and by keeping the UAV moving in sync with the target object during subsequent shooting, the target object can remain at the center of the captured video image throughout, further improving the quality of the follow-shot footage.
In the embodiments of the present invention, the motion trajectory of the target object within a preset time window may also be acquired, and the UAV flight line generated from it; for example, the target object's trajectory over the preceding 2 seconds is acquired, and the UAV's current flight line is generated from it.
208. Control the flight direction and flight speed of the UAV according to the UAV flight line, where the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
In this embodiment, by controlling the flight direction through the relative direction and the flight speed through the relative distance, the UAV can follow-shoot the target object accurately.
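One minimal control law consistent with this description (the gains, desired follow distance, and speed cap are illustrative assumptions, not patent values):

```python
# Hedged sketch of step 208: heading tracks the relative bearing; speed grows
# proportionally with the gap between actual and desired follow distance.
def flight_command(bearing_deg, distance_m, desired_distance_m=5.0,
                   k_speed=0.8, max_speed=10.0):
    """Return (heading_deg, speed_m_s) for the flight controller."""
    speed = max(0.0, min(max_speed, k_speed * (distance_m - desired_distance_m)))
    return bearing_deg, speed
```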
Compared with the prior art, the embodiments of the present invention capture a video image; perform target-object detection on the video image; when the target object is detected, acquire the motion trajectory of the target object; and generate a UAV flight line based on that trajectory. When shooting during the UAV's flight, the subject can be followed accurately without manual control, and the follow-shooting path is stable, thereby improving the quality of the captured footage.
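For orientation only, a hypothetical loop wiring the sketches above into the FIG. 2 sequence; the camera, GPS, and command interfaces are assumed placeholders, not part of the patent:

```python
# Hedged end-to-end sketch of steps 201-208 using the helpers defined earlier.
def follow_shoot_loop(camera, gps_drone, gps_target, send_command):
    trajectory = Trajectory()
    while True:
        frame, t = camera.read()                                   # 201: capture
        if detect_target(frame) is None:                           # 202: detect
            continue
        lat_t, lon_t = gps_target()                                # 204: locations
        lat_d, lon_d = gps_drone()
        trajectory.add(t, lat_t, lon_t)                            # 203: trajectory
        bearing, dist = relative_direction_distance(lat_d, lon_d,
                                                    lat_t, lon_t)  # 205: geometry
        # Steps 206-207 (prediction and route generation) would refine this.
        send_command(*flight_command(bearing, dist))               # 208: control
```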
Yet another embodiment of the present invention provides a UAV follow-shooting device. As shown in FIG. 3, the device includes:
a shooting module 31, configured to capture a video image;
a detection module 32, configured to perform target-object detection on the video image;
an acquisition module 33, configured to acquire a motion trajectory of the target object when the target object is detected;
a generation module 34, configured to generate a UAV flight line based on the motion trajectory of the target object.
Further, as shown in FIG. 4, the detection module 32 includes:
a detection submodule 321, configured to perform target-object detection on the video image based on a deep neural network, where the target object includes any one or any combination of pedestrians, animals, and vehicles.
Further, as shown in FIG. 5, the generation module 34 includes:
a first generation submodule 341, configured to generate, based on the motion trajectory of the target object, a UAV flight line consistent with that trajectory.
Further, as shown in FIG. 6, the generation module 34 includes:
an acquisition submodule 342, configured to acquire a real-time geographic location of the target object and a real-time geographic location of the UAV;
a calculation submodule 343, configured to calculate a relative direction and a relative distance of the target object with respect to the UAV according to the real-time geographic location of the target object and the real-time geographic location of the UAV;
a prediction submodule 344, configured to predict, based on a deep neural network, the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory;
a second generation submodule 345, configured to generate the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
Further, as shown in FIG. 7, the device further includes:
a control module 71, configured to control a flight direction and a flight speed of the UAV according to the UAV flight line, where the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
Compared with the prior art, the embodiments of the present invention capture a video image; perform target-object detection on the video image; when the target object is detected, acquire the motion trajectory of the target object; and generate a UAV flight line based on that trajectory. When shooting during the UAV's flight, the subject can be followed accurately without manual control, and the follow-shooting path is stable, thereby improving the quality of the captured footage.
The UAV follow-shooting device provided by the embodiments of the present invention can implement the method embodiments described above; for the specific function implementations, refer to the descriptions in the method embodiments, which are not repeated here. The UAV follow-shooting method and device provided by the embodiments of the present invention are applicable to controlling a UAV in flight, but are not limited thereto.
As shown in FIG. 8, the UAV follow-shooting device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a personal digital assistant, or the like.
Referring to FIG. 8, the UAV follow-shooting device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 818.
The processing component 802 generally controls the overall operation of the UAV follow-shooting device 800, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 802 may include one or more processors 820 to execute instructions.
In addition, the processing component 802 may include one or more modules to facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation of the UAV follow-shooting device 800. Examples of such data include instructions for any application or method operating on the UAV follow-shooting device 800, contact data, phone-book data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
The power component 806 provides power to the various components of the UAV follow-shooting device 800. The power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the UAV follow-shooting device 800.
The multimedia component 808 includes a screen that provides an output interface between the UAV follow-shooting device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the UAV follow-shooting device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC), which is configured to receive external audio signals when the UAV follow-shooting device 800 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 804 or sent via the communication component 818. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing state assessments of various aspects of the UAV follow-shooting device 800. For example, the sensor component 814 may detect the on/off state of the UAV follow-shooting device 800 and the relative positioning of components (for example, the display and keypad of the UAV follow-shooting device 800); the sensor component 814 may also detect a change in position of the UAV follow-shooting device 800 or one of its components, the presence or absence of user contact with the UAV follow-shooting device 800, the orientation or acceleration/deceleration of the UAV follow-shooting device 800, and temperature changes of the UAV follow-shooting device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 818 is configured to facilitate wired or wireless communication between the UAV follow-shooting device 800 and other devices. The UAV follow-shooting device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 818 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 818 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the UAV follow-shooting device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the others. In particular, the device embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the descriptions in the method embodiments.
A person of ordinary skill in the art may understand that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
The above are merely specific implementations of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

  1. A UAV follow-shooting method, comprising:
    capturing a video image;
    performing target-object detection on the video image;
    when the target object is detected, acquiring a motion trajectory of the target object;
    generating a UAV flight line based on the motion trajectory of the target object.
  2. The UAV follow-shooting method according to claim 1, wherein performing target-object detection on the video image comprises:
    performing target-object detection on the video image based on a deep neural network, wherein the target object comprises any one or any combination of pedestrians, animals, and vehicles.
  3. The UAV follow-shooting method according to claim 1, wherein generating the UAV flight line based on the motion trajectory of the target object comprises:
    generating, based on the motion trajectory of the target object, a UAV flight line consistent with that trajectory.
  4. The UAV follow-shooting method according to claim 1, wherein generating the UAV flight line based on the motion trajectory of the target object comprises:
    acquiring a real-time geographic location of the target object and a real-time geographic location of the UAV;
    calculating a relative direction and a relative distance of the target object with respect to the UAV according to the real-time geographic location of the target object and the real-time geographic location of the UAV;
    predicting, based on a deep neural network, the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory;
    generating the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
  5. The UAV follow-shooting method according to claim 4, wherein after generating the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance, the method further comprises:
    controlling a flight direction and a flight speed of the UAV according to the UAV flight line, wherein the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
  6. A UAV follow-shooting device, comprising:
    a shooting module, configured to capture a video image;
    a detection module, configured to perform target-object detection on the video image;
    an acquisition module, configured to acquire a motion trajectory of the target object when the target object is detected;
    a generation module, configured to generate a UAV flight line based on the motion trajectory of the target object.
  7. The UAV follow-shooting device according to claim 6, wherein the detection module comprises:
    a detection submodule, configured to perform target-object detection on the video image based on a deep neural network, wherein the target object comprises any one or any combination of pedestrians, animals, and vehicles.
  8. The UAV follow-shooting device according to claim 6, wherein the generation module comprises:
    a first generation submodule, configured to generate, based on the motion trajectory of the target object, a UAV flight line consistent with that trajectory.
  9. The UAV follow-shooting device according to claim 6, wherein the generation module comprises:
    an acquisition submodule, configured to acquire a real-time geographic location of the target object and a real-time geographic location of the UAV;
    a calculation submodule, configured to calculate a relative direction and a relative distance of the target object with respect to the UAV according to the real-time geographic location of the target object and the real-time geographic location of the UAV;
    a prediction submodule, configured to predict, based on a deep neural network, the motion trajectory of the target object from the video image, to obtain a predicted target-object trajectory;
    a second generation submodule, configured to generate the UAV flight line according to the predicted target-object trajectory, the relative direction, and the relative distance.
  10. The UAV follow-shooting device according to claim 9, wherein the device further comprises:
    a control module, configured to control a flight direction and a flight speed of the UAV according to the UAV flight line, wherein the flight direction is determined at least from the relative direction, and the flight speed is determined at least from the relative distance.
PCT/CN2017/092296 2017-07-06 2017-07-07 UAV follow-shooting method and device WO2019006769A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710544730.1A CN107172360A (zh) 2017-07-06 2017-07-06 UAV follow-shooting method and device
CN201710544730.1 2017-07-06

Publications (1)

Publication Number Publication Date
WO2019006769A1 true WO2019006769A1 (zh) 2019-01-10

Family

ID=59822886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/092296 WO2019006769A1 (zh) 2017-07-06 2017-07-07 UAV follow-shooting method and device

Country Status (2)

Country Link
CN (1) CN107172360A (zh)
WO (1) WO2019006769A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11443518B2 (en) 2020-11-30 2022-09-13 At&T Intellectual Property I, L.P. Uncrewed aerial vehicle shared environment privacy and security
CN115953704A (zh) * 2023-01-18 2023-04-11 北京理工大学 A drone detection method
US11726475B2 (en) 2020-11-30 2023-08-15 At&T Intellectual Property I, L.P. Autonomous aerial vehicle airspace claiming and announcing
CN116612493A (zh) * 2023-04-28 2023-08-18 深圳先进技术研究院 A pedestrian geographic trajectory extraction method and device
US11797896B2 (en) 2020-11-30 2023-10-24 At&T Intellectual Property I, L.P. Autonomous aerial vehicle assisted viewing location selection for event venue

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108032994A (zh) * 2017-12-06 2018-05-15 余姚市荣事特电子有限公司 An unmanned helicopter
CN107985569A (zh) * 2017-12-06 2018-05-04 余姚市荣事特电子有限公司 A drone
CN108052914A (zh) * 2017-12-21 2018-05-18 中国科学院遥感与数字地球研究所 A forest tree resource survey method based on SLAM and image recognition
WO2019127395A1 (zh) * 2017-12-29 2019-07-04 深圳市大疆创新科技有限公司 Drone photographing method, image processing method, and device
CN109949616B (zh) * 2019-03-25 2021-05-11 同济大学 An active bridge anti-ship-collision monitoring and early-warning system
CN113836646A (zh) * 2020-06-04 2021-12-24 北京国电思达科技有限公司 An intelligent wind turbine blade defect analysis method and system based on deep learning
CN112485309A (zh) * 2020-11-19 2021-03-12 深圳市粤通建设工程有限公司 Drone-based asphalt mixture construction compaction detection method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2538298A1 (en) * 2011-06-22 2012-12-26 Sensefly Sàrl Method for acquiring images from arbitrary perspectives with UAVs equipped with fixed imagers
CN105120146A (zh) * 2015-08-05 2015-12-02 普宙飞行器科技(深圳)有限公司 A device and method for automatically locking onto and shooting a moving object using a drone
CN105487552A (zh) * 2016-01-07 2016-04-13 深圳一电航空技术有限公司 Method and device for drone tracking shooting
CN106022239A (zh) * 2016-05-13 2016-10-12 电子科技大学 A multi-target tracking method based on recurrent neural networks
CN106483980A (zh) * 2016-11-24 2017-03-08 腾讯科技(深圳)有限公司 A control method, device, and system for drone follow flight

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103149939B (zh) * 2013-02-26 2015-10-21 北京航空航天大学 A vision-based dynamic target tracking and positioning method for drones
CN105759839B (zh) * 2016-03-01 2018-02-16 深圳市大疆创新科技有限公司 Drone visual tracking method and device, and drone
CN105929850B (zh) * 2016-05-18 2018-10-19 中国计量大学 A drone system and method with continuous target locking and tracking capability
KR101769601B1 (ko) * 2016-07-13 2017-08-18 아이디어주식회사 Unmanned aerial vehicle with automatic tracking function

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2538298A1 (en) * 2011-06-22 2012-12-26 Sensefly Sàrl Method for acquiring images from arbitrary perspectives with UAVs equipped with fixed imagers
CN105120146A (zh) * 2015-08-05 2015-12-02 普宙飞行器科技(深圳)有限公司 A device and method for automatically locking onto and shooting a moving object using a drone
CN105487552A (zh) * 2016-01-07 2016-04-13 深圳一电航空技术有限公司 Method and device for drone tracking shooting
CN106022239A (zh) * 2016-05-13 2016-10-12 电子科技大学 A multi-target tracking method based on recurrent neural networks
CN106483980A (zh) * 2016-11-24 2017-03-08 腾讯科技(深圳)有限公司 A control method, device, and system for drone follow flight

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11443518B2 (en) 2020-11-30 2022-09-13 At&T Intellectual Property I, L.P. Uncrewed aerial vehicle shared environment privacy and security
US11726475B2 (en) 2020-11-30 2023-08-15 At&T Intellectual Property I, L.P. Autonomous aerial vehicle airspace claiming and announcing
US11797896B2 (en) 2020-11-30 2023-10-24 At&T Intellectual Property I, L.P. Autonomous aerial vehicle assisted viewing location selection for event venue
CN115953704A (zh) * 2023-01-18 2023-04-11 北京理工大学 A drone detection method
CN115953704B (zh) * 2023-01-18 2023-10-03 北京理工大学 A drone detection method
CN116612493A (zh) * 2023-04-28 2023-08-18 深圳先进技术研究院 A pedestrian geographic trajectory extraction method and device

Also Published As

Publication number Publication date
CN107172360A (zh) 2017-09-15

Similar Documents

Publication Publication Date Title
WO2019006769A1 (zh) UAV follow-shooting method and device
KR101712301B1 (ko) Method and device for capturing a screen
US9674395B2 (en) Methods and apparatuses for generating photograph
JP6388706B2 (ja) Imaging control method and imaging control device for an unmanned aerial vehicle, and electronic device
CN104065878B (zh) Shooting control method, device, and terminal
US11361586B2 (en) Method for sending warning information, storage medium and terminal
WO2017020408A1 (zh) Video recording method and device
US20160027191A1 (en) Method and device for adjusting skin color
WO2018107785A1 (zh) 3D printing data generation method and device
CN105120144A (zh) Image shooting method and device
WO2016101481A1 (zh) Autofocus method and device
EP3352453B1 (en) Photographing method for intelligent flight device and intelligent flight device
RU2664674C2 Method and device for creating a panorama
US11310443B2 (en) Video processing method, apparatus and storage medium
WO2019006770A1 (zh) Drone charging method and device
US10191708B2 (en) Method, apparatus and computer-readable medium for displaying image data
CN106210495A (zh) Image shooting method and device
CN108398127A (zh) An indoor positioning method and device
CN105959563B (zh) Image storage method and image storage device
WO2019006771A1 (zh) Method and device for drone resistance to external force interference
EP3211879A1 (en) Method and device for automatically capturing photograph, electronic device
CN108986803B (zh) Scene control method and device, electronic device, and readable storage medium
CN114549578A (zh) Target tracking method, device, and storage medium
CN109255839B (zh) Scene adjustment method and device
CN107948876B (zh) Method, device, and medium for controlling speaker equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17917212

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17917212

Country of ref document: EP

Kind code of ref document: A1