WO2018112831A1 - Unmanned aerial vehicle and control method therefor - Google Patents

Unmanned aerial vehicle and control method therefor

Info

Publication number
WO2018112831A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
height
target
image
preset reference
Prior art date
Application number
PCT/CN2016/111490
Other languages
English (en)
French (fr)
Inventor
朱成伟
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201680004731.0A priority Critical patent/CN107108023B/zh
Priority to PCT/CN2016/111490 priority patent/WO2018112831A1/zh
Priority to CN201910840461.2A priority patent/CN110525650B/zh
Publication of WO2018112831A1 publication Critical patent/WO2018112831A1/zh
Priority to US16/445,796 priority patent/US20190310658A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04 Control of altitude or depth
    • G05D1/06 Rate of change of altitude or depth
    • G05D1/0607 Rate of change of altitude or depth specially adapted for aircraft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/14 Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00 Propulsion; Power supply
    • B64U50/10 Propulsion
    • B64U50/19 Propulsion using electrically powered motors
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04 Control of altitude or depth
    • G05D1/042 Control of altitude or depth specially adapted for aircraft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/20 Remote controls
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00 Propulsion; Power supply
    • B64U50/20 Transmission of mechanical power to rotors or propellers
    • B64U50/23 Transmission of mechanical power to rotors or propellers with each propulsion means having an individual motor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U60/00 Undercarriages
    • B64U60/50 Undercarriages with landing legs

Definitions

  • the invention relates to a drone, and in particular to a drone with an autonomous flight function.
  • the technical problem mainly addressed by the present invention is to provide a drone with an autonomous flight function, which can fly autonomously without being controlled by the user.
  • An aspect of the present invention provides a drone control method, the method comprising: receiving a position of a target in an image, acquiring a first height of the drone relative to the ground, and controlling the drone to fly according to the position of the target in the image and the first height.
  • Another aspect of the present invention provides a drone, the drone including a sensor for acquiring a first height of the drone relative to the ground, and a processor for receiving a position of the target in the image and controlling the drone to fly according to the position of the target in the image and the first height.
  • the drone further includes a memory for storing a preset reference height.
  • the processor is further configured to acquire the preset reference height, and to control the drone to fly according to the preset reference height, the first height, and the position of the target in the image.
  • the processor is further configured to calculate the position on the ground corresponding to the target according to the position of the target in the image, compare the first height with the preset reference height, control the drone to fly to the preset reference height, and control the drone to fly along the preset reference height.
  • the processor is further configured to calculate the position on the ground corresponding to the target according to the position of the target in the image, compare the first height with the preset reference height, and control the drone to fly at the first height to a position above the position on the ground corresponding to the target and hover.
  • the sensor comprises at least one of an ultrasonic sensor, a TOF sensor, a barometer, an infrared sensor, a microwave sensor, and a proximity sensor.
  • Figure 1 is a schematic structural view of a drone according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural view of a bottom of a drone provided by an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a module of a drone provided by an embodiment of the present invention.
  • FIG. 4 is a flow chart of a drone control method provided by an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a drone calculating a target position according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of Embodiment 1 of a flight path of a drone provided by the present invention.
  • FIG. 7 is a schematic diagram of Embodiment 2 of a flight path of a drone provided by the present invention.
  • FIG. 1 is a schematic structural diagram of a drone according to an embodiment of the present invention.
  • the drone 100 can include a fuselage 110 that includes a central portion 111 and at least one outer portion 112.
  • the fuselage 110 includes four outer portions 112 (such as the arm 113).
  • the four outer portions 112 extend from the central portion 111, respectively.
  • the body 110 can include any number of external portions 112 (eg, 6, 8, etc.).
  • each of the outer portions 112 can carry a propulsion system 120 that can drive the drone 100 to move (e.g., climb, land, move horizontally, etc.).
  • the arm 113 can carry a corresponding motor 121, and the motor 121 can drive the corresponding propeller to rotate.
  • the drone 100 can control any set of motors 121 and their corresponding propellers 122 without being affected by the remaining motors 121 and their corresponding propellers.
  • the body 110 can carry a load 130, such as an imaging device 131.
  • the imaging device 131 can include a camera that can, for example, capture images, video, and the like around the drone.
  • the camera is sensitive to light of various wavelengths including, but not limited to, visible light, ultraviolet light, infrared light, or any combination thereof.
  • the load 130 can include other kinds of sensors.
  • the load 130 is coupled to the fuselage 110 by a gimbal 150 such that the load 130 can move relative to the fuselage 110.
  • the imaging device 131 can move relative to the body 110 to capture images, videos, and the like around the drone 100.
  • the landing gear 114 can support the drone 100 to protect the load 130 when the drone 100 is on the ground.
  • the drone 100 can include a control system 140 that includes components disposed in the drone 100 and components that are separate from the drone 100.
  • the control system 140 can include a first controller 141 disposed on the drone 100, and a second controller 142 that is remote from the drone 100 and connected to the first controller 141 via a communication link 160 (e.g., a wireless link).
  • the first controller 141 can include at least one processor, a memory, and an onboard computer readable medium 143a, the onboard computer readable medium 143a can store program instructions for controlling the behavior of the drone 100,
  • the behavior includes, but is not limited to, the operation of the propulsion system 120 and the imaging device 131, controlling the drone to perform automatic landing, and the like.
  • the computer readable medium 143a can also be used to store state information of the drone 100, such as altitude, speed, location, preset reference height, and the like.
  • the second controller 142 can include at least one processor, memory, off-board computer readable medium 143b, and at least one input and output device 148, such as display device 144 and control device 145.
  • An operator of the drone 100 can remotely control the drone 100 through the control device 145 and receive feedback information from the drone 100 via the display device 144 and/or other devices.
  • the drone 100 can operate autonomously, at which time the second controller 142 can be omitted, or the second controller 142 can be used only to allow the drone operator to override functions used for drone flight.
  • the onboard computer readable medium 143a can be removed from the drone 100.
  • the off-board computer readable medium 143b can be removed from the second controller 142.
  • the drone 100 can include two forward-looking cameras 171 and 172 that are sensitive to light of various wavelengths (e.g., visible light, infrared light, ultraviolet light) and are used to capture images or video around the drone. In some embodiments, the drone 100 includes at least one sensor placed at the bottom.
  • the drone 100 can include two lower looking cameras 173 and 174 placed at the bottom of the fuselage 110.
  • the drone 100 further includes two ultrasonic sensors 177 and 178 placed at the bottom of the body 110.
  • the ultrasonic sensors 177 and 178 can detect and/or monitor objects and the ground beneath the drone 100, and measure the distance to the object or the ground by transmitting and receiving ultrasonic waves.
  • the drone 100 may include an inertial measurement unit (IMU), an infrared sensor, a microwave sensor, a temperature sensor, a proximity sensor, a three-dimensional laser range finder, a 3D TOF sensor, etc.
  • the three-dimensional laser range finder and the 3D TOF sensor can detect the distance to an object or a surface below the drone.
  • the drone 100 can receive input information from the input and output device 148, such as a user transmitting a target to the drone 100 through the input and output device 148.
  • the drone 100 can identify a corresponding position of the target on the ground according to the target, and the first controller can control the drone 100 to fly above the corresponding position and hover.
  • the drone 100 can receive input information from the input and output device 148, such as a user transmitting a target to the drone 100 through the input and output device 148.
  • the drone 100 can identify a corresponding position of the target on the ground according to the target.
  • the first controller may control the drone 100 to fly to a preset reference altitude and fly along the preset reference altitude.
  • FIG. 3 is a schematic diagram of a module of a drone according to an embodiment of the present invention.
  • the drone 100 may include at least one processor 301, a sensor module 302, a storage module 303, and an input and output module 304.
  • the control module 301 can include at least one processor, including but not limited to a microcontroller, a reduced instruction set computer (RISC), an application specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physics processing unit (PPU), a digital signal processor (DSP), and a field programmable gate array (FPGA).
  • the sensor module 302 can include at least one sensor including, but not limited to, a temperature sensor, an inertial measurement unit, an accelerometer, an image sensor (such as a camera), an ultrasonic sensor, a TOF sensor, a microwave sensor, a proximity sensor, a three-dimensional laser range finder, an infrared sensor, etc.
  • the inertial measurement unit can be used to measure attitude information (eg, pitch angle, roll angle, yaw angle, etc.) of the drone.
  • the inertial measurement unit may include, but is not limited to, at least one accelerometer, gyroscope, magnetometer, or any combination thereof.
  • the accelerometer can be used to measure the acceleration of the drone to calculate the speed of the drone.
  • the storage module 303 can include, but is not limited to, a read only memory (ROM), a random access memory (RAM), a programmable read only memory (PROM), an electrically erasable programmable read only memory (EEPROM), and the like.
  • the storage module 303 can include a non-transitory computer readable medium that can store code, logic or instructions for performing at least one of the steps described elsewhere herein.
  • the control module 301 can perform at least one step, individually or collectively, in accordance with code, logic or instructions of the non-transitory computer readable medium described herein.
  • the storage module can be used to store state information of the drone 100, such as height, speed, position, preset reference height, and the like.
  • the input/output module 304 is configured to exchange information or instructions with an external device, such as receiving an instruction sent by the input/output device 148 (see FIG. 1), or transmitting an image captured by the imaging device 131 (see FIG. 1) to the input and output device 148.
  • FIG. 4 is a flow chart of a drone control method provided by the present invention.
  • Step 401 receiving a position of the target in the image.
  • the user may select a flight mode via the input and output device 148, such as by clicking on the screen 550 to select a flight mode.
  • the flight modes include, but are not limited to, pointing flight, smart following, autonomous return, and the like.
  • the user can determine a target by clicking on any point on the screen 550.
  • the input and output device 148 can transmit location information of the target to the drone 500.
  • the location information of the target is used to control the flight of the drone.
  • the user can select a target A on the screen 550.
  • the input and output device 148 can calculate the coordinates (x_screen, y_screen) of A on the screen 550, and can in turn convert the coordinates on the screen 550 into the coordinates (x_rawimage, y_rawimage) in the camera source image; the input/output device 148 may also normalize the coordinates (x_rawimage, y_rawimage) in the camera source image to (x_percentage, y_percentage) according to the formula given in the description.
  • the coordinates (x_percentage, y_percentage) can be sent to the drone and are then used to calculate the spatial flight direction of the drone.
  • Step 402 Acquire a first height of the drone relative to the ground.
  • the drone 500 can acquire a first height H relative to the ground.
  • the drone can acquire the first height via at least one sensor onboard.
  • the first height may be the current ground height of the drone.
  • the at least one sensor may include, but is not limited to, an ultrasonic sensor, a TOF sensor (such as a 3D TOF sensor), an infrared sensor, a microwave sensor, a proximity sensor, a three-dimensional laser range finder, a barometer, a GPS module, and the like.
  • the first height H can be used to control the flight of the drone. In some embodiments, when the first height H is less than a preset reference height h and A is on the ground, the drone 500 can fly horizontally at the first height H and hover directly above A' (trajectory 530). In other embodiments, when the first height H is greater than the preset reference height h and A is on the ground, the drone can first fly to the preset reference height h and then fly along the preset reference height h (trajectory 540).
  • in other embodiments, when the first height H is less than the preset reference height h and A is on the ground, the drone 500 can fly at any altitude and hover directly above A'.
  • Step 403 Control the drone to fly according to the position of the target in the image and the first height.
  • the processor can calculate the coordinates of A' based on the position of the target in the image.
  • A' is the point corresponding to A in the world coordinate system; (x_i, y_i) are the coordinates of A in the camera coordinate system, and f is the focal length. From these, a direction vector pointing from the camera toward A' can be derived.
  • the processor can convert the direction vector in the camera coordinate system into the corresponding direction vector in the gimbal coordinate system using the rotation matrix from the camera coordinate system to the gimbal coordinate system.
  • the processor can convert the direction vector in the gimbal coordinate system into the corresponding direction vector in the world coordinate system using the rotation matrix from the gimbal coordinate system to the world coordinate system.
  • the processor can also calculate the rotation matrix from the camera coordinate system to the world coordinate system, where (α, β, γ) represents the attitude angles of the gimbal (such as pitch angle, roll angle, and yaw angle).
  • the processor can, based on the direction vector in the world coordinate system and the first height H, calculate the direction vector of the target relative to the ground.
  • the processor can, based on that vector and the current position of the drone (pos_x, pos_y, pos_z), calculate the direction vector of the target relative to the takeoff point of the drone.
  • the processor can, based on the resulting vector, control the drone to fly above A' and hover when the first height H is smaller than the preset reference height h.
  • the processor can calculate the coordinates of A' from the resulting vector and, when the first height H is greater than the preset reference height h, control the drone to fly to the preset reference height and fly along the preset reference height.
  • if the drone 500 detects that the target A points toward the sky, the drone will fly toward the position indicated by the target A.
  • the user can modify the preset reference height. If the user controls the drone indoors, the preset reference height can be modified to be less than or equal to the indoor height. When the user controls the drone outdoors, the preset reference height can be adjusted to a relatively large value.
  • the user can drag the target as needed or reset the target. After the new target is determined, the drone re-executes the steps of the flowchart in FIG. 4.
  • the user may select at least two targets, and the drone may automatically determine whether the flight path composed of the at least two targets is feasible. If feasible, the drone will fly in accordance with the calculated flight path. If not feasible, the drone can return a failure prompt to the user, for example, alert information (such as path planning failure, etc.) can be displayed on the input and output device 148.
  • the drone control method of the present invention can control the drone to fly to the position on the ground directly above the target, according to the position of the input target in the image and the first height, and hover there, thereby realizing autonomous flight of the drone, namely autonomous hovering, and allowing the flight of the drone to be controlled precisely. A minimal sketch of this overall flow is given after this list.
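The following is an illustrative sketch, not part of the patent disclosure, of the overall flow of FIG. 4; the `drone` object and its methods (`read_height_above_ground`, `project_target_to_ground`, and so on) are hypothetical placeholders for the sensor and geometry steps described above.

```python
# Illustrative sketch only: placeholder interface, not the patented implementation.

def control_flight(target_in_image, drone):
    # Step 401: receive the position of the target in the image (normalized).
    x_pct, y_pct = target_in_image

    # Step 402: acquire the first height of the drone relative to the ground.
    first_height = drone.read_height_above_ground()  # e.g. ultrasonic / TOF / barometer

    # Step 403: control the drone according to the target position and the height.
    ground_point = drone.project_target_to_ground(x_pct, y_pct, first_height)
    if first_height < drone.preset_reference_height:
        # Stay at the current height, stop above the ground point and hover.
        drone.fly_to_and_hover(ground_point, altitude=first_height)
    else:
        # Move to the preset reference height first, then fly along it.
        drone.climb_to(drone.preset_reference_height)
        drone.fly_along_altitude(ground_point, drone.preset_reference_height)
```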

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A drone control method, comprising: receiving a position of a target in an image (401); acquiring a first height of the drone relative to the ground (402); and controlling the drone to fly according to the position of the target in the image and the first height (403).

Description

UNMANNED AERIAL VEHICLE AND CONTROL METHOD THEREFOR
Technical Field
The present invention relates to a drone, and in particular to a drone with an autonomous flight function.
Background Art
Traditional drone operation requires control through a remote controller, which is a manual control method. To achieve autonomous flight of a drone, free of remote-controller control, a technique is needed that converts a task or a target into control instructions so as to guide or control the drone to reach a designated area or to keep flying.
Summary of the Invention
The technical problem mainly addressed by the present invention is to provide a drone with an autonomous flight function, which can fly autonomously without being controlled by the user.
One aspect of the present invention provides a drone control method, the method comprising: receiving a position of a target in an image; acquiring a first height of the drone relative to the ground; and controlling the drone to fly according to the position of the target in the image and the first height.
Another aspect of the present invention provides a drone, the drone comprising: a sensor configured to acquire a first height of the drone relative to the ground; and a processor configured to receive a position of a target in an image and to control the drone to fly according to the position of the target in the image and the first height.
In some embodiments, the drone further comprises a memory configured to store a preset reference height.
In some embodiments, the processor is further configured to acquire the preset reference height, and to control the drone to fly according to the preset reference height, the first height, and the position of the target in the image.
In some embodiments, the processor is further configured to calculate, according to the position of the target in the image, the position on the ground corresponding to the target, compare the first height with the preset reference height, control the drone to fly to the preset reference height, and control the drone to fly along the preset reference height.
In some embodiments, the processor is further configured to calculate, according to the position of the target in the image, the position on the ground corresponding to the target, compare the first height with the preset reference height, and control the drone to fly at the first height to a position above the position on the ground corresponding to the target and hover.
In some embodiments, the sensor comprises at least one of an ultrasonic sensor, a TOF sensor, a barometer, an infrared sensor, a microwave sensor, and a proximity sensor.
Brief Description of the Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure or in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present disclosure, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of a drone provided by an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of the bottom of a drone provided by an embodiment of the present invention;
FIG. 3 is a schematic block diagram of a drone provided by an embodiment of the present invention;
FIG. 4 is a flowchart of a drone control method provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of a drone calculating a target position according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of Embodiment 1 of a flight path of a drone provided by the present invention;
FIG. 7 is a schematic diagram of Embodiment 2 of a flight path of a drone provided by the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The terms "first", "second", and the like in the specification, the claims, and the above drawings of the present invention are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that terms used in this way are interchangeable where appropriate; this is merely the manner in which objects with the same attributes are distinguished when describing the embodiments of the present invention. In addition, the terms "include" and "have", and any variations thereof, are intended to cover non-exclusive inclusion, so that a process, method, system, product, or device that includes a series of units is not necessarily limited to those units, but may include other units that are not explicitly listed or that are inherent to such a process, method, product, or device.
Some embodiments of the present invention are described in detail below with reference to the drawings. The embodiments described below and features of those embodiments may be combined with one another when no conflict arises.
The present invention is described in detail below with reference to the drawings and embodiments.
Referring to FIG. 1, FIG. 1 is a schematic structural diagram of a drone provided by an embodiment of the present invention. The drone 100 may include a fuselage 110, and the fuselage 110 includes a central portion 111 and at least one outer portion 112. In the embodiment shown in FIG. 1, the fuselage 110 includes four outer portions 112 (such as arms 113). The four outer portions 112 extend from the central portion 111, respectively. In other embodiments, the fuselage 110 may include any number of outer portions 112 (e.g., 6, 8, etc.). In any of the above embodiments, each outer portion 112 may carry a propulsion system 120, and the propulsion system 120 may drive the drone 100 to move (e.g., climb, land, move horizontally, etc.). For example, an arm 113 may carry a corresponding motor 121, and the motor 121 may drive a corresponding propeller to rotate. The drone 100 may control any group of motors 121 and their corresponding propellers 122 without being affected by the remaining motors 121 and their corresponding propellers.
The fuselage 110 may carry a load 130, for example an imaging device 131. In some embodiments, the imaging device 131 may include a camera that can, for example, capture images, video, and the like around the drone. The camera is sensitive to light of various wavelengths, including but not limited to visible light, ultraviolet light, infrared light, or any combination thereof. In some embodiments, the load 130 may include other kinds of sensors. In some embodiments, the load 130 is coupled to the fuselage 110 through a gimbal 150, so that the load 130 can move relative to the fuselage 110. For example, when the load 130 carries the imaging device 131, the imaging device 131 can move relative to the fuselage 110 to capture images, video, and the like around the drone 100. As shown in the figure, when the drone 100 is on the ground, a landing gear 114 can support the drone 100 to protect the load 130.
In some embodiments, the drone 100 may include a control system 140, and the control system 140 includes components disposed on the drone 100 and components separate from the drone 100. For example, the control system 140 may include a first controller 141 disposed on the drone 100, and a second controller 142 that is remote from the drone 100 and connected to the first controller 141 through a communication link 160 (e.g., a wireless link). The first controller 141 may include at least one processor, a memory, and an onboard computer readable medium 143a. The onboard computer readable medium 143a may store program instructions for controlling the behavior of the drone 100, the behavior including but not limited to operation of the propulsion system 120 and the imaging device 131, controlling the drone to perform automatic landing, and the like. The computer readable medium 143a may also be used to store state information of the drone 100, such as altitude, speed, position, a preset reference height, and the like. The second controller 142 may include at least one processor, a memory, an off-board computer readable medium 143b, and at least one input and output device 148, for example a display device 144 and a control device 145. An operator of the drone 100 may remotely control the drone 100 through the control device 145 and receive feedback information from the drone 100 through the display device 144 and/or other devices. In other embodiments, the drone 100 may operate autonomously, in which case the second controller 142 may be omitted, or the second controller 142 may be used only to allow the drone operator to override functions used for drone flight. The onboard computer readable medium 143a may be removable from the drone 100. The off-board computer readable medium 143b may be removable from the second controller 142.
In some embodiments, the drone 100 may include two forward-looking cameras 171 and 172. The forward-looking cameras 171 and 172 are sensitive to light of various wavelengths (e.g., visible light, infrared light, ultraviolet light) and are used to capture images or video around the drone. In some embodiments, the drone 100 includes at least one sensor placed at the bottom.
FIG. 2 is a schematic structural diagram of the bottom of a drone provided by an embodiment of the present invention. The drone 100 may include two downward-looking cameras 173 and 174 placed at the bottom of the fuselage 110. In addition, the drone 100 further includes two ultrasonic sensors 177 and 178 placed at the bottom of the fuselage 110. The ultrasonic sensors 177 and 178 can detect and/or monitor objects and the ground beneath the drone 100, and measure the distance to the object or the ground by transmitting and receiving ultrasonic waves.
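As an illustration of the ranging principle just described (not part of the patent disclosure), a minimal sketch of converting an ultrasonic round-trip echo time into a height above ground, assuming a nominal speed of sound in air:

```python
SPEED_OF_SOUND_M_S = 343.0  # assumed speed of sound in air at roughly 20 degrees C

def ultrasonic_height(echo_round_trip_s: float) -> float:
    """Estimate height above ground from an ultrasonic echo.

    The pulse travels down to the ground and back, so the one-way
    distance is half of the round-trip path length.
    """
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0

# Example: a 12 ms round trip corresponds to roughly 2.06 m above ground.
print(ultrasonic_height(0.012))
```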
In other embodiments, the drone 100 may include an inertial measurement unit (IMU), an infrared sensor, a microwave sensor, a temperature sensor, a proximity sensor, a three-dimensional laser range finder, a 3D TOF sensor, and the like. The three-dimensional laser range finder and the 3D TOF sensor can detect the distance to an object or a surface below the drone.
In some embodiments, the drone 100 may receive input information from the input and output device 148; for example, a user sends a target to the drone 100 through the input and output device 148. The drone 100 can identify, according to the target, the position on the ground corresponding to the target, and the first controller can control the drone 100 to fly above the corresponding position and hover.
In some examples, the drone 100 may receive input information from the input and output device 148; for example, a user sends a target to the drone 100 through the input and output device 148. The drone 100 can identify, according to the target, the position on the ground corresponding to the target. The first controller can control the drone 100 to fly to a preset reference height and fly along the preset reference height.
FIG. 3 is a schematic block diagram of a drone provided by an embodiment of the present invention. Referring to FIG. 3, the drone 100 may include at least one processor 301, a sensor module 302, a storage module 303, and an input and output module 304.
The control module 301 may include at least one processor, including but not limited to a microcontroller, a reduced instruction set computer (RISC), an application specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), and the like.
The sensor module 302 may include at least one sensor, including but not limited to a temperature sensor, an inertial measurement unit, an accelerometer, an image sensor (such as a camera), an ultrasonic sensor, a TOF sensor, a microwave sensor, a proximity sensor, a three-dimensional laser range finder, an infrared sensor, and the like.
In some embodiments, the inertial measurement unit may be used to measure attitude information of the drone (e.g., pitch angle, roll angle, yaw angle, etc.). The inertial measurement unit may include, but is not limited to, at least one accelerometer, gyroscope, magnetometer, or any combination thereof. The accelerometer may be used to measure the acceleration of the drone in order to calculate the speed of the drone.
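As a simple illustration of how accelerometer readings can be turned into a speed estimate (an assumption for illustration, not a method stated in the patent), one forward-Euler integration step:

```python
def integrate_velocity(v_prev, accel, dt):
    """Update the drone's velocity estimate from one accelerometer sample.

    Simple forward-Euler integration of acceleration over one sample period;
    a real flight controller would fuse this with other sensors.
    """
    return tuple(v + a * dt for v, a in zip(v_prev, accel))

# Example: starting at rest, 0.5 m/s^2 forward for 0.1 s gives 0.05 m/s.
print(integrate_velocity((0.0, 0.0, 0.0), (0.5, 0.0, 0.0), 0.1))
```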
The storage module 303 may include, but is not limited to, a read-only memory (ROM), a random access memory (RAM), a programmable read-only memory (PROM), an electrically erasable programmable read-only memory (EEPROM), and the like. The storage module 303 may include a non-transitory computer readable medium that can store code, logic, or instructions for performing at least one of the steps described elsewhere herein. The control module 301 may, individually or collectively, perform at least one step according to the code, logic, or instructions of the non-transitory computer readable medium described herein. The storage module may be used to store state information of the drone 100, such as altitude, speed, position, a preset reference height, and the like.
The input and output module 304 is configured to exchange information or instructions with an external device, for example receiving an instruction sent by the input and output device 148 (see FIG. 1), or sending an image captured by the imaging device 131 (see FIG. 1) to the input and output device 148.
FIG. 4 is a flowchart of a drone control method provided by the present invention.
Step 401: receive the position of a target in an image.
In some embodiments, the user may select a flight mode through the input and output device 148, for example by clicking on the screen 550 to select a flight mode. The flight modes include, but are not limited to, pointing flight, intelligent follow, autonomous return, and the like.
In some embodiments, after the user enters the pointing flight mode, the user can determine a target by clicking on any point on the screen 550. The input and output device 148 can send the position information of the target to the drone 500. The position information of the target is used to control the flight of the drone.
Referring to FIG. 6 and FIG. 7, the user can select a target A on the screen 550. After A is selected, the input and output device 148 can calculate the coordinates (x_screen, y_screen) of A on the screen 550, convert the coordinates on the screen 550 into the coordinates (x_rawimage, y_rawimage) in the camera source image, and further normalize the coordinates (x_rawimage, y_rawimage) in the camera source image to (x_percentage, y_percentage) according to the following formula:
[Equation image PCTCN2016111490-appb-000001: normalization of the source-image coordinates to (x_percentage, y_percentage)]
The coordinates (x_percentage, y_percentage) can be sent to the drone and are then used to calculate the spatial flight direction of the drone.
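The normalization formula itself appears only as an equation image in the published document. The sketch below shows one plausible mapping, under the assumptions that the screen displays the full camera image and that normalization divides by the image dimensions; both are assumptions made for illustration, not statements taken from the patent:

```python
def screen_to_percentage(x_screen, y_screen,
                         screen_w, screen_h,
                         image_w, image_h):
    """Map a tap on the screen to normalized source-image coordinates.

    Assumes the screen shows the full, unrotated camera image, so screen
    coordinates scale linearly to raw-image coordinates.
    """
    x_rawimage = x_screen * image_w / screen_w
    y_rawimage = y_screen * image_h / screen_h
    # Normalize to [0, 1] as a fraction of the image size.
    x_percentage = x_rawimage / image_w
    y_percentage = y_rawimage / image_h
    return x_percentage, y_percentage

# Example: the center of a 1080 x 1920 screen maps to (0.5, 0.5).
print(screen_to_percentage(540, 960, 1080, 1920, 4000, 3000))
```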
Step 402: acquire a first height of the drone relative to the ground.
Referring to FIG. 6 and FIG. 7, the drone 500 can acquire a first height H relative to the ground.
In some embodiments, the drone can acquire the first height through at least one onboard sensor. The first height may be the current height of the drone above the ground. The at least one sensor may include, but is not limited to, an ultrasonic sensor, a TOF sensor (such as a 3D TOF sensor), an infrared sensor, a microwave sensor, a proximity sensor, a three-dimensional laser range finder, a barometer, a GPS module, and the like.
The first height H can be used to control the flight of the drone. In some embodiments, when the first height H is less than a preset reference height h and A is on the ground, the drone 500 can fly horizontally at the first height H and hover directly above A' (trajectory 530). In other embodiments, when the first height H is greater than the preset reference height h and A is on the ground, the drone can first fly to the preset reference height h and then fly along the preset reference height h (trajectory 540).
In other embodiments, when the first height H is less than the preset reference height h and A is on the ground, the drone 500 can fly at any altitude and hover directly above A'.
Step 403: control the drone to fly according to the position of the target in the image and the first height.
In some embodiments, the processor can calculate the coordinates of A' according to the position of the target in the image.
Referring to FIG. 5, A' is the point corresponding to A in the world coordinate system. The direction vector from the camera toward A' has coordinates (x_w, y_w, z_w), where D denotes depth and z_w = D; (x_i, y_i) are the coordinates of A in the camera coordinate system, and f is the focal length. The following relationship can therefore be obtained:
[Equation image PCTCN2016111490-appb-000003]
The following formula is based on (x_percentage, y_percentage), (x_i, y_i), and the image size (ImageWidth, ImageHeight):
[Equation image PCTCN2016111490-appb-000004]
Based on the following relationship between the focal length and the field of view of the image:
[Equation image PCTCN2016111490-appb-000005]
the following formula can be obtained:
[Equation image PCTCN2016111490-appb-000006]
It then follows that:
[Equation image PCTCN2016111490-appb-000007]
As can be seen, the expression for (x_w, y_w, z_w) contains the unknown value D. The direction vector can be normalized to eliminate the unknown value D. Assuming D = 1, the direction vector can be expressed as:
[Equation image PCTCN2016111490-appb-000010]
Normalizing this direction vector again gives:
[Equation image PCTCN2016111490-appb-000012]
The coordinates of the direction vector in the camera coordinate system are thus obtained.
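The formulas above are published only as equation images. The following sketch reproduces the kind of pinhole-model computation they describe, assuming the principal point is at the image center and that the focal length relates to the horizontal field of view by f = ImageWidth / (2 * tan(FOV/2)); both assumptions are for illustration only:

```python
import math

def direction_in_camera_frame(x_pct, y_pct, image_w, image_h, fov_h_rad):
    """Unit direction vector toward the tapped point, in the camera frame.

    Pinhole model with the principal point at the image center; the depth D
    cancels after normalization, so it is simply taken as 1.
    """
    # Pixel offsets of the tapped point from the image center.
    x_i = (x_pct - 0.5) * image_w
    y_i = (y_pct - 0.5) * image_h
    # Focal length in pixels from the horizontal field of view (assumed relation).
    f = image_w / (2.0 * math.tan(fov_h_rad / 2.0))
    # Un-normalized direction (x_w, y_w, z_w) with D = 1: x_w = x_i / f, etc.
    v = (x_i / f, y_i / f, 1.0)
    norm = math.sqrt(sum(c * c for c in v))
    return tuple(c / norm for c in v)

# Example: a tap slightly right of center with a 90-degree horizontal FOV.
print(direction_in_camera_frame(0.6, 0.5, 4000, 3000, math.radians(90)))
```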
The processor can, based on this direction vector and a rotation matrix from the camera coordinate system to the gimbal coordinate system, calculate the corresponding direction vector in the gimbal coordinate system according to the following formula:
[Equation image PCTCN2016111490-appb-000018]
The processor can then, based on the direction vector in the gimbal coordinate system and a rotation matrix from the gimbal coordinate system to the world coordinate system, calculate the corresponding direction vector in the world coordinate system according to the following formula:
[Equation image PCTCN2016111490-appb-000023]
In summary, the processor can calculate the direction vector in the world coordinate system according to the following formula:
[Equation image PCTCN2016111490-appb-000025]
where:
[Equation image PCTCN2016111490-appb-000026]
The processor can calculate the rotation matrix from the camera coordinate system to the world coordinate system according to the following formula:
[Equation image PCTCN2016111490-appb-000028]
where (α, β, γ) represents the attitude angles of the gimbal (e.g., pitch angle, roll angle, yaw angle, etc.).
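The exact rotation matrices are likewise given only as equation images. The sketch below assumes a yaw-pitch-roll (Z-Y-X) Euler composition for the gimbal attitude and an identity camera-to-gimbal rotation, purely as an illustration of composing the two rotations described above:

```python
import numpy as np

def rot_x(a):  # roll
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def rot_y(b):  # pitch
    return np.array([[ np.cos(b), 0, np.sin(b)],
                     [0, 1, 0],
                     [-np.sin(b), 0, np.cos(b)]])

def rot_z(c):  # yaw
    return np.array([[np.cos(c), -np.sin(c), 0],
                     [np.sin(c),  np.cos(c), 0],
                     [0, 0, 1]])

def camera_to_world(direction_cam, pitch, roll, yaw, R_cam_to_gimbal=np.eye(3)):
    """Rotate a camera-frame direction into the world frame.

    Assumes a Z-Y-X (yaw-pitch-roll) composition for the gimbal attitude and,
    by default, an identity camera-to-gimbal rotation; the patent's exact
    matrices are given only as equation images.
    """
    R_gimbal_to_world = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)
    return R_gimbal_to_world @ (R_cam_to_gimbal @ np.asarray(direction_cam))
```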
In some embodiments, the processor can, based on the direction vector in the world coordinate system and the first height H, calculate the direction vector of the target relative to the ground according to the following formula:
[Equation image PCTCN2016111490-appb-000033]
where z_gnd is the first height H.
Finally, the processor can, based on that direction vector and the current position of the drone (pos_x, pos_y, pos_z), calculate the direction vector of the target relative to the takeoff point of the drone according to the following formulas:
[Equation image PCTCN2016111490-appb-000037]
[Equation image PCTCN2016111490-appb-000038]
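A minimal sketch of these last two steps, under the assumptions that the world frame's z axis points toward the ground and that the drone's current position is expressed relative to the takeoff point (the patent's exact conventions are contained in the equation images):

```python
import numpy as np

def target_relative_to_takeoff(direction_world, first_height_h, drone_pos):
    """Locate the ground point A' relative to the takeoff point.

    Assumes the world frame's z axis points down toward the ground, so the
    ray from the drone reaches the ground where its z component equals the
    height H above ground (z_gnd = H); drone_pos = (pos_x, pos_y, pos_z) is
    assumed to be expressed relative to the takeoff point.
    """
    d = np.asarray(direction_world, dtype=float)
    if d[2] <= 0:
        raise ValueError("ray does not intersect the ground (target above horizon)")
    scale = first_height_h / d[2]          # stretch the unit ray down to the ground
    ground_offset = d * scale              # vector from the drone to A'
    return np.asarray(drone_pos, dtype=float) + ground_offset
```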
In some embodiments, the processor can, based on the resulting direction vector, control the drone to fly above A' and hover when the first height H is less than the preset reference height h.
In other embodiments, the processor can calculate the coordinates of A' from that vector and, when the first height H is greater than the preset reference height h, control the drone to fly to the preset reference height and fly along the preset reference height.
In other embodiments, if the drone 500 detects that the target A is oriented toward the sky, the drone will fly toward the position pointed to by the target A.
In some embodiments, the user can modify the preset reference height. For example, when the user controls the drone indoors, the preset reference height can be modified to be less than or equal to the indoor height. When the user controls the drone outdoors, the preset reference height can be adjusted to a relatively large value.
In some embodiments, after the user has selected the target and the drone has started flying, the user can drag the target or reset the target as needed. After the new target is determined, the drone re-executes the steps in the flowchart of FIG. 4.
In some embodiments, the user can select at least two targets, and the drone can automatically determine whether the flight path formed by the at least two targets is feasible. If it is feasible, the drone will fly along the calculated flight path. If it is not feasible, the drone can return a failure prompt to the user; for example, warning information (such as a path planning failure) can be displayed on the input and output device 148.
The drone control method of the present invention can control the drone to fly to the position on the ground directly above the target and hover there, according to the position of the input target in the image and the first height, thereby realizing autonomous flight, i.e., autonomous hovering, of the drone and enabling precise control of the drone's flight.
It should be noted that the above description of the drone control method is provided merely to facilitate understanding of the present invention. Those of ordinary skill in the art can make modifications and variations to the present invention on the basis of an understanding of the present invention, and such modifications and variations still fall within the protection scope of the present invention. For example, the above drone control method can be applied indoors as well as outdoors.
The above descriptions are only embodiments of the present invention and do not thereby limit the patent scope of the present invention. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.
The disclosure of this patent document contains material that is subject to copyright protection. The copyright is owned by the copyright owner. The copyright owner has no objection to the reproduction by anyone of the patent document or the patent disclosure as it appears in the official records and files of the Patent and Trademark Office.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments or make equivalent replacements to some or all of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present disclosure.

Claims (12)

  1. A drone control method, characterized by comprising:
    receiving a position of a target in an image;
    acquiring a first height of the drone relative to the ground; and
    controlling the drone to fly according to the position of the target in the image and the first height.
  2. The method according to claim 1, characterized in that controlling the drone to fly according to the position of the target in the image and the first height comprises:
    acquiring a preset reference height; and
    controlling the drone to fly according to the preset reference height, the first height, and the position of the target in the image.
  3. The method according to claim 2, characterized in that controlling the drone to fly according to the preset reference height, the first height, and the position of the target in the image further comprises:
    calculating, according to the position of the target in the image, the position on the ground corresponding to the target;
    comparing the first height with the preset reference height;
    controlling the drone to fly to the preset reference height; and
    controlling the drone to fly along the preset reference height.
  4. The method according to claim 2, characterized in that controlling the drone to fly according to the preset reference height, the first height, and the position of the target in the image further comprises:
    calculating, according to the position of the target in the image, the position on the ground corresponding to the target;
    comparing the first height with the preset reference height; and
    controlling the drone to fly at the first height to a position above the position on the ground corresponding to the target and hover.
  5. The drone control method according to claim 1, characterized in that acquiring the first height of the drone relative to the ground comprises:
    acquiring the first height through a sensor.
  6. The method according to claim 5, characterized in that the sensor comprises at least one of an ultrasonic sensor, a TOF sensor, an infrared sensor, a microwave sensor, and a proximity sensor.
  7. A drone, characterized by comprising:
    a sensor configured to acquire a first height of the drone relative to the ground; and
    a processor configured to:
    receive a position of a target in an image; and
    control the drone to fly according to the position of the target in the image and the first height.
  8. The drone according to claim 6, characterized in that the drone further comprises:
    a memory configured to store a preset reference height.
  9. The drone according to claim 8, characterized in that the processor is further configured to:
    acquire the preset reference height; and
    control the drone to fly according to the preset reference height, the first height, and the position of the target in the image.
  10. The drone according to claim 9, characterized in that the processor is further configured to:
    calculate, according to the position of the target in the image, the position on the ground corresponding to the target;
    compare the first height with the preset reference height;
    control the drone to fly to the preset reference height; and
    control the drone to fly along the preset reference height.
  11. The drone according to claim 9, characterized in that the processor is further configured to:
    calculate, according to the position of the target in the image, the position on the ground corresponding to the target;
    compare the first height with the preset reference height; and
    control the drone to fly at the first height to a position above the position on the ground corresponding to the target and hover.
  12. The drone according to claim 7, characterized in that the sensor comprises at least one of an ultrasonic sensor, a TOF sensor, a barometer, an infrared sensor, a microwave sensor, and a proximity sensor.
PCT/CN2016/111490 2016-12-22 2016-12-22 Unmanned aerial vehicle and control method therefor WO2018112831A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201680004731.0A CN107108023B (zh) 2016-12-22 2016-12-22 Unmanned aerial vehicle and control method therefor
PCT/CN2016/111490 WO2018112831A1 (zh) 2016-12-22 2016-12-22 Unmanned aerial vehicle and control method therefor
CN201910840461.2A CN110525650B (zh) 2016-12-22 2016-12-22 Unmanned aerial vehicle and control method therefor
US16/445,796 US20190310658A1 (en) 2016-12-22 2019-06-19 Unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/111490 WO2018112831A1 (zh) 2016-12-22 2016-12-22 Unmanned aerial vehicle and control method therefor

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/445,796 Continuation US20190310658A1 (en) 2016-12-22 2019-06-19 Unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2018112831A1 true WO2018112831A1 (zh) 2018-06-28

Family

ID=59676414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/111490 WO2018112831A1 (zh) 2016-12-22 2016-12-22 Unmanned aerial vehicle and control method therefor

Country Status (3)

Country Link
US (1) US20190310658A1 (zh)
CN (2) CN107108023B (zh)
WO (1) WO2018112831A1 (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108569412B (zh) * 2018-03-15 2019-05-14 广东高商科技有限公司 Self-test platform for drone flight capability
CN109533327B (zh) * 2018-03-15 2020-04-10 拓航科技有限公司 Self-test method for drone flight capability
CN108982316B (zh) * 2018-06-14 2020-11-27 河海大学文天学院 Drone-based system and method for detecting seepage on the downstream concrete surface of a dam
CN110447371A (zh) * 2019-08-06 2019-11-15 深圳拓邦股份有限公司 Lawn mower control method and lawn mower
JP6929914B2 (ja) * 2019-10-11 2021-09-01 三菱重工業株式会社 Automatic landing system for a vertical take-off and landing aircraft, vertical take-off and landing aircraft, and landing control method therefor
CN111307291B (zh) * 2020-03-02 2021-04-20 武汉大学 Drone-based method, device, and system for detecting and locating surface temperature anomalies
CN112162568B (zh) * 2020-09-18 2022-04-01 深圳市创客火科技有限公司 Drone terminal landing control method, drone terminal, and storage medium
CN112666973B (zh) * 2020-12-15 2022-04-29 四川长虹电器股份有限公司 TOF-based method for keeping and changing formation of a drone swarm in flight
US20230202644A1 (en) * 2021-12-28 2023-06-29 Industrial Technology Research Institute Uav and control method thereof
TWI806318B (zh) * 2021-12-28 2023-06-21 財團法人工業技術研究院 Unmanned aerial vehicle and control method thereof
CN114394236A (zh) * 2022-01-14 2022-04-26 北京华能新锐控制技术有限公司 Drone for wind turbine blade inspection
CN116203986B (zh) * 2023-03-14 2024-02-02 成都阜时科技有限公司 Drone, landing method thereof, and main control device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050004723A1 (en) * 2003-06-20 2005-01-06 Geneva Aerospace Vehicle control system including related methods and components
US20120089274A1 (en) * 2010-10-06 2012-04-12 Hon Hai Precision Industry Co., Ltd. Electronic device and method for controlling unmanned aerial vehicle
CN103412568A (zh) * 2013-08-27 2013-11-27 重庆市勘测院 Method for acquiring drone remote-sensing images at varying flight heights within a single sortie
CN104777846A (zh) * 2015-04-20 2015-07-15 中国科学院长春光学精密机械与物理研究所 Smooth transition method for flight altitude control along a drone trajectory
CN105045281A (zh) * 2015-08-13 2015-11-11 深圳一电科技有限公司 Drone flight control method and device
CN105182992A (zh) * 2015-06-30 2015-12-23 深圳一电科技有限公司 Drone control method and device
CN106168807A (zh) * 2016-09-09 2016-11-30 腾讯科技(深圳)有限公司 Aircraft flight control method and flight control device
CN106227233A (zh) * 2016-08-31 2016-12-14 北京小米移动软件有限公司 Flight device control method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016015251A1 (en) * 2014-07-30 2016-02-04 SZ DJI Technology Co., Ltd. Systems and methods for target tracking
CN104656663B (zh) * 2015-02-15 2017-12-01 西北工业大学 Vision-based drone formation sensing and avoidance method
CN105955292B (zh) * 2016-05-20 2018-01-09 腾讯科技(深圳)有限公司 Method for controlling aircraft flight, mobile terminal, aircraft, and system
CN106054917A (zh) * 2016-05-27 2016-10-26 广州极飞电子科技有限公司 Flight control method and device for an unmanned aerial vehicle, and remote controller

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112577471A (zh) * 2020-12-31 2021-03-30 北京四维远见信息技术有限公司 Ultra-large-format oblique aerial camera
CN112577471B (zh) * 2020-12-31 2023-04-07 北京四维远见信息技术有限公司 Ultra-large-format oblique aerial camera

Also Published As

Publication number Publication date
CN107108023A (zh) 2017-08-29
CN110525650A (zh) 2019-12-03
CN107108023B (zh) 2019-09-27
US20190310658A1 (en) 2019-10-10
CN110525650B (zh) 2021-05-25

Similar Documents

Publication Publication Date Title
WO2018112831A1 (zh) Unmanned aerial vehicle and control method therefor
US11604479B2 (en) Methods and system for vision-based landing
US11794890B2 (en) Unmanned aerial vehicle inspection system
US20220083078A1 (en) Method for controlling aircraft, device, and aircraft
US11866198B2 (en) Long-duration, fully autonomous operation of rotorcraft unmanned aerial systems including energy replenishment
US20210065400A1 (en) Selective processing of sensor data
JP6609833B2 (ja) 無人航空機の飛行を制御する方法及びシステム
US20220206515A1 (en) Uav hardware architecture
US10549846B2 (en) UAV with transformable arms
CN108351654B (zh) 用于视觉目标跟踪的系统和方法
CN108351649B (zh) 用于控制可移动物体的方法和设备
JP6816156B2 (ja) Uav軌道を調整するシステム及び方法
JP7152836B2 (ja) 無人飛行体のアクションプラン作成システム、方法及びプログラム
WO2018098704A1 (zh) Control method, device, system, unmanned aerial vehicle, and movable platform
Thurrowgood et al. A biologically inspired, vision‐based guidance system for automatic landing of a fixed‐wing aircraft
WO2018082000A1 (zh) Unmanned aerial vehicle and antenna assembly
WO2018053867A1 (zh) Unmanned aerial vehicle and control method therefor
WO2020048365A1 (zh) Aircraft flight control method and apparatus, terminal device, and flight control system
WO2018188086A1 (zh) Unmanned aerial vehicle and control method therefor
US20240348751A1 (en) Autonomous monitoring by unmanned aerial vehicle systems and methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924810

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16924810

Country of ref document: EP

Kind code of ref document: A1