WO2018053867A1 - Unmanned aerial vehicle and control method therefor - Google Patents

Unmanned aerial vehicle and control method therefor

Info

Publication number
WO2018053867A1
WO2018053867A1 (PCT/CN2016/100197)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
landing
height
reference height
speed
Prior art date
Application number
PCT/CN2016/100197
Other languages
English (en)
French (fr)
Inventor
应佳行 (Ying Jiahang)
彭昭亮 (Peng Zhaoliang)
Original Assignee
SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司)
Priority to PCT/CN2016/100197 priority Critical patent/WO2018053867A1/zh
Priority to CN202011464102.0A priority patent/CN112666969A/zh
Priority to CN201680004278.3A priority patent/CN107438567A/zh
Publication of WO2018053867A1 publication Critical patent/WO2018053867A1/zh

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 Aircraft indicators or protectors not otherwise provided for
    • B64D45/04 Landing aids; Safety measures to prevent collision with earth's surface
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04 Control of altitude or depth
    • G05D1/06 Rate of change of altitude or depth
    • G05D1/0607 Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions

Definitions

  • the invention relates to a drone, and in particular to a drone with an automatic landing function.
  • existing drone landing methods generally descend straight down, either with no awareness of how high the drone is above the ground, or with awareness but no protective measures, and usually with no perception of the ground environment. This can easily cause a safety accident when the drone lands at a place that is not suitable for landing.
  • the technical problem to be solved by the present invention is to provide a drone and a control method thereof, which can automatically land by selecting a suitable landing site in different environments, without human intervention, avoiding damage to the drone or injury to others.
  • An aspect of the present invention provides a control method for a drone, the method comprising: acquiring a first height of the drone; acquiring a preset reference height; analyzing the first height according to the preset reference height to obtain a first landing speed of the drone; and controlling the drone motion according to the first landing speed.
  • a drone comprising: a sensor for acquiring a first height of the drone; a memory for storing a preset reference height; and one or more processors configured to: retrieve the preset reference height from the memory; analyze the first height according to the preset reference height to obtain a first landing speed of the drone; and control the drone motion according to the first landing speed.
  • the first drop speed is linearly related to the preset reference height.
  • the preset reference height includes a first reference height and a second reference height, The second reference height is less than the first reference height.
  • controlling the drone motion according to the first drop speed comprises: controlling the drone to move between the first reference height and the second reference height according to the first drop speed.
  • controlling the drone landing according to the first drop speed comprises: controlling the drone to hover at the preset reference height according to the first drop speed.
  • the method further comprises: acquiring a second height of the drone, the second height being less than or equal to the second reference height; and obtaining according to the second height of the drone The second landing speed of the drone.
  • the second drop speed is less than the first drop speed.
  • the second drop speed is constant.
  • the method further comprises: acquiring an environmental image of the drone; extracting a landing location from the environmental image; and controlling the drone landing according to the landing location.
  • controlling the drone landing according to the landing location comprises: controlling the drone to land on an area corresponding to the landing point on the ground according to the landing location.
  • the method further comprises: acquiring an environmental image of the drone; and controlling the drone to fly horizontally when no landing location is extracted from the environmental image.
  • the method further comprises: receiving status information of the sensor; and controlling the drone to hover at a predetermined height based on the status information.
  • FIG. 1 is a schematic structural diagram of a drone according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of a bottom of a drone according to an embodiment of the present invention.
  • FIG. 3 is a schematic block diagram of a drone according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of a method for controlling a drone according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of Embodiment 1 of an automatic landing of a drone provided by the present invention.
  • FIG. 6 is a schematic diagram of Embodiment 2 of an automatic landing of a drone provided by the present invention.
  • FIG. 7 is a schematic diagram of Embodiment 3 of an automatic landing of a drone provided by the present invention.
  • FIG. 1 is a schematic structural diagram of a drone according to an embodiment of the present invention.
  • the drone 100 can include a fuselage 110 that includes a central portion 111 and one or more outer portions 112.
  • the fuselage 110 includes four outer portions 112 (such as the arm 113).
  • the four outer portions 112 extend from the central portion 111, respectively.
  • the body 110 can include any number of external portions 112 (eg, 6, 8, etc.).
  • each of the outer portions 112 can carry a propulsion system 120 that can drive the drone 100 to move (e.g., climb, land, horizontally move, etc.).
  • the arm 113 can carry a corresponding motor 121, and the motor 121 can drive the corresponding propeller to rotate.
  • the drone 100 can control any set of motors 121 and their corresponding propellers 122 without being affected by the remaining motors 121 and their corresponding propellers.
  • the body 110 can carry a load 130, such as an imaging device 131.
  • the imaging device 131 can include a camera that can, for example, capture images, videos, and the like around the drone.
  • the camera is photosensitive to light of various wavelengths including, but not limited to, visible light, ultraviolet light, infrared light, or any combination thereof.
  • the load 130 can include other kinds of sensors.
  • the load 130 is coupled to the fuselage 110 by a gimbal 150 such that the load 130 can move relative to the fuselage 110.
  • the imaging device 131 can move relative to the body 110 to capture images, videos, and the like around the drone 100.
  • the landing gear 114 can support the drone 100 to protect the load 130 when the drone 100 is on the ground.
  • the drone 100 can include a control system 140 that includes components disposed in the drone 100 and components that are separate from the drone 100.
  • the control system 140 can include a first controller 141 disposed on the drone 100, and a second controller 142 remote from the drone 100 and connected to the first controller 141 via a communication link 160 (e.g., a wireless link).
  • the first controller 141 can include one or more processors, memory, and an onboard computer readable medium 143a that can store program instructions for controlling the behavior of the drone 100, including but not limited to operation of the propulsion system 120 and the imaging device 131, controlling the drone to perform automatic landing, and the like.
  • the second controller 142 can include one or more processors, memory, off-board computer readable media 143b, and one or more input and output devices 148, such as display device 144 and control device 145.
  • An operator of the drone 100 can remotely control the drone 100 through the control device 145 and receive feedback information from the drone 100 via the display device 144 and/or other devices.
  • the drone 100 can operate autonomously, in which case the second controller 142 can be omitted, or the second controller 142 can be used only to let the drone operator override functions for drone flight.
  • the onboard computer readable medium 143a can be moved out of the drone 100.
  • the off-board computer readable medium 143b can be moved out of the second controller 142.
  • the drone 100 can include two forward-looking cameras 171 and 172 that are sensitive to light of various wavelengths (e.g., visible light, infrared light, ultraviolet light) for capturing images or videos around the drone. In some embodiments, the drone 100 includes one or more sensors placed at the bottom.
  • the drone 100 can include two lower looking cameras 173 and 174 placed at the bottom of the fuselage 110.
  • the drone 100 further includes two ultrasonic sensors 177 and 178 placed at the bottom of the body 110. The ultrasonic sensors 177 and 178 can detect and/or monitor objects and the ground below the drone 100 and measure the distance to the object or the ground by transmitting and receiving ultrasonic waves.
  • the drone 100 may include an inertial measurement unit (IMU), an infrared sensor, a microwave sensor, a temperature sensor, a proximity sensor, a three-dimensional laser rangefinder, a three-dimensional TOF sensor, etc.
  • the three-dimensional laser rangefinder and the three-dimensional TOF sensor can detect the distance to an object or the ground surface below the drone.
  • the inertial measurement unit can be used to measure the altitude of the drone.
  • the inertial measurement unit may include, but is not limited to, one or more accelerometers, gyroscopes, magnetometers, or any combination thereof.
  • the accelerometer can be used to measure the acceleration of the drone to calculate the speed of the drone.
  • the drone can detect and/or monitor environmental information through the sensors described above to select a location suitable for landing.
  • the environmental information includes, but is not limited to, ground flatness, whether it is a water surface, and the like.
  • the drone may take a picture of environmental information through a camera and extract depth information from the picture to reconstruct a three-dimensional topography of the environment (eg, a three-dimensional topography of the bottom of the drone), and Select a location suitable for landing from the three-dimensional terrain.
  • the drone can automatically land down based on environmental information (eg, altitude) detected and/or monitored by the sensor described above until it stops on the ground below the drone.
  • the drone can perform a segmented landing, with a different landing speed in each segment; the number of segments is not limited herein and may be any number (for example, 2, 3, or 4 segments, etc.).
  • the drone can hover at a preset height after landing a certain height.
  • the environmental information of the bottom of the drone may be detected by a sensor to select a suitable landing location to control the drone to automatically land.
  • FIG. 3 is a schematic diagram of a module of a drone according to an embodiment of the present invention.
  • the drone 100 may include a control module 301, a sensor module 302, a storage module 303, and an input/output module 304.
  • the control module 301 can include one or more processors, including but not limited to a microcontroller, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physics processing unit (PPU), a digital signal processor (DSP), or a field programmable gate array (FPGA).
  • the sensor module 302 may include one or more sensors, including but not limited to a temperature sensor, an inertial measurement unit, an accelerometer, an image sensor (such as a camera), an ultrasonic sensor, a microwave sensor, a proximity sensor, a three-dimensional laser rangefinder, an infrared sensor, etc.
  • the inertial measurement unit can be used to measure the altitude of the drone.
  • the inertial measurement unit may include, but is not limited to, one or more accelerometers, gyroscopes, magnetometers, or any combination thereof.
  • the accelerometer can be used to measure the acceleration of the drone to calculate the speed of the drone.
  • the storage module 303 can include, but is not limited to, a read-only memory (ROM), a random access memory (RAM), a programmable read-only memory (PROM), an electrically erasable programmable read-only memory (EEPROM), and the like.
  • the storage module 303 can include a non-transitory computer readable medium that can store code, logic, or instructions for performing one or more of the steps described elsewhere herein.
  • the control module 301 can perform one or more steps individually or collectively in accordance with code, logic or instructions of the non-transitory computer readable medium described herein.
  • the input/output module 304 is configured to exchange information or instructions with an external device, for example receiving an instruction sent by the input/output device 148 (see FIG. 1), or transmitting an image captured by the imaging device 131 (see FIG. 1) to the input/output device 148.
  • the control module 301 can control the drone 100 to land based on information detected and/or monitored by the sensor module 302. For example, the control module 301 can calculate the landing speed of the drone 100 according to the height detected and/or monitored by the sensor module 302. As another example, the control module 301 can reconstruct a three-dimensional terrain of the area below the drone 100 from an image or video captured by the sensor module 302, and select a suitable landing location in the three-dimensional terrain to control the landing of the drone 100.
  • FIG. 4 is a flowchart of a method for controlling a drone according to an embodiment of the present invention.
  • the flow is executed by the control system of the drone together with one or more sensors.
  • the control system of the drone may receive data detected and/or monitored by the one or more sensors and control the drone motion based on the data.
  • Step 401: Acquire a first height of the drone.
  • the drone may acquire the first height by one or more ultrasonic sensors, such as ultrasonic sensors 177 and 178 (see FIG. 1).
  • the ultrasonic sensors 177 and 178 may transmit ultrasonic waves to the ground; the waves are reflected by the ground and received by the drone. From the transmission time and the reception time of the ultrasonic waves, together with the propagation speed of sound, the first height can be calculated.
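The time-of-flight height calculation described above can be sketched as follows; this is a minimal illustration, and the function and constant names are assumptions, not part of the patent:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ultrasonic_height(t_send: float, t_receive: float) -> float:
    """Estimate height from the round-trip time of an ultrasonic pulse.

    The pulse travels down to the ground and back, so the one-way
    distance is half of the total distance travelled.
    """
    round_trip_time = t_receive - t_send
    return SPEED_OF_SOUND * round_trip_time / 2.0
```

For example, a pulse whose echo returns 10 ms after transmission corresponds to a height of about 1.7 meters.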
  • the drone may acquire the first height through one or more cameras, such as one or more of the forward-looking cameras 171 and 172 and the down-looking cameras 173 and 174.
  • the down-view cameras 173 and 174 are mounted to the bottom of the drone for capturing images and/or video at the bottom of the drone.
  • the drone can extract depth information from the image and/or video using stereo matching techniques, and reconstruct a three-dimensional terrain of the area below the drone from the depth information, thereby obtaining the first height.
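Stereo matching ultimately converts a per-pixel disparity into a depth via the standard pinhole-stereo relation depth = f·B/d. The sketch below illustrates only that relation; the function name and parameters are illustrative, not from the patent:

```python
def disparity_to_depth(disparity_px: float, focal_length_px: float,
                       baseline_m: float):
    """Convert a stereo disparity (pixels) to depth (metres).

    Pinhole stereo relation: depth = focal_length * baseline / disparity.
    Returns None for non-positive disparity (point at infinity or invalid).
    """
    if disparity_px <= 0:
        return None
    return focal_length_px * baseline_m / disparity_px
```

With a 500 px focal length and a 10 cm camera baseline, a 10 px disparity corresponds to a 5 m depth, i.e. a 5 m height for a ground pixel directly below the drone.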
  • the drone may utilize an external camera (eg, imaging device 131) to acquire the first height.
  • the drone can transmit microwaves to the ground through a microwave sensor; the microwaves are reflected by the ground and received by the drone. From the transmission time and the reception time of the microwaves, together with the propagation speed of the microwaves, the first height can be calculated.
  • the drone may acquire the first height by one or more laser rangefinders.
  • the laser rangefinder may be installed at the bottom of the drone to transmit laser pulses to the ground; the laser is reflected by the ground and received by the drone. From the transmission time and the reception time of the laser, together with the propagation speed of light, the first height can be calculated.
  • the UAV can acquire the first height by using an infrared sensor, a proximity sensor, or the like, and details are not described herein.
  • Step 402: Acquire a preset reference height.
  • the preset reference height can be used to calculate a landing speed of the drone; the reference height is preset within the drone, for example in the computer readable medium 143a (see FIG. 1), the storage module 303 (see FIG. 3), or the memory.
  • FIG. 5 is a schematic diagram of Embodiment 1 of an automatic landing of a drone provided by the present invention.
  • the drone 500 has an automatic landing function.
  • the drone 500 can automatically land in a direction perpendicular to the ground 501.
  • the drone 500 can automatically land in segments in a direction perpendicular to the ground 501 based on the reference height.
  • the preset reference height includes a first reference height H1 and a second reference height H2. When the height of the drone 500 is greater than or equal to H1, it lands at speed V1; when the height is less than H1 but greater than H2, it lands at speed V2; when the height is less than or equal to H2, it lands at speed V3.
  • the formulas for calculating V1, V2, and V3 are (Formula Group 1): V1 = const.; V2 = a·h + b; V3 = const., where h is the current height of the drone 500, and a and b are constants.
  • in one embodiment, H1 is 5 meters and H2 is 0.5 meters, and the drone 500 calculates its landing speed according to the following (Formula Group 2): V1 = V, a constant (e.g., 5 m/s or 4 m/s); V2 = a·h + b, a variable linearly related to the current height h of the drone 500; V3 = 0.4 m/s.
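The three-segment speed law of this embodiment (a constant V1 above H1 = 5 m, a height-linear V2 in between, and a constant V3 = 0.4 m/s below H2 = 0.5 m) can be sketched as a piecewise function. The slope and intercept below are illustrative values chosen so the segments join continuously; the patent does not give concrete values for a and b:

```python
H1, H2 = 5.0, 0.5          # reference heights (m)
V1, V3 = 4.0, 0.4          # constant speeds above H1 / below H2 (m/s)
A = (V1 - V3) / (H1 - H2)  # slope chosen so the segments join continuously
B = V1 - A * H1            # intercept of the linear middle segment

def landing_speed(h: float) -> float:
    """Piecewise descent speed: constant above H1, linear in the current
    height h between H2 and H1, constant below H2."""
    if h >= H1:
        return V1
    if h > H2:
        return A * h + B   # V2: linearly related to the height h
    return V3
```

With these values the drone descends at 4 m/s until 5 m, slows proportionally to its height (e.g., 2 m/s at 2.5 m), and creeps down at 0.4 m/s for the final half meter.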
  • the drone or a user manipulating the drone can modify the values of the first reference height and the second reference height in real time, and such modifications and variations remain within the scope of the invention. For example, the first reference height H1 may be modified to 10 meters and the second reference height H2 to 1 meter.
  • the user can also modify the parameters and constants in Formula Group 1; such modifications remain within the protection scope of the present invention.
  • the reference height can be used to calculate the landing speed of the drone and to hover the drone.
  • the reference height is preset within the drone, such as in the computer readable medium 143a (see FIG. 1), the storage module 303 (see FIG. 3), or the memory.
  • FIG. 6 is a schematic diagram of Embodiment 2 of an automatic landing of a drone provided by the present invention.
  • the drone 600 has an automatic landing function.
  • the preset reference height is h
  • the current height of the drone is H.
  • the drone 600 can automatically descend a certain distance in a direction perpendicular to the ground 601 and then hover at the height h.
  • the landing speed V of the drone 600 falling from the height H to the height h can be calculated according to the following formula:
  • V = H − h (Formula Group 3)
  • the drop speed V of the drone 600 is linearly related to the height H, and decreases as its height decreases, eventually causing the drone 600 to hover over the preset reference height h.
  • the drop speed V of the drone 600 may be nonlinearly related to the height H.
  • the drop speed V of the drone 600 may be calculated according to the following formula:
  • V = H² − h² (Formula Group 4)
  • the landing speed V of the drone 600 is nonlinearly related to the height H, and decreases as its height decreases, eventually causing the drone 600 to hover at the preset reference height h.
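The behaviour of Formula Group 3, where the descent speed is proportional to the remaining height above the hover point, can be sketched with a simple forward-Euler integration. The step size and time horizon below are illustrative assumptions:

```python
def simulate_descent(H0: float, h_ref: float,
                     dt: float = 0.05, t_max: float = 30.0) -> float:
    """Integrate V = H - h_ref over time.

    The descent speed shrinks as the drone nears the reference height,
    so the height converges asymptotically to h_ref (hover).
    """
    H, t = H0, 0.0
    while t < t_max:
        v = H - h_ref   # Formula Group 3: speed proportional to remaining height
        H -= v * dt     # descend by v * dt during this time step
        t += dt
    return H

final_height = simulate_descent(H0=10.0, h_ref=2.0)
```

Starting at 10 m with a 2 m reference height, the simulated drone closes the gap exponentially and is effectively hovering at 2 m well before the 30 s horizon.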
  • Step 403: Analyze the first height according to the preset reference height to obtain a first landing speed of the drone.
  • the first drop speed of the drone can be calculated according to Formula Group 1, Formula Group 2, Formula Group 3, Formula Group 4, etc. above, and is not described again here.
  • the first drop speed can be V1; in other embodiments, the first drop speed may be V2.
  • the first drop speed is constant, and the drone can descend to a predetermined height at a constant, uniform speed.
  • Step 404: Control the drone motion according to the first landing speed.
  • the drone may acquire a second height that is less than or equal to the second reference height, and the drone may acquire a second landing speed based on the second height.
  • the second landing speed is constant (e.g., V3), and the drone can descend to the ground at the second landing speed.
  • after descending to a certain altitude (e.g., 3 meters) according to the first drop speed, the drone can detect and/or monitor environmental information below it using onboard environmental sensors, to determine whether the area below the drone is suitable for landing.
  • the sensors include, but are not limited to, ultrasonic sensors, infrared sensors, proximity sensors, microwave sensors, cameras, three-dimensional laser range finder, three-dimensional TOF, and the like.
  • the drone can capture images of the area below it from different angles through one or both of the down-looking cameras 173 and 174 (see FIG. 2).
  • the drone can select a target area in the image through a sliding window.
  • the coordinate information of each pixel in the target area (such as the x, y, z coordinates of each pixel, and the z coordinate representing the depth information of the pixel) can be extracted.
  • the drone may simultaneously or sequentially capture two images using the cameras (such as cameras 173 and 174), and the depth information may be extracted using a stereo matching technique (such as a semi-global block matching algorithm).
  • the drone may select an optimal plane and a value function corresponding to the optimal plane from the target area according to the coordinate information of the pixels in the target area.
  • the drone can determine an optimal plane and its corresponding value function by using an algorithm, such as the Levenberg-Marquardt algorithm. Thereafter, the drone will traverse the remaining area in the image through the sliding window to generate a plurality of optimal planes and the value functions corresponding to them.
  • the plurality of optimal planes will be processed (e.g., smoothed) to produce an optimal plane as a suitable landing location.
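The sliding-window plane-fitting step above can be sketched with an ordinary least-squares plane fit, using the mean squared residual as the value function for a window: a small residual indicates a flat candidate area. This is an illustrative stand-in (linear least squares instead of the Levenberg-Marquardt optimisation mentioned in the text), and all names are assumptions:

```python
import numpy as np

def plane_fit_cost(points: np.ndarray):
    """Fit z = a*x + b*y + c to an N x 3 array of (x, y, z) points by
    least squares, and return (plane coefficients, mean squared residual).

    A low residual means the window is nearly planar; combined with a
    small tilt (|a|, |b|), it marks a flat candidate landing area.
    """
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residuals = points[:, 2] - A @ coeffs
    return coeffs, float(np.mean(residuals ** 2))
```

Sliding this over the depth image and keeping the windows with the lowest cost (after smoothing) mirrors the selection of an optimal plane described above.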
  • the drone captures an image of the bottom through the camera, and reconstructs a three-dimensional topography of the bottom according to the image to select a location suitable for landing from the three-dimensional terrain.
  • An advantage of this embodiment of the present invention is that a landing location can be selected automatically from the three-dimensional terrain, allowing the drone to land safely and gently without human intervention, ensuring the safety of the landing and avoiding damage to the drone or injury to others.
  • FIG. 7 is a schematic diagram of Embodiment 3 of an automatic landing of a drone according to an embodiment of the present invention.
  • the drone 700 can take a bottom image 705 by the method described in the above embodiment, and reconstruct a three-dimensional topography of the bottom of the drone 700 according to the image 705, and select a suitable landing location in the three-dimensional terrain.
  • the drone 700 can recognize the water surface 701, the small slope 702, and the flat land 703.
  • the drone 700 can determine that the water surface 701 and the small slope 702 are not suitable for landing, and finally select the flat land 703 to land.
  • if no location suitable for landing is found in the three-dimensional terrain, the drone will maintain a constant height and move horizontally to capture a new bottom image, acquire a new three-dimensional terrain, and try to find a suitable landing site in the new three-dimensional terrain.
  • if the drone cannot find a suitable landing site, for example within a predetermined time interval, it will hover at a known height and wait for the user to input the next instruction.
  • if the drone finds a suitable landing site in the acquired three-dimensional terrain, it will land directly at that site.
  • the drone may acquire status information of its sensors, such as the forward-looking cameras 171 and 172, the down-looking cameras 173 and 174, and the ultrasonic sensors 177 and 178, for example by detecting and/or monitoring whether a sensor has failed. For example, the drone can send an inquiry signal to a sensor and judge the sensor to have failed if no response signal is returned. In other embodiments, a sensor may transmit detected and/or monitored information to the drone periodically or aperiodically; if the drone receives no information from the sensor within a predetermined time interval (e.g., 60 seconds), it can judge the sensor to have failed.
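The timeout-based failure check described above can be sketched as a small watchdog. The class name, sensor identifiers, and explicit-timestamp design below are illustrative assumptions, not part of the patent:

```python
class SensorWatchdog:
    """Declare a sensor failed if no reading has arrived within `timeout`
    seconds, a simplified version of the query/timeout check above.
    Timestamps are passed in explicitly so the logic is deterministic."""

    def __init__(self, timeout: float = 60.0):
        self.timeout = timeout
        self.last_seen = {}  # sensor id -> timestamp of last report

    def report(self, sensor_id: str, now: float) -> None:
        # Record that a reading from this sensor arrived at time `now`.
        self.last_seen[sensor_id] = now

    def failed(self, sensor_id: str, now: float) -> bool:
        # A sensor has failed if it never reported, or if its last
        # report is older than the timeout.
        last = self.last_seen.get(sensor_id)
        return last is None or (now - last) > self.timeout
```

When `failed()` returns true for a required sensor, the control system would trigger the protection behaviour described next: hover at a predetermined height and wait for user input.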
  • the drone can hover at a known altitude and wait for the user to confirm whether the landing site is safe; if the user confirms that the landing site is safe, a landing command is sent to the drone. After receiving the landing command, the drone begins to land until the entire landing process is completed.
  • in this way, a protection mechanism can be triggered in time to control the drone to hover at a predetermined height and wait for the user's input instruction, ensuring that the drone can land safely and avoiding damage to the drone or injury to others.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control method for a drone (500), comprising: acquiring a first height of the drone (500) (401); acquiring a reference height of the drone (500) (402); acquiring a first landing speed of the drone (500) according to the first height and the reference height (403); and controlling motion of the drone (500) according to the first landing speed (404).

Description

Unmanned aerial vehicle and control method therefor

[Technical Field]
The present invention relates to a drone, and in particular to a drone with an automatic landing function.
[Background]
Landing is a process that every drone flight must include, and the probability of an accident during landing is relatively high. Landing at a place unsuitable for a drone (such as in a tree, on a water surface, or on uneven ground) can easily damage the drone. It is therefore necessary to introduce protective measures during drone landing, to ensure the safety of the landing and avoid damaging the drone or injuring others.
Existing drone landing methods generally descend straight down, either with no awareness of the height above the ground during landing, or with awareness but no protective measures, and usually with no perception of the ground environment. This can easily cause a safety accident when the drone lands at a place unsuitable for landing.
[Summary of the Invention]
The technical problem mainly solved by the present invention is to provide a drone and a control method therefor, which can select a suitable landing site in different environments and land automatically, without human intervention, avoiding damage to the drone or injury to others.
One aspect of the present invention provides a control method for a drone, the method comprising: acquiring a first height of the drone; acquiring a preset reference height; analyzing the first height according to the preset reference height to obtain a first landing speed of the drone; and controlling motion of the drone according to the first landing speed.
Another aspect of the present invention provides a drone, comprising: a sensor for acquiring a first height of the drone; a memory for storing a preset reference height; and one or more processors configured to: retrieve the preset reference height from the memory; analyze the first height according to the preset reference height to obtain a first landing speed of the drone; and control motion of the drone according to the first landing speed.
In some embodiments, the first landing speed is linearly related to the preset reference height.
In some embodiments, the preset reference height includes a first reference height and a second reference height, the second reference height being less than the first reference height.
In some embodiments, controlling the motion of the drone according to the first landing speed comprises: controlling the drone to move between the first reference height and the second reference height according to the first landing speed.
In some embodiments, controlling the landing of the drone according to the first landing speed comprises: controlling the drone to hover at the preset reference height according to the first landing speed.
In some embodiments, the method further comprises: acquiring a second height of the drone, the second height being less than or equal to the second reference height; and acquiring a second landing speed of the drone according to the second height of the drone.
In some embodiments, the second landing speed is less than the first landing speed.
In some embodiments, the second landing speed is constant.
In some embodiments, the method further comprises: acquiring an environmental image of the drone; extracting a landing location from the environmental image; and controlling the landing of the drone according to the landing location.
In some embodiments, controlling the landing of the drone according to the landing location comprises: controlling the drone to land on an area of the ground corresponding to the landing location.
In some embodiments, the method further comprises: acquiring an environmental image of the drone; and controlling the drone to fly horizontally when no landing location can be extracted from the environmental image.
In some embodiments, the method further comprises: receiving status information of a sensor; and controlling the drone to hover at a predetermined height according to the status information.
[Brief Description of the Drawings]
To explain the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present disclosure; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of a drone according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of the bottom of a drone according to an embodiment of the present invention;
FIG. 3 is a schematic module diagram of a drone according to an embodiment of the present invention;
FIG. 4 is a flowchart of a drone control method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of Embodiment 1 of an automatic landing of a drone provided by the present invention;
FIG. 6 is a schematic diagram of Embodiment 2 of an automatic landing of a drone provided by the present invention;
FIG. 7 is a schematic diagram of Embodiment 3 of an automatic landing of a drone provided by the present invention.
[Detailed Description]
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The terms "first", "second", etc. in the specification, claims, and drawings of the present invention are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that terms so used are interchangeable where appropriate; this is merely the way objects with the same attributes are distinguished in the description of the embodiments of the present invention. In addition, the terms "include" and "have" and any variations thereof are intended to cover non-exclusive inclusion, so that a process, method, system, product, or device comprising a series of units is not necessarily limited to those units, but may include other units not explicitly listed or inherent to the process, method, product, or device.
Some embodiments of the present invention are described in detail below with reference to the drawings. Where there is no conflict, the embodiments below and the features of those embodiments may be combined with one another.
The present invention is described in detail below with reference to the drawings and embodiments.
Referring to FIG. 1, FIG. 1 is a schematic structural diagram of a drone according to an embodiment of the present invention. The drone 100 can include a fuselage 110, which includes a central portion 111 and one or more outer portions 112. In the embodiment shown in FIG. 1, the fuselage 110 includes four outer portions 112 (such as the arms 113). The four outer portions 112 extend respectively from the central portion 111. In other embodiments, the fuselage 110 can include any number of outer portions 112 (e.g., 6, 8, etc.). In any of the above embodiments, each outer portion 112 can carry a propulsion system 120, and the propulsion system 120 can drive the drone 100 to move (e.g., climb, land, move horizontally, etc.). For example, the arm 113 can carry a corresponding motor 121, and the motor 121 can drive the corresponding propeller to rotate. The drone 100 can control any set of motors 121 and their corresponding propellers 122 without being affected by the remaining motors 121 and their corresponding propellers.
The fuselage 110 can carry a load 130, for example an imaging device 131. In some embodiments, the imaging device 131 can include a camera that can, for example, capture images, videos, and the like around the drone. The camera is sensitive to light of various wavelengths, including but not limited to visible light, ultraviolet light, infrared light, or any combination thereof. In some embodiments, the load 130 can include other kinds of sensors. In some embodiments, the load 130 is connected to the fuselage 110 by a gimbal 150, so that the load 130 can move relative to the fuselage 110. For example, when the load 130 carries the imaging device 131, the imaging device 131 can move relative to the fuselage 110 to capture images, videos, and the like around the drone 100. As shown, when the drone 100 is on the ground, the landing gear 114 can support the drone 100 to protect the load 130.
在一些实施例中,所述无人机100可以包括控制系统140,所述控制系统140包括置于所述无人机100的组件以及与所述无人机100分离的组件。例如,所述控制系统140可以包括一个置于所述无人机100上的第一控制器141,以及一个远离所述无人机100并通过通信链路160(如无线链路)与所述第一控制器141连接的第二控制器142。所述第一控制器141可以包括一个或多个处理器、存储器、以及机载计算机可读介质143a,所述机载计算机可读介质143a可以存储用于控制无人机100行为的程序指令,所述行为包括但不限于所述推进系统120及所述成像装置131的操作、控制所述无人机进行自动降落等。所述第二控制器142可以包括一个或多个处理器、存储器、机外计算机可读介质143b,以及一个或多个输入输出装置148,例如:显示装置144及控制装置145。所述无人机100的操作者可以通过所述控制装置145远程控制所述无人机100,并通过所述显示装置144和/或其他装置接收来自所述无人机100的反馈信息。在其他实施例中,所述无人机100可以自主运作,此时所述第二控制器142可以被省去,或者所述第二控制器142可以仅被用来使无人机操作者超控(override)用于无人机飞行的功能。所述机载计算机可读介质143a可以从所述无人机100中移除。所述机外计算机可读介质143b可以从所述第二控制器142中移除。
在一些实施例中,所述无人机100可以包括两个前视摄像头171和172,所述前视摄像头171和172光敏于各种波长的光线(如可见光、红外光、紫外线),用于拍摄所述无人机周围的图像或视频。在一些实施例中,所述无人机100包括置于底部的一个或多个传感器。
图2是本发明实施例提供的无人机底部的结构示意图。所述无人机100可以包括两个置于所述机身110底部的下视摄像头173和174。此外,所述无人机100还包括两个置于所述机身110底部的超声传感器177和178。所述超声传感器177和178可以检测和/或监测所述无人机100底部的物体及地面,并通过发送及接收超声波来测量离该物体或地面的距离。
在其他实施例中,所述无人机100可以包括惯性测量单元(英文:inertial measurement unit,缩写:IMU)、红外传感器、微波传感器、温度传感器、近距离传感器(英文:proximity sensor)、三维激光测距仪、三维TOF等。所述三维激光测距仪及所述三维TOF可以检测无人机下方物体或地面的距离。
在一些实施例中,所述惯性测量单元可以用于测量所述无人机的高度。所述惯性测量单元可以包括但不限于,一个或多个加速度计、陀螺仪、磁力仪或其中的任意组合。所述加速度计可以用于测量所述无人机的加速度,以计算所述无人机的速度。
在一些实施例中,所述无人机可以通过上述传感器检测和/或监测环境信息,以选择一个适合降落的地点。所述环境信息包括但不限于地面平整度、是否是水面等。
在一些实施例中,所述无人机可以通过摄像头拍摄关于环境信息的图片,并从该图片中提取深度信息以重建出所述环境的三维地形(如无人机底部的三维地形),并从所述三维地形中选择一个适合降落的地点。
在一些实施例中,所述无人机可以根据上述传感器检测和/或监测到的环境信息(如高度),进行自动降落直到停留在所述无人机下方的地面。例如,所述无人机可以进行分段降落,在每一段中所述无人机的降落速度不同,所述分段的数量在此不做限定,可以是任意数量(如2段、3段、4段等)。
在一些实施例中,所述无人机可以在降落一定高度之后,悬停在一预置的高度。在一些实施例中,所述无人机悬停之后,可以通过传感器检测所述无人机底部的环境信息,以选择一个适合降落的地点以控制所述无人机自动降落。
图3为本发明实施例提供的无人机的模块示意图。参阅图3,无人机100可以包括控制模块301、传感器模块302、存储模块303以及输入输出模块304。
所述控制模块301可以包括一个或多个处理器,所述处理器包括但不限于微控制器(英文:microcontroller),精简指令集计算机(英文:reduced instruction set computer,简称:RISC),专用集成电路(英文:application specific integrated circuits,简称:ASIC),专用指令集处理器(英文:application-specific instruction-set processor,简称:ASIP),中央处理单元(英文:central processing unit,简称:CPU),物理处理单元(英文:physics processing unit,简称:PPU),数字信号处理器(英文:digital signal processor,简称:DSP),现场可编程门阵列(英文:field programmable gate array,简称:FPGA)等。
所述传感器模块302可以包括一个或多个传感器,所述传感器包括但不限于温度传感器、惯性测量单元、加速度计、图像传感器(如摄像头)、超声传感器、微波传感器、近距离传感器、三维激光测距仪、红外传感器等。
在一些实施例中,所述惯性测量单元可以用于测量所述无人机的高度。所述惯性测量单元可以包括但不限于,一个或多个加速度计、陀螺仪、磁力仪或其中的任意组合。所述加速度计可以用于测量所述无人机的加速度,以计算所述无人机的速度。
所述存储模块303可以包括但不限于只读存储器(ROM)、随机存储器(RAM)、可编程只读存储器(PROM)、电子抹除式可编程只读存储器(EEPROM)等。所述存储模块303可以包括非暂时性计算机可读介质,其可以存储用于执行本文其他各处所描述的一个或多个步骤的代码、逻辑或指令。所述控制模块301可以根据所述非暂时性计算机可读介质中的代码、逻辑或指令,单独地或共同地执行一个或多个步骤。
所述输入输出模块304用于向外部设备输出信息或指令,如接收所述输入输出装置148(见图1)发送的指令,或将所述成像装置131(见图1)拍摄的图像发送给所述输入输出装置148。
在一些实施例中,所述控制模块301可以根据所述传感器模块302检测和/或监测到的信息,控制所述无人机100降落。例如,所述控制模块301可以根据所述传感器模块302检测和/或监测到的高度,计算所述无人机100的降落速度。再如,所述控制模块301可以根据所述传感器模块302拍摄的图像或视频,重建所述无人机100底部的三维地形,并在所述三维地形中选择合适的降落地点,以控制所述无人机100降落。
图4为本发明实施例提供的无人机控制方法的流程图。参阅图4,该流程的执行主体为所述无人机的控制系统及一个或多个传感器。所述无人机的控制系统可以接收所述一个或多个传感器检测和/或监测的数据,并根据所述数据控制所述无人机运动。
步骤401,获取所述无人机的第一高度。
在一些实施例中,所述无人机可以通过一个或多个超声传感器,如:超声传感器177和178(见图2)中的一个或多个,获取所述第一高度。具体地,所述超声传感器177和178可以向地面发送超声波,所述超声波经地面反射之后可以被所述无人机接收,通过获取所述超声波的发送时刻以及接收时刻,辅以声音的传播速度,可以计算出所述第一高度。
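上述根据超声波收发时刻计算高度的过程,可以用如下示意代码表达(函数名与声速取值340米/秒均为本示例的假设,非专利原文给定):

```python
def height_from_ultrasound(t_send, t_recv, v_sound=340.0):
    """根据超声波的发送时刻与接收时刻计算高度(米)。

    超声波往返一次(下行到地面再反射回来),
    因此高度为传播时间乘以声速再除以2。
    v_sound 取 340 米/秒为常温下的近似声速(假设值)。
    """
    return (t_recv - t_send) * v_sound / 2.0
```

例如,发送与接收相隔0.01秒时,计算出的高度约为1.7米。同样的往返时间乘速度除以2的思路,也适用于文中的微波传感器与激光测距仪,只需替换为相应的传播速度。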
在其他实施例中,所述无人机可以通过一个或多个摄像头,如前视摄像头171和172,下视摄像头173和174中的一个或多个,获取所述第一高度。如图2所示,所述下视摄像头173和174被安装于所述无人机的底部,用于拍摄所述无人机底部的图像和/或视频。所述无人机可以利用立体匹配技术(英文:stereo matching techniques),提取所述图像和/或所述视频中的深度信息,并依据所述深度信息重建所述无人机底部的三维地形,从而获取所述第一高度。在其他实施例中,所述无人机可以利用外部摄像头(如:成像装置131),获取所述第一高度。
在其他实施例中,所述无人机可以通过微波传感器向地面发送微波,所述微波经地面反射之后可以被所述无人机接收,通过获取所述微波的发送时刻及接收时刻,辅以微波的传播速度,可以计算出所述第一高度。
在其他实施例中,所述无人机可以通过一个或多个激光测距仪获取所述第一高度。具体地,所述激光测距仪可以被安装在所述无人机的底部以向地面发送激光,所述激光经地面反射之后可以被所述无人机接收,通过获取所述激光的发送时刻及接收时刻,辅以激光的传播速度,可以计算出所述第一高度。
在其他实施例中,所述无人机可以通过红外传感器、近距离传感器等获取所述第一高度,在此不做赘述。
步骤402,获取预置的参考高度。
在一些实施例中,所述预置的参考高度可以用于计算所述无人机的降落速度,所述参考高度被预置于所述无人机内,如所述计算机可读介质143a(见图1)、所述存储模块303(见图3)、或者所述存储器。参阅图5,图5为本发明提供的无人机自动降落实施例一的示意图。无人机500有自动降落功能。在一些实施例中,无人机500可以沿垂直于地面501的方向自动降落。在一些实施例中,无人机500可以根据参考高度,沿垂直于地面501的方向分段进行自动降落。例如:所述预置的参考高度包括第一参考高度H1及第二参考高度H2,当无人机500的高度大于或等于H1时,其将按照速度V1降落。当所述无人机500的高度大于H2小于H1时,其将按照速度V2降落。当所述无人机500的高度小于或等于H2时,其将按照速度V3降落。计算V1、V2、V3的公式为:
V1 = 常量,当 h ≥ H1;V2 = a·h + b,当 H2 < h < H1;V3 = 常量,当 h ≤ H2            (公式组1)
其中,h指所述无人机500的当前高度,a与b为常量。
在一些实施例中,H1为5米,H2为0.5米,无人机500将按照如下公式计算其降落速度。
V1 = V,当 h ≥ 5米;V2 = a·h + b,当 0.5米 < h < 5米;V3 = 0.4米/秒,当 h ≤ 0.5米            (公式组2)
其中,h指所述无人机500的当前高度,当所述无人机500的高度大于或等于5米时,其将按照速度V米/秒降落,V为一常量(如5米/秒、4米/秒)。当所述无人机500的高度大于0.5米且小于5米时,其将按照速度V2降落。速度V2为一变量,其线性相关于高度h。当所述无人机500的高度小于或等于0.5米时,其将按照速度V3降落。在本实施例中,V3为0.4米/秒。
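上述分段降落速度的计算可以用如下Python示意代码表达。其中 V1 取4米/秒为示例取值,线性段参数 a、b 按“线性段与两端常量速度连续”的假设求出,均非专利原文给定的具体数值:

```python
def descent_speed(h, H1=5.0, H2=0.5, V1=4.0, V3=0.4):
    """按当前高度 h(米)分段计算降落速度(米/秒)的示意实现。

    h >= H1 时按常量 V1 降落;h <= H2 时按常量 V3 降落;
    中间段速度线性相关于高度 h。斜率 a 与截距 b 由
    两端连续性假设求出(本示例的假设,非原文给定)。
    """
    a = (V1 - V3) / (H1 - H2)  # 线性段斜率
    b = V3 - a * H2            # 线性段截距
    if h >= H1:
        return V1
    elif h > H2:
        return a * h + b
    else:
        return V3
```

按此取值,无人机在5米以上匀速下降,进入5米至0.5米区间后降落速度随高度线性减小,低于0.5米后以0.4米/秒缓慢触地。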
值得注意的是,上述对参考高度的描述仅为了便于理解本发明。对本领域的普通技术人员来说,在理解本发明的基础上,所述无人机或操控所述无人机的用户可以对所述第一参考高度及所述第二参考高度的值做出实时的修改与变换,但所述修改与变换仍在本发明的保护范围之内。例如:修改所述第一参考高度H1为10米,修改所述第二参考高度H2为1米。再如:用户可以对公式组1中的参数及常量做出修改与变换,但所述修改与变换仍在本发明的保护范围之内。
在一些实施例中,所述参考高度可以用于计算所述无人机的降落速度,并使所述无人机悬停。所述参考高度被预置于所述无人机内,如所述计算机可读介质143a(见图1)、所述存储模块303(见图3)、或者所述存储器。参阅图6,图6为本发明提供的无人机自动降落实施例二的示意图。无人机600有自动降落功能。在本实施例中,所述预置的参考高度为h,无人机的当前高度为H,无人机600可以沿垂直于地面601的方向自动降落一段距离后悬停于h。则无人机600从高度H降落到高度h的降落速度V可以根据如下公式计算:
V=H-h            (公式组3)
其中,无人机600的降落速度V线性相关于高度H,随着其高度不断降低而减小,最终使得无人机600悬停于所述预置的参考高度h。
在其他实施例中,无人机600的降落速度V可以非线性相关于高度H,如:无人机600的降落速度V可以按照如下公式计算:
V=H²-h            (公式组4)
其中,无人机600的降落速度V非线性相关于高度H,随着其高度不断降低而减小,最终使得无人机600悬停于高度√h。
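公式组3所描述的降落过程可以用如下数值模拟作一示意:降落速度 V=H−h 随高度降低线性减小,高度渐近收敛(悬停)于预置参考高度 h。其中欧拉积分的步长与步数均为本示例的假设:

```python
def simulate_hover(H0, h, dt=0.01, steps=2000):
    """按公式组3(V = H - h)模拟无人机从高度 H0 降至悬停高度 h。

    每个时间步内,以当前降落速度 V 下降 dt 秒;
    由于 V 线性相关于高度 H 并随之减小,高度呈指数式
    逼近参考高度 h,即无人机最终悬停于 h(数值示意)。
    """
    H = H0
    for _ in range(steps):
        V = H - h      # 当前降落速度,线性相关于高度 H
        H -= V * dt    # 以速度 V 下降 dt 秒
    return H
```

例如从10米高度、参考高度2米开始模拟,若干步后高度收敛到2米附近,不会降到参考高度以下,这正是“最终使得无人机悬停于所述预置的参考高度”的含义。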
步骤403,根据所述预置的参考高度对所述第一高度进行分析,以获取所述无人机的第一降落速度。
其中,所述无人机的第一降落速度可以根据上文中的公式组1、公式组2、公式组3、公式组4等计算,在此不作赘述。
在一些实施例中,所述第一降落速度可以为V1。在其他实施例中,所述第一降落速度可以为V2。
在一些实施例中,所述第一降落速度为常量,所述无人机可以匀速降落到一预定高度,或者匀速降落到地面。
步骤404,根据所述第一降落速度控制所述无人机运动。
在一些实施例中,所述无人机可以获取第二高度,所述第二高度小于或等于所述第二参考高度,所述无人机可以根据所述第二高度获取第二降落速度。在一些实施例中,所述第二降落速度为常量(如V3),所述无人机可以根据所述第二降落速度降落到地面。
在一些实施例中,所述无人机可以依据所述第一降落速度降落到一定高度(如3米)后,利用机载的环境传感器,检测和/或监测所述无人机底部的环境信息,以判断所述无人机的底部是否适合降落。所述传感器包括但不限于超声传感器、红外传感器、近距离传感器、微波传感器、摄像头、三维激光测距仪、三维TOF等。
在一些实施例中,所述无人机可以通过摄像头,如下视摄像头173和174(见图2)中的一个或两个,从不同角度拍摄所述无人机底部的图像。在一些实施例中,所述无人机可以通过一个滑动窗口在所述图像中选择一个目标区域。所述目标区域中每个像素的坐标信息(如每个像素的x、y、z坐标,z坐标表示所述像素的深度信息)可以被提取出来。在一些实施例中,所述无人机可以利用摄像头(如下视摄像头173和174)同时拍摄或相继拍摄两张图像,所述深度信息可以根据立体匹配技术(如semi-global block matching算法)被提取出来。所述无人机可以根据所述目标区域中像素的坐标信息从所述目标区域中选择一个最佳平面以及与所述最佳平面对应的价值函数。在一些实施例中,所述无人机可以通过利用算法,如:莱文贝格-马夸特方法(英文:Levenberg-Marquardt algorithm),来确定一个最佳平面及其对应的价值函数。之后,所述无人机将通过所述滑动窗口遍历所述图像中的剩余区域,生成多个最佳平面及与所述多个最佳平面对应的价值函数。所述多个最佳平面将被处理(如平滑处理)以生成一个最佳平面作为适合降落的地点。
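上述“在目标区域内拟合最佳平面并计算价值函数”的步骤,可以用最小二乘平面拟合作一简化示意。专利原文采用莱文贝格-马夸特等方法确定最佳平面,此处以残差平方和作为价值函数,函数名与接口均为本示例的假设:

```python
import numpy as np

def fit_plane_cost(points):
    """对滑动窗口内的像素点集 (x, y, z) 拟合平面 z = a·x + b·y + c。

    返回 (平面参数 [a, b, c], 价值函数)。价值函数取为拟合残差的
    平方和:残差越小说明该区域越接近平面,越适合作为降落地点
    (简化示意,非专利原文的具体算法)。
    """
    pts = np.asarray(points, dtype=float)
    # 构造最小二乘问题 A·[a, b, c]^T ≈ z
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coef, _, _, _ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    cost = float(np.sum((A @ coef - pts[:, 2]) ** 2))
    return coef, cost
```

对图像中每个滑动窗口各得到一个(平面,价值函数)对,遍历并比较这些价值函数,即可在整幅底部图像中筛选出最平整的区域作为候选降落地点。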
在本实施例中,所述无人机通过摄像头拍摄底部的图像,并根据所述图像重建出底部的三维地形,以从所述三维地形中选择一个适合降落的地点。本发明实施例的优点在于,可以通过所述三维地形自动选择可以降落的地点,安全平缓地降落,而不需要人为干预,从而保证无人机降落的安全性,避免损坏无人机或伤及他人。
参阅图7,图7为本发明实施例提供的无人机自动降落实施例三的示意图。无人机700可以通过以上实施例描述的方法,拍摄底部图像705,并依据所述图像705重建所述无人机700底部的三维地形,并在所述三维地形中选择合适的降落地点。例如:所述无人机700可以识别出水面701、小坡702以及平地703。所述无人机700可以判断所述水面701及所述小坡702不适合降落,并最终选择所述平地703降落。
在一些实施例中,如果所述三维地形中没有找到适合降落的地点,则所述无人机将保持高度不变,通过水平方向移动拍摄新的底部图片以获取新的三维地形,并尝试在新的三维地形中寻找合适的降落地点。
在一些实施例中,如果所述无人机始终找不到合适的降落地点,如在一个预定的时间区间内无法找到合适的降落地点,则所述无人机将会悬停在一个已知的高度,等待用户输入下一个指令。
在一些实施例中,如果所述无人机在获取的三维地形中找到了合适的降落地点,则所述无人机将直接降落至所述降落地点。
在一些实施例中,所述无人机可以获取所述无人机的传感器(如所述前视摄像头171和172、所述下视摄像头173和174、所述超声传感器177和178)的状态信息,如检测和/或监测所述传感器是否失效。例如,所述无人机可以向所述传感器发送一个查询信号,如果所述传感器没有返回应答信号,则所述无人机可以确定所述传感器失效。在其他实施例中,所述传感器可以周期性或非周期性地向所述无人机发送检测和/或监测到的信息,如果所述无人机在一个预定的时间区间内(如:60秒)未收到所述传感器发送的信息,则所述无人机可以判断所述传感器失效。
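上述“超过预定时间区间未收到传感器信息则判定其失效”的逻辑,可以用如下示意代码表达(类名与接口均为本示例的假设,非专利原文定义):

```python
class SensorWatchdog:
    """传感器状态监测的示意实现。

    传感器每次上报信息时调用 update() 记录时刻;
    若距最近一次上报超过 timeout 秒(默认60秒),
    则 failed() 判定传感器失效,此时无人机应悬停
    在已知高度,等待用户确认与降落指令。
    """

    def __init__(self, timeout=60.0):
        self.timeout = timeout
        self.last_seen = None

    def update(self, t):
        # 记录最近一次收到传感器信息的时刻
        self.last_seen = t

    def failed(self, now):
        # 从未收到信息,或超过预定时间区间未收到信息,均视为失效
        return self.last_seen is None or (now - self.last_seen) > self.timeout
```

查询信号的方式同理:发出查询后启动同样的超时判断,应答未在时限内返回即判定失效并触发悬停保护。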
在一些实施例中,如果所述无人机判断所述传感器失效,则所述无人机可以悬停在一个已知高度,等待用户确认降落地点是否安全,如果用户确认降落地点安全,可以向所述无人机发送一个降落指令。所述无人机收到降落指令后,将开始降落直到完成整个降落过程。
采用本发明实施例,在无人机的传感器失效时,可以及时采用保护机制,控制所述无人机悬停在一预定高度,等待用户输入的指令,从而确保了无人机可以安全降落,避免损坏无人机或伤及他人。
值得注意的是,上述流程图只是为了便于理解本发明,不应被视为是本发明唯一的实现方案。对本领域的普通技术人员来说,在理解本发明的基础上,可以对上述流程图中的步骤进行增加、删除以及变换,但所述对流程图的修改仍在本发明的保护范围之内。例如,用户可以修改所述参考高度。
以上所述仅为本发明的实施例,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。
本专利文件披露的内容包含受版权保护的材料。该版权为版权所有人所有。版权所有人不反对任何人复制专利与商标局的官方记录和档案中所存在的该专利文件或者该专利披露。
最后应说明的是:以上各实施例仅用以说明本披露的技术方案,而非对其限制;尽管参照前述各实施例对本披露进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本披露各实施例技术方案的范围。

Claims (24)

  1. 一种无人机的控制方法,其特征在于,所述方法包括:
    获取所述无人机的第一高度;
    获取预置的参考高度;
    根据所述预置的参考高度对所述第一高度进行分析,以获取所述无人机的第一降落速度;以及
    根据所述第一降落速度控制所述无人机运动。
  2. 根据权利要求1所述的方法,其特征在于,所述第一降落速度线性相关于所述预置的参考高度。
  3. 根据权利要求2所述的方法,其特征在于,所述预置的参考高度包括第一参考高度及第二参考高度,所述第二参考高度小于所述第一参考高度。
  4. 根据权利要求3所述的方法,其特征在于,所述根据所述第一降落速度控制所述无人机运动包括:
    根据所述第一降落速度控制所述无人机在所述第一参考高度及所述第二参考高度内运动。
  5. 根据权利要求2所述的方法,其特征在于,所述根据所述第一降落速度控制所述无人机降落包括:
    根据所述第一降落速度控制所述无人机悬停在所述预置的参考高度。
  6. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    获取所述无人机的第二高度,所述第二高度小于或等于所述第二参考高度;以及
    根据所述无人机的第二高度获取所述无人机的第二降落速度。
  7. 根据权利要求6所述的方法,其特征在于,所述第二降落速度小于所述第一降落速度。
  8. 根据权利要求7所述的方法,其特征在于,所述第二降落速度为常量。
  9. 根据权利要求5所述的方法,其特征在于,所述方法还包括:
    获取所述无人机的环境图像;
    从所述环境图像中提取降落地点;以及
    根据所述降落地点控制所述无人机降落。
  10. 根据权利要求9所述的方法,其特征在于,所述根据所述降落地点控制所述无人机降落包括:
    根据所述降落地点控制所述无人机降落到地面上对应所述降落地点的区域。
  11. 根据权利要求5所述的方法,其特征在于,所述方法还包括:
    获取所述无人机的环境图像;以及
    当没有降落地点从所述环境图像中提取出来时,控制所述无人机水平飞行。
  12. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    接收传感器的状态信息;以及
    根据所述状态信息控制所述无人机,使其悬停于预定高度。
  13. 一种无人机,其特征在于,所述无人机包括:
    传感器,用于获取所述无人机的第一高度;
    存储器,用于存储预置的参考高度;以及
    一个或多个处理器,用于:
    从所述存储器中调取所述预置的参考高度;
    根据所述预置的参考高度对所述第一高度进行分析,以获取所述无人机的第一降落速度;以及
    根据所述第一降落速度控制所述无人机运动。
  14. 根据权利要求13所述的无人机,其特征在于,所述第一降落速度线性相关于所述预置的参考高度。
  15. 根据权利要求14所述的无人机,其特征在于,所述预置的参考高度包括第一参考高度及第二参考高度,所述第二参考高度小于所述第一参考高度。
  16. 根据权利要求15所述的无人机,其特征在于,所述根据所述第一降落速度控制所述无人机运动包括:
    根据所述第一降落速度控制所述无人机在所述第一参考高度及所述第二参考高度内运动。
  17. 根据权利要求14所述的无人机,其特征在于,所述根据所述第一降落速度控制所述无人机降落包括:
    根据所述第一降落速度控制所述无人机悬停在所述预置的参考高度。
  18. 根据权利要求16所述的无人机,其特征在于,所述一个或多个处理器还用于:
    获取所述无人机的第二高度,所述第二高度小于或等于所述第二参考高度;以及
    根据所述无人机的第二高度获取所述无人机的第二降落速度。
  19. 根据权利要求18所述的无人机,其特征在于,所述第二降落速度小于所述第一降落速度。
  20. 根据权利要求18所述的无人机,其特征在于,所述第二降落速度为常量。
  21. 根据权利要求17所述的无人机,其特征在于,所述一个或多个处理器还用于:
    获取所述无人机的环境图像;
    从所述环境图像中提取降落地点;以及
    根据所述降落地点控制所述无人机降落。
  22. 根据权利要求21所述的无人机,其特征在于,所述根据所述降落地点控制所述无人机降落包括:
    根据所述降落地点控制所述无人机降落到地面上对应所述降落地点的区域。
  23. 根据权利要求17所述的无人机,其特征在于,所述一个或多个处理器还用于:
    获取所述无人机的环境图像;以及
    当没有降落地点从所述环境图像中提取出来时,控制所述无人机水平飞行。
  24. 根据权利要求13所述的无人机,其特征在于,所述一个或多个处理器还用于:
    接收传感器的状态信息;以及
    根据所述状态信息控制所述无人机,使其悬停于预定高度。
PCT/CN2016/100197 2016-09-26 2016-09-26 无人机及其控制方法 WO2018053867A1 (zh)
