CN112666969A - Unmanned aerial vehicle and control method thereof - Google Patents

Unmanned aerial vehicle and control method thereof

Info

Publication number
CN112666969A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
drone
height
landing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011464102.0A
Other languages
Chinese (zh)
Inventor
应佳行
彭昭亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to CN202011464102.0A priority Critical patent/CN112666969A/en
Publication of CN112666969A publication Critical patent/CN112666969A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/04 Control of altitude or depth
    • G05D 1/06 Rate of change of altitude or depth
    • G05D 1/0607 Rate of change of altitude or depth specially adapted for aircraft
    • G05D 1/0653 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D 1/0676 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 45/00 Aircraft indicators or protectors not otherwise provided for
    • B64D 45/04 Landing aids; Safety measures to prevent collision with earth's surface
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 47/00 Equipment not otherwise provided for

Abstract

The application discloses a control method for an unmanned aerial vehicle and an unmanned aerial vehicle, wherein the method comprises: detecting whether a sensor has failed while controlling the unmanned aerial vehicle to land; and, upon determining that the sensor has failed, controlling the unmanned aerial vehicle to hover at a predetermined height and wait for a user input instruction so that it can land safely.

Description

Unmanned aerial vehicle and control method thereof
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle and a control method thereof.
Background
Landing is a necessary part of every flight of an unmanned aerial vehicle, and the probability of an accident is comparatively high during landing. If the unmanned aerial vehicle lands at a place unsuitable for landing (e.g., in a tree, on a water surface, or on uneven ground), it is easily damaged. It is therefore necessary to introduce protective measures during landing to ensure that the unmanned aerial vehicle lands safely, avoiding damage to the vehicle or injury to others.
Existing landing methods generally descend directly: the unmanned aerial vehicle is already close to the ground when it begins landing, or no protective measures are applied during landing, and the ground environment is usually not sensed. This makes it easy for the unmanned aerial vehicle to land at an unsuitable place and to suffer a safety incident during landing.
Disclosure of Invention
The present application provides an unmanned aerial vehicle and a control method that at least partially solve the above technical problem.
One aspect of the present application provides a control method for an unmanned aerial vehicle, the unmanned aerial vehicle being provided with one or more sensors for monitoring its environmental information, the method comprising: detecting whether a sensor has failed while controlling the unmanned aerial vehicle to land; and, upon determining that the sensor has failed, controlling the unmanned aerial vehicle to hover at a predetermined height and wait for a user input instruction so that it can land safely.
In some embodiments, detecting whether the sensor has failed comprises: sending an inquiry signal to the sensor; and determining that the sensor has failed if no answer signal is received.
In some embodiments, detecting whether the sensor has failed comprises: determining that the sensor has failed if no information sent by the sensor is received within a predetermined time interval.
In some embodiments, the method further comprises: acquiring environmental information of the area below the drone through the sensor to judge whether the area below the drone is suitable for landing; and, if it is not suitable for landing, controlling the drone to hover at a known height and wait for the user to input a next instruction.
In some embodiments, controlling the drone to hover at a known height and waiting for the user to input a next instruction if the area below the drone is not suitable for landing comprises: controlling the drone to fly horizontally to search for a suitable landing place; and, when no suitable landing place is found within a preset time, controlling the drone to hover at a known height and wait for the next instruction input by the user.
In some embodiments, the method further comprises: acquiring an image of the environment below the drone through the sensor; traversing the image with a sliding window to select a plurality of target regions in the image; determining an optimal plane from each target region based on the coordinate information of its pixels and/or using an algorithm; and processing the plurality of optimal planes to generate a final optimal plane as a place suitable for landing.
In some embodiments, the method further comprises: acquiring the height of the drone; determining a landing speed of the drone according to the height of the drone and a preset reference height; and controlling the drone to move according to the landing speed.
In some embodiments, controlling the drone to move according to the landing speed comprises: controlling the drone to descend at the landing speed and hover at the preset reference height.
In some embodiments, the preset reference height comprises a first reference height and a second reference height, and determining the landing speed according to the height and the preset reference height comprises: determining the landing speed to be a first landing speed when the height of the drone is greater than or equal to the first reference height; a second landing speed when the height is less than the first reference height and greater than the second reference height; and a third landing speed when the height is less than or equal to the second reference height.
In some embodiments, the height of the drone is measured by an inertial measurement unit.
Another aspect of the present application provides an unmanned aerial vehicle, comprising: one or more sensors for monitoring environmental information of the unmanned aerial vehicle; a memory for storing instructions; and one or more processors configured to execute the instructions stored in the memory to implement any of the methods described above.
Drawings
In order to illustrate the embodiments of the present disclosure or the technical solutions of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a bottom of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 4 is a flowchart of a method for controlling an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 5 is a schematic view of a first embodiment of the automatic landing of the unmanned aerial vehicle provided by the invention;
fig. 6 is a schematic view of a second embodiment of the automatic landing of the unmanned aerial vehicle provided by the invention;
fig. 7 is a schematic view of a third embodiment of automatic landing of the unmanned aerial vehicle provided by the invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings. The described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
The terms "first," "second," and the like in the description, the claims, and the drawings are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. Terms so used are interchangeable under appropriate circumstances and serve only to distinguish objects of the same nature. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, so that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements and may include other elements not expressly listed or inherent to it.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
The present invention will be described in detail below with reference to the accompanying drawings and examples.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention. The drone 100 may include a fuselage 110, the fuselage 110 including a central portion 111 and one or more outer portions 112. In the embodiment shown in fig. 1, the fuselage 110 includes four outer portions 112 (e.g., arms 113), each extending from the central portion 111. In other embodiments, the fuselage 110 may include any number of outer portions 112 (e.g., 6, 8, etc.). In any of the above embodiments, each outer portion 112 may carry a propulsion system 120, and the propulsion systems 120 may drive the drone 100 in motion (e.g., climbing, landing, moving horizontally, etc.). For example, an arm 113 may carry a corresponding motor 121, and the motor 121 may drive a corresponding propeller 122 to rotate. The drone 100 may control any one set of motors 121 and their corresponding propellers 122 independently of the remaining motors 121 and their corresponding propellers.
The fuselage 110 may carry a load 130, such as an imaging device 131. In some embodiments, the imaging device 131 may include a camera that can capture images, video, and the like around the drone. The camera may be sensitive to light of various wavelengths, including but not limited to visible light, ultraviolet light, infrared light, or any combination thereof. In some embodiments, the load 130 may include other types of sensors. In some embodiments, the load 130 is coupled to the fuselage 110 via a carrier 150 so that the load 130 can move relative to the fuselage 110. For example, when the load 130 carries the imaging device 131, the imaging device 131 can move relative to the fuselage 110 to capture images, video, and the like around the drone 100. As shown, landing gear 114 may support the drone 100 when it is on the ground and protect the load 130.
In some embodiments, the drone 100 may include a control system 140 comprising components disposed on the drone 100 and components separate from it. For example, the control system 140 may include a first controller 141 disposed on the drone 100 and a second controller 142 remote from the drone 100 and connected to the first controller 141 via a communication link 160 (e.g., a wireless link). The first controller 141 may include one or more processors, memory, and an onboard computer-readable medium 143a, which may store program instructions for controlling the behavior of the drone 100, including but not limited to operating the propulsion system 120 and the imaging device 131 and controlling the drone's automatic landing. The second controller 142 may include one or more processors, memory, an off-board computer-readable medium 143b, and one or more input-output devices 148, such as a display device 144 and a control device 145. The operator of the drone 100 may control it remotely via the control device 145 and receive feedback from it via the display device 144 and/or other devices. In other embodiments, the drone 100 may operate autonomously, in which case the second controller 142 may be omitted, or may serve only to let the operator override the drone's autonomous flight functions. The onboard computer-readable medium 143a may be removable from the drone 100, and the off-board computer-readable medium 143b may be removable from the second controller 142.
In some embodiments, the drone 100 may include two forward looking cameras 171 and 172, the forward looking cameras 171 and 172 being photosensitive to various wavelengths of light (e.g., visible light, infrared light, ultraviolet light) for capturing images or video around the drone. In some embodiments, the drone 100 includes one or more sensors placed at the bottom.
Fig. 2 is a schematic structural diagram of a bottom of an unmanned aerial vehicle according to an embodiment of the present invention. The drone 100 may include two downward looking cameras 173 and 174 placed at the bottom of the fuselage 110. In addition, the drone 100 also includes two ultrasonic sensors 177 and 178 placed at the bottom of the fuselage 110. The ultrasonic sensors 177 and 178 can detect and/or monitor objects and the ground at the bottom of the drone 100 and measure the distance to the objects or the ground by sending and receiving ultrasonic waves.
In other embodiments, the drone 100 may include an inertial measurement unit (IMU), an infrared sensor, a microwave sensor, a temperature sensor, a proximity sensor, a three-dimensional laser rangefinder, a three-dimensional TOF (time-of-flight) sensor, and the like. The three-dimensional laser rangefinder and the three-dimensional TOF sensor can measure the distance to an object or surface below the drone.
In some embodiments, the inertial measurement unit may be used to measure the height of the drone. The inertial measurement unit may include, but is not limited to, one or more accelerometers, gyroscopes, magnetometers, or any combination thereof. The accelerometer may be used to measure the acceleration of the drone in order to calculate its velocity.
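By way of illustration (this sketch is not part of the original disclosure), velocity can be estimated from accelerometer samples by numerical integration; because integration drift accumulates quickly, a real system would fuse this estimate with other sensors. All names below are hypothetical:

    # Hypothetical sketch: estimate vertical velocity by integrating accelerometer
    # samples. Drift accumulates, so this is only usable over short intervals.
    def integrate_vertical_velocity(accel_z_samples, dt, v0=0.0, gravity=9.81):
        """accel_z_samples: vertical accelerations (m/s^2); dt: sample period (s)."""
        v = v0
        velocities = []
        for a_z in accel_z_samples:
            v += (a_z - gravity) * dt  # remove gravity to get net vertical acceleration
            velocities.append(v)
        return velocities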
In some embodiments, the drone may detect and/or monitor environmental information via the sensors to select a location suitable for landing. The environmental information includes, but is not limited to, the flatness of the ground, whether it is a water surface, and the like.
In some embodiments, the drone may capture images of the environment via a camera and extract depth information from them to reconstruct the three-dimensional terrain of the environment (e.g., the terrain below the drone) and select a location suitable for landing from that terrain.
In some embodiments, the drone may land automatically, until it rests on the ground below, based on environmental information (e.g., altitude) detected and/or monitored by the sensors. For example, the drone may land in segments, with a different landing speed in each segment; the number of segments is not limited here and may be any number (e.g., 2, 3, 4, etc.).
In some embodiments, the drone may descend to a preset height and hover there. After hovering, the drone may use its sensors to detect the environmental information below it, select a suitable landing place, and land automatically.
Fig. 3 is a schematic block diagram of an unmanned aerial vehicle according to an embodiment of the present invention. Referring to fig. 3, the drone 100 may include a control module 301 (comprising one or more processors), a sensor module 302, a storage module 303, and an input-output module 304.
The control module 301 may include one or more processors, including but not limited to a microcontroller, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physical processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), and the like.
The sensor module 302 may include one or more sensors including, but not limited to, temperature sensors, inertial measurement units, accelerometers, image sensors (e.g., cameras), ultrasonic sensors, microwave sensors, proximity sensors, three-dimensional laser rangefinders, infrared sensors, and the like.
In some embodiments, the inertial measurement unit may be used to measure the height of the drone. The inertial measurement unit may include, but is not limited to, one or more accelerometers, gyroscopes, magnetometers, or any combination thereof. The accelerometer may be used to measure the acceleration of the drone in order to calculate its velocity.
The storage module 303 may include, but is not limited to, a read-only memory (ROM), a random-access memory (RAM), a programmable read-only memory (PROM), an electrically erasable programmable read-only memory (EEPROM), and the like. The storage module 303 may include a non-transitory computer-readable medium that stores code, logic, or instructions for performing one or more of the steps described elsewhere herein. The control module 301 may perform one or more of those steps, individually or collectively, according to the code, logic, or instructions of the non-transitory computer-readable medium.
The input-output module 304 is configured to exchange information or instructions with external devices, for example to receive instructions sent by the input-output device 148 (see fig. 1) or to send images captured by the imaging device 131 (see fig. 1) to the input-output device 148.
In some embodiments, the control module 301 may control the drone 100 to land based on information detected and/or monitored by the sensor module 302. For example, the control module 301 may calculate the landing speed of the drone 100 based on the altitude detected and/or monitored by the sensor module 302. For another example, the control module 301 may reconstruct a three-dimensional terrain of the bottom of the drone 100 according to the image or video captured by the sensor module 302, and select a suitable landing point in the three-dimensional terrain to control the drone 100 to land.
Fig. 4 is a flowchart of an unmanned aerial vehicle control method according to an embodiment of the present invention. Referring to fig. 4, the process is executed by the control system of the drone together with one or more sensors. The control system may receive data detected and/or monitored by the one or more sensors and control the motion of the drone according to that data.
Step 401, obtaining a first height of the unmanned aerial vehicle.
In some embodiments, the drone may acquire the first height via one or more ultrasonic sensors, such as one or both of the ultrasonic sensors 177 and 178 (see fig. 2). Specifically, the ultrasonic sensors 177 and 178 may transmit ultrasonic waves toward the ground; the waves are received by the drone after being reflected by the ground, and the first height can be calculated from the transmission time, the reception time, and the propagation speed of sound.
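As a minimal illustration of this time-of-flight calculation (not part of the original disclosure; the speed of sound of 343 m/s and the function names are our assumptions):

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

    def height_from_ultrasonic(t_transmit, t_receive, c=SPEED_OF_SOUND):
        """The pulse travels to the ground and back, so halve the round-trip distance."""
        return c * (t_receive - t_transmit) / 2.0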
In other embodiments, the drone may acquire the first height via one or more cameras, such as one or more of the forward-looking cameras 171 and 172 and the downward-looking cameras 173 and 174. As shown in fig. 2, the downward-looking cameras 173 and 174 are mounted at the bottom of the drone to capture images and/or video of the area below it. The drone can extract depth information from the images and/or video using stereo matching, reconstruct the three-dimensional terrain below the drone from that depth information, and thereby obtain the first height. In other embodiments, the drone may acquire the first height using an external camera (e.g., the imaging device 131).
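By way of illustration (not part of the patent disclosure), the following minimal Python sketch computes depth from a rectified stereo pair using OpenCV's semi-global block matching, the algorithm named later in this description; the parameter values and the pinhole relation Z = f*B/d are illustrative assumptions:

    import cv2
    import numpy as np

    def depth_map(left_gray, right_gray, focal_px, baseline_m, num_disp=64, block=9):
        """Compute a depth map (meters) from a rectified grayscale stereo pair."""
        matcher = cv2.StereoSGBM_create(minDisparity=0,
                                        numDisparities=num_disp,
                                        blockSize=block)
        # compute() returns fixed-point disparities scaled by 16
        disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
        disp[disp <= 0] = np.nan             # mask invalid matches
        return focal_px * baseline_m / disp  # pinhole model: Z = f * B / d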
In other embodiments, the drone may transmit microwaves toward the ground through the microwave sensor; the microwaves are received by the drone after being reflected by the ground, and the first height can be calculated from the transmission time, the reception time, and the propagation speed of the microwaves.
In other embodiments, the drone may acquire the first height through one or more laser rangefinders. Specifically, a laser rangefinder can be installed at the bottom of the drone to emit laser light toward the ground; the laser is received by the drone after being reflected by the ground, and the first height can be calculated from the emission time, the reception time, and the propagation speed of light.
In other embodiments, the drone may acquire the first height through an infrared sensor, a proximity sensor, and the like, which is not described here again.
At step 402, a preset reference height is obtained.
In some embodiments, the preset reference height may be used to calculate the landing speed of the drone; the reference height is preset within the drone, for example in the computer-readable medium 143a (see fig. 1), the storage module 303 (see fig. 3), or the memory. Referring to fig. 5, fig. 5 is a schematic view of a first embodiment of the automatic landing of the unmanned aerial vehicle provided by the invention. The drone 500 has an automatic landing function. In some embodiments, the drone 500 may automatically land in a direction perpendicular to the ground 501. In some embodiments, the drone 500 may automatically land in segments, in a direction perpendicular to the ground 501, according to the reference heights. For example, the preset reference heights comprise a first reference height H1 and a second reference height H2. When the height of the drone 500 is greater than or equal to H1, it descends at a speed V1; when its height is less than H1 and greater than H2, it descends at a speed V2; and when its height is less than or equal to H2, it descends at a speed V3. V1, V2, and V3 are calculated as:
V = V1 (a constant), for h ≥ H1
V = a·h + b, for H2 < h < H1
V = V3 (a constant), for h ≤ H2
(formula group 1)
where h denotes the current height of the drone 500, and a and b are constants.
In some embodiments, H1 is 5 meters and H2 is 0.5 meters, and the drone 500 calculates its landing speed as follows.
V = V1 (a constant, e.g., 5 m/s or 4 m/s), for h ≥ 5 m
V = a·h + b, for 0.5 m < h < 5 m
V = 0.4 m/s, for h ≤ 0.5 m
(formula group 2)
where h denotes the current height of the drone 500. When the height of the drone 500 is greater than or equal to 5 meters, it descends at a constant speed V1 (e.g., 5 m/s or 4 m/s). When its height is greater than 0.5 meters and less than 5 meters, it descends at a speed V2, a variable linearly related to the height h. When its height is less than or equal to 0.5 meters, it descends at a speed V3; in this embodiment, V3 is 0.4 m/s.
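A sketch of this three-segment schedule (H1 = 5 m, H2 = 0.5 m, and V3 = 0.4 m/s follow the embodiment above; choosing the middle segment's constants a and b so that the speed is continuous at both boundaries is our assumption, since the text leaves them unspecified):

    H1, H2 = 5.0, 0.5  # first and second reference heights (m)
    V1, V3 = 4.0, 0.4  # V1 is a constant such as 4 or 5 m/s; V3 is 0.4 m/s

    def landing_speed(h):
        """Piecewise landing speed as a function of the current height h (m)."""
        if h >= H1:
            return V1
        if h > H2:
            # V2 = a*h + b, with a and b fixed here by continuity at H1 and H2
            a = (V1 - V3) / (H1 - H2)
            b = V3 - a * H2
            return a * h + b
        return V3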
It is noted that the above description of the reference heights is provided only to aid understanding of the present invention. On the basis of understanding the present invention, those skilled in the art, the drone, or the user operating the drone may modify the values of the first reference height and the second reference height in real time, and such modifications remain within the protection scope of the present invention. For example, the first reference height H1 may be modified to 10 meters and the second reference height H2 to 1 meter. Likewise, the user may modify the parameters and constants in formula group 1, and such modifications remain within the protection scope of the present invention.
In some embodiments, the reference height may be used both to calculate the landing speed of the drone and to make the drone hover. The reference height is preset within the drone, for example in the computer-readable medium 143a (see fig. 1), the storage module 303 (see fig. 3), or the memory. Referring to fig. 6, fig. 6 is a schematic view of a second embodiment of the automatic landing of the unmanned aerial vehicle according to the present invention. The drone 600 has an automatic landing function. In this embodiment, the preset reference height is H and the current height of the drone is h; the drone 600 can automatically descend for a certain distance in a direction perpendicular to the ground 601 and then hover at the height H. The landing speed V of the drone 600 while descending from the height h to the height H may be calculated according to the following formula:
V = H - h
(formula group 3)
The landing speed V of the drone 600 is linearly related to the height h, and its magnitude decreases as the height decreases, so that the drone 600 finally hovers at the preset reference height H.
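A sketch of this proportional descent law (not part of the original disclosure), reading V as a signed vertical velocity in a z-up frame (negative while descending, zero once h reaches H); read_height and command_vertical_velocity are hypothetical interfaces:

    import time

    def descend_and_hover(H, read_height, command_vertical_velocity, dt=0.05):
        """Apply V = H - h (formula group 3) until the drone hovers at height H."""
        while True:
            h = read_height()
            if abs(h - H) < 0.05:               # close enough to the reference height
                command_vertical_velocity(0.0)  # hold position: hover
                break
            command_vertical_velocity(H - h)    # negative while h > H, shrinking to zero
            time.sleep(dt)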
In other embodiments, the landing speed V of the drone 600 may be non-linearly related to the height H, for example calculated according to the following formula:
V = H² - h
(formula group 4)
wherein the landing speed V of the drone 600 is non-linearly related to the height H, its magnitude decreasing as the height decreases, eventually making the drone 600 hover at the height H².
Step 403, analyzing the first height according to the preset reference height to obtain a first landing speed of the drone.
The first landing speed of the drone can be calculated according to formula group 1, formula group 2, formula group 3, formula group 4, etc. above, which is not repeated here.
In some embodiments, the first landing speed may be V1. In other embodiments, the first landing speed may be V2.
In some embodiments, the first landing speed is a constant, and the drone may descend to a predetermined height at that constant speed, or descend to the ground at that constant speed.
Step 404, controlling the drone to move according to the first landing speed.
In some embodiments, the drone may acquire a second height that is less than or equal to the second reference height, and obtain a second landing speed from it. In some embodiments, the second landing speed is a constant (e.g., V3), and the drone may descend to the ground at the second landing speed.
In some embodiments, after descending to a certain height (e.g., 3 meters) at the first landing speed, the drone may use onboard environmental sensors to detect and/or monitor the environmental information below it, so as to determine whether the area below is suitable for landing. The environmental sensors include, but are not limited to, ultrasonic sensors, infrared sensors, proximity sensors, microwave sensors, cameras, three-dimensional laser rangefinders, three-dimensional TOF sensors, and the like.
In some embodiments, the drone may capture images of the area below it from different angles through a camera, such as one or both of the cameras 173 and 174 (see fig. 2). In some embodiments, the drone may select a target area in the image through a sliding window and extract coordinate information for each pixel in the target area (e.g., x, y, z coordinates, the z coordinate representing the pixel's depth). In some embodiments, the drone may take two images simultaneously or sequentially with the cameras (e.g., cameras 173 and 174), and the depth information may be extracted using a stereo matching technique (e.g., a semi-global block matching algorithm). The drone can then select an optimal plane, and a cost function corresponding to that plane, from the target area according to the coordinate information of its pixels. In some embodiments, the drone may determine the optimal plane and its corresponding cost function using an algorithm such as the Levenberg-Marquardt algorithm. The drone then traverses the remaining area of the image with the sliding window to generate a plurality of optimal planes and their corresponding cost functions. The multiple optimal planes are processed (e.g., smoothed) to generate a final optimal plane as the place suitable for landing.
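A simplified sketch of the sliding-window plane search (not part of the original disclosure): fit a plane z = a·x + b·y + c to each window of a depth map by linear least squares, a simpler stand-in for the Levenberg-Marquardt optimization mentioned above, and use the mean squared residual as the window's cost:

    import numpy as np

    def flattest_window(depth, win=32, stride=16):
        """Return (cost, top-left corner, plane coefficients) of the flattest window.
        Assumes a dense, NaN-free depth map."""
        best = None
        rows, cols = depth.shape
        y, x = np.mgrid[0:win, 0:win]
        A = np.column_stack([x.ravel(), y.ravel(), np.ones(win * win)])
        for r in range(0, rows - win + 1, stride):
            for c in range(0, cols - win + 1, stride):
                z = depth[r:r + win, c:c + win].ravel()
                coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
                cost = np.mean((A @ coeffs - z) ** 2)  # residual as the plane's cost
                if best is None or cost < best[0]:
                    best = (cost, (r, c), coeffs)
        return best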
In this embodiment, the drone captures an image of the area below it with the camera and reconstructs the three-dimensional terrain of that area from the image, so as to select a place suitable for landing from the terrain. The advantage of this embodiment is that a landing place can be selected automatically from the three-dimensional terrain, so that the landing proceeds safely and smoothly without human intervention, ensuring the safety of the landing and avoiding damage to the drone or injury to others.
Referring to fig. 7, fig. 7 is a schematic view of a third embodiment of the automatic landing of the unmanned aerial vehicle according to an embodiment of the present invention. The drone 700 may capture an image 705 of the area below it by the method described in the above embodiments, reconstruct the three-dimensional terrain below the drone 700 from the image 705, and select a suitable touchdown point in that terrain. For example, the drone 700 may identify the water surface 701, the hill 702, and the flat ground 703; determine that the water surface 701 and the hill 702 are not suitable for landing; and finally select the flat ground 703 for landing.
In some embodiments, if no suitable landing place is found in the three-dimensional terrain, the drone remains at the same height, moves horizontally to capture a new image of the area below and acquire a new three-dimensional terrain, and attempts to find a suitable landing place in the new terrain.
In some embodiments, if the drone cannot find a suitable touchdown point, for example within a predetermined time interval, it hovers at a known height and waits for the user to enter the next instruction.
In some embodiments, if the drone finds a suitable touchdown point in the acquired three-dimensional terrain, the drone will land directly to the touchdown point.
In some embodiments, the drone may acquire status information of its sensors (e.g., the forward-looking cameras 171 and 172, the downward-looking cameras 173 and 174, and the ultrasonic sensors 177 and 178) in order to detect and/or monitor whether a sensor has failed. For example, the drone may send an inquiry signal to a sensor and determine that the sensor has failed if the sensor does not return an acknowledgement signal. In other embodiments, a sensor may periodically or aperiodically transmit detected and/or monitored information to the drone, and the drone may determine that the sensor has failed if it does not receive such information within a predetermined time interval (e.g., 60 seconds).
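A sketch combining the two failure checks just described (not part of the original disclosure; sensor.query and sensor.last_message_time are hypothetical interfaces, and the 60-second window follows the example above):

    import time

    def sensor_failed(sensor, timeout_s=60.0, ack_wait_s=1.0):
        """True if the sensor fails either the inquiry/answer or the heartbeat check."""
        # Check 1: active inquiry -- no answer signal means the sensor has failed.
        if sensor.query(wait=ack_wait_s) is None:
            return True
        # Check 2: passive heartbeat -- no data within the window means failure.
        return (time.monotonic() - sensor.last_message_time()) > timeout_s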
In some embodiments, if the drone determines that a sensor has failed, it may hover at a known height and wait for the user to confirm whether the touchdown point is safe; if the user confirms that the touchdown point is safe, a landing instruction may be sent to the drone. After receiving the landing instruction, the drone begins to descend until the entire landing process is complete.
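The overall protective flow might be sketched as follows (hover on failure, descend only after the user confirms the touchdown point; every interface here is hypothetical, and sensor_failed is the sketch above):

    def protected_landing(drone, sensors):
        """Descend step by step, hovering whenever a sensor failure is detected."""
        while not drone.on_ground():
            if any(sensor_failed(s) for s in sensors):
                drone.hover()                     # hold the current, known height
                if drone.await_user_command() == "land":
                    drone.descend_step()          # user confirmed the site is safe
            else:
                drone.descend_step()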
With the embodiments of the present invention, when a sensor of the drone fails, a protection mechanism can be engaged in time to control the drone to hover at a predetermined height and wait for an instruction input by the user, thereby ensuring that the drone lands safely and avoiding damage to the drone or injury to others.
It should be noted that the above flowcharts are provided only to aid understanding of the present invention and should not be regarded as its only implementation. It will be apparent to those skilled in the art that steps in the above flowcharts can be added, deleted, or changed without departing from the scope of the invention, and such modifications to the flowcharts remain within the scope of the invention. For example, the user may modify the reference height.
The above description is only an embodiment of the present invention and is not intended to limit its scope; all equivalent structural or process transformations made using the contents of the specification and drawings, whether applied directly or indirectly in other related technical fields, are likewise included within the protection scope of the present invention.
The disclosure of this patent document contains material which is subject to copyright protection. The copyright is owned by the copyright owner. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records.
Finally, it should be noted that the above embodiments are merely illustrative of the technical solutions of the present disclosure and are not restrictive. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure.

Claims (11)

1. A method of controlling a drone, the drone being provided with one or more sensors for monitoring environmental information of the drone, the method comprising:
detecting whether a sensor has failed while controlling the drone to land;
upon determining that the sensor has failed, controlling the drone to hover at a predetermined height and wait for a user input instruction, so that the drone can land safely.
2. The method of claim 1, wherein detecting whether the sensor has failed comprises:
sending an inquiry signal to the sensor;
determining that the sensor has failed if no answer signal is received.
3. The method of claim 1, wherein detecting whether the sensor has failed comprises:
determining that the sensor has failed if no information sent by the sensor is received within a predetermined time interval.
4. The method of claim 1, further comprising:
acquiring environmental information of the area below the drone through the sensor to judge whether the area below the drone is suitable for landing;
if the area below the drone is not suitable for landing, controlling the drone to hover at a known height and wait for the user to input a next instruction.
5. The method of claim 4, wherein controlling the drone to hover at a known height and waiting for the user to input a next instruction if the area below the drone is not suitable for landing comprises:
if the area below the drone is not suitable for landing, controlling the drone to fly horizontally to search for a suitable landing place;
when no suitable landing place is found within a preset time, controlling the drone to hover at a known height and wait for the next instruction input by the user.
6. The method according to claim 4 or 5, further comprising:
acquiring an image of the environment below the drone through the sensor;
traversing the image with a sliding window to select a plurality of target regions in the image;
determining an optimal plane from each of the target regions based on coordinate information of pixels in the target region and/or using an algorithm;
processing the plurality of optimal planes to generate a final optimal plane as a place suitable for landing.
7. The method of claim 1, further comprising:
acquiring the height of the drone;
determining a landing speed of the drone according to the height of the drone and a preset reference height;
controlling the drone to move according to the landing speed.
8. The method of claim 7, wherein controlling the drone to move according to the landing speed comprises:
controlling the drone to descend at the landing speed and hover at the preset reference height.
9. The method of claim 7, wherein the preset reference height comprises a first reference height and a second reference height;
and determining the landing speed of the drone according to the height and the preset reference height comprises:
determining the landing speed of the drone to be a first landing speed when the height of the drone is greater than or equal to the first reference height;
determining the landing speed of the drone to be a second landing speed when the height of the drone is less than the first reference height and greater than the second reference height;
determining the landing speed of the drone to be a third landing speed when the height of the drone is less than or equal to the second reference height.
10. The method of claim 7, wherein the height of the drone is measured by an inertial measurement unit.
11. A drone, characterized in that it comprises:
one or more sensors for monitoring environmental information of the drone;
a memory to store instructions;
one or more processors configured to execute instructions stored in the memory to implement the method of any one of claims 1-10.
CN202011464102.0A 2016-09-26 2016-09-26 Unmanned aerial vehicle and control method thereof Pending CN112666969A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011464102.0A CN112666969A (en) 2016-09-26 2016-09-26 Unmanned aerial vehicle and control method thereof

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202011464102.0A CN112666969A (en) 2016-09-26 2016-09-26 Unmanned aerial vehicle and control method thereof
PCT/CN2016/100197 WO2018053867A1 (en) 2016-09-26 2016-09-26 Unmanned aerial vehicle, and control method thereof
CN201680004278.3A CN107438567A (en) 2016-09-26 2016-09-26 Unmanned plane and its control method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201680004278.3A Division CN107438567A (en) 2016-09-26 2016-09-26 Unmanned plane and its control method

Publications (1)

Publication Number Publication Date
CN112666969A true CN112666969A (en) 2021-04-16

Family

ID=60458671

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011464102.0A Pending CN112666969A (en) 2016-09-26 2016-09-26 Unmanned aerial vehicle and control method thereof
CN201680004278.3A Pending CN107438567A (en) 2016-09-26 2016-09-26 Unmanned plane and its control method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201680004278.3A Pending CN107438567A (en) 2016-09-26 2016-09-26 Unmanned plane and its control method

Country Status (2)

Country Link
CN (2) CN112666969A (en)
WO (1) WO2018053867A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023044897A1 (en) * 2021-09-27 2023-03-30 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method and apparatus, unmanned aerial vehicle, and storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116860003A (en) * 2017-12-18 2023-10-10 深圳市大疆创新科技有限公司 Flight control method of agricultural unmanned aerial vehicle, radar system and agricultural unmanned aerial vehicle
CN107943090A (en) * 2017-12-25 2018-04-20 广州亿航智能技术有限公司 The landing method and system of a kind of unmanned plane
CN109383826A (en) * 2018-10-09 2019-02-26 成都戎创航空科技有限公司 Rotor wing unmanned aerial vehicle lands auxiliary system automatically
CN115840461A (en) 2019-07-18 2023-03-24 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle safety protection method and device and unmanned aerial vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808682A (en) * 2015-03-10 2015-07-29 成都市优艾维机器人科技有限公司 Small rotor unmanned aerial vehicle autonomous obstacle avoidance flight control system and control method
CN105676873A (en) * 2016-03-08 2016-06-15 览意科技(上海)有限公司 Automatic landing method and control system of unmanned aerial vehicle
CN105700551A (en) * 2016-01-27 2016-06-22 浙江大华技术股份有限公司 An unmanned aerial vehicle landing area determination method, an unmanned aerial vehicle landing method and related apparatuses
CN105867181A (en) * 2016-04-01 2016-08-17 腾讯科技(深圳)有限公司 Control method and apparatus of unmanned aerial vehicle
CN105867405A (en) * 2016-05-23 2016-08-17 零度智控(北京)智能科技有限公司 UAV (unmanned aerial vehicle) as well as UAV landing control method and device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101136120B1 (en) * 2010-03-29 2012-04-17 한국항공우주산업 주식회사 Departure and landing path guide system for UAV
JP2015024705A (en) * 2013-07-25 2015-02-05 有限会社Gen Corporation Automatic landing/taking-off control method of small electric helicopter
CN104166355B (en) * 2014-07-16 2017-12-08 深圳市大疆创新科技有限公司 Electronic unmanned plane and its intelligent power guard method
CN104309803B (en) * 2014-10-27 2017-07-21 广州极飞科技有限公司 The automatic landing system of rotor craft and method
CN105518559A (en) * 2014-12-15 2016-04-20 深圳市大疆创新科技有限公司 Aircraft, take-off control method and system thereof and landing control method and system thereof
CN105599912A (en) * 2016-01-27 2016-05-25 谭圆圆 Automatic landing method and automatic landing device of unmanned aerial vehicle
CN105787192B (en) * 2016-03-15 2020-06-23 联想(北京)有限公司 Information processing method and aircraft
CN105652887A (en) * 2016-03-22 2016-06-08 临沂高新区翔鸿电子科技有限公司 Unmanned aerial vehicle landing method adopting two-level graph recognition
KR101651600B1 (en) * 2016-04-29 2016-08-29 공간정보기술 주식회사 Unmanned aerial drone having automatic landing function by stereo camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808682A (en) * 2015-03-10 2015-07-29 成都市优艾维机器人科技有限公司 Small rotor unmanned aerial vehicle autonomous obstacle avoidance flight control system and control method
CN105700551A (en) * 2016-01-27 2016-06-22 浙江大华技术股份有限公司 An unmanned aerial vehicle landing area determination method, an unmanned aerial vehicle landing method and related apparatuses
CN105676873A (en) * 2016-03-08 2016-06-15 览意科技(上海)有限公司 Automatic landing method and control system of unmanned aerial vehicle
CN105867181A (en) * 2016-04-01 2016-08-17 腾讯科技(深圳)有限公司 Control method and apparatus of unmanned aerial vehicle
CN105867405A (en) * 2016-05-23 2016-08-17 零度智控(北京)智能科技有限公司 UAV (unmanned aerial vehicle) as well as UAV landing control method and device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023044897A1 (en) * 2021-09-27 2023-03-30 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method and apparatus, unmanned aerial vehicle, and storage medium

Also Published As

Publication number Publication date
WO2018053867A1 (en) 2018-03-29
CN107438567A (en) 2017-12-05

Similar Documents

Publication Publication Date Title
JP6609833B2 (en) Method and system for controlling the flight of an unmanned aerial vehicle
US20210065400A1 (en) Selective processing of sensor data
CN110525650B (en) Unmanned aerial vehicle and control method thereof
US11822353B2 (en) Simple multi-sensor calibration
CN112666969A (en) Unmanned aerial vehicle and control method thereof
US20230343087A1 (en) Automatic terrain evaluation of landing surfaces, and associated systems and methods
US20190220039A1 (en) Methods and system for vision-based landing
JP6673371B2 (en) Method and system for detecting obstacle using movable object
ES2910700T3 (en) Automatic surface inspection system and procedure
JP6181300B2 (en) System for controlling the speed of unmanned aerial vehicles
EP3158411B1 (en) Sensor fusion using inertial and image sensors
CN111033561A (en) System and method for navigating a robotic device using semantic information
US10565887B2 (en) Flight initiation proximity warning system
JP2017501915A (en) Inline sensor calibration method and calibration apparatus
CN110268356A (en) The system of leading unmanned plane
CN110226143A (en) The method of leading unmanned plane
WO2017160192A1 (en) Method for precision landing an unmanned aerial vehicle
JP7155062B2 (en) Surveillance system and flying robot
JP7317684B2 (en) Mobile object, information processing device, and imaging system
JP2024012827A (en) Unmanned aircraft coping system and video tracking device
JP2021047738A (en) Moving object, flight path control method and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination