CN111615677B - Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium - Google Patents

Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium

Info

Publication number
CN111615677B
CN111615677B (application CN201880066282.1A)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
return
point
safe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880066282.1A
Other languages
Chinese (zh)
Other versions
CN111615677A (en)
Inventor
李劲松
张立天
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority claimed to CN202410267172.9A (published as CN118092500A)
Publication of CN111615677A
Application granted
Publication of CN111615677B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 47/00: Equipment not otherwise provided for


Abstract

A safe landing method and device for an unmanned aerial vehicle, the unmanned aerial vehicle, and a computer storage medium. The method comprises: when the unmanned aerial vehicle loses its navigation signal, determining a return target position and a first return path for the unmanned aerial vehicle (S101); controlling the unmanned aerial vehicle to return based on the first return path and the return target position (S102); during the return, when the current position of the unmanned aerial vehicle is within a first preset range of the return target position, detecting safe landing points and recording a safe landing point at which the unmanned aerial vehicle can land (S103); and landing according to the recorded safe landing point (S104). The method helps guarantee the safe landing of the unmanned aerial vehicle.

Description

Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium
Technical Field
The embodiments of the present invention relate to the field of computer technology, and in particular to a safe landing method and device for an unmanned aerial vehicle, the unmanned aerial vehicle, and a medium.
Background
During autonomous operation of an agricultural unmanned aerial vehicle, GNSS signals may be lost due to environmental interference, hardware faults, or other causes, so that the vehicle cannot obtain position information. To allow the unmanned aerial vehicle to return after losing GNSS signals, it can return according to position information provided by the binocular vision module mounted beneath the airframe.
However, because lighting, the environment, and other factors can prevent the binocular vision module from providing an accurate position, large deviations can occur during the return. As a result of these deviations, the unmanned aerial vehicle may land on uneven ground or in water, and thus lacks any guarantee of a safe landing.
Disclosure of Invention
The embodiments of the present invention provide a safe landing method and device for an unmanned aerial vehicle, the unmanned aerial vehicle, and a medium, which help guarantee the safe landing of the unmanned aerial vehicle.
A first aspect of an embodiment of the present invention provides a method for safely landing an unmanned aerial vehicle, including:
when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to return based on the first return route and the return target position;
when the current position of the unmanned aerial vehicle is in a first preset range from the return target position in the return process, detecting a safe landing point, and recording the safe landing point for the unmanned aerial vehicle to land;
landing is performed according to the recorded safe landing points.
A second aspect of an embodiment of the present invention provides a safety landing apparatus applied to an unmanned aerial vehicle, wherein the safety landing apparatus includes a memory and a processor;
the memory is used for storing program codes;
the processor invokes the program code, which when executed, is operable to:
when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to return based on the first return route and the return target position;
when the current position of the unmanned aerial vehicle is in a first preset range from the return target position in the return process, detecting a safe landing point, and recording the safe landing point for the unmanned aerial vehicle to land;
landing is performed according to the recorded safe landing points.
A third aspect of an embodiment of the present invention provides an unmanned aerial vehicle, including:
a body;
a power system, arranged on the body, for providing power to the unmanned aerial vehicle;
and a safety landing apparatus as described in the second aspect.
In the embodiments of the present invention, when the unmanned aerial vehicle loses its navigation signal, the return target position and first return path are determined, and the vehicle is controlled to return accordingly. When its current position is detected to be within the first preset range of the return target position, safe landing points are detected and recorded, so that the unmanned aerial vehicle can land based on a recorded safe landing point. This provides a reliable guarantee for a safe landing, improves the efficiency of determining a safe landing point, and effectively saves the processing resources of the unmanned aerial vehicle.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of a safe landing method of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a method for safely landing an unmanned aerial vehicle according to another embodiment of the present invention;
fig. 3 is a schematic diagram of an observation area and an area to be detected of the unmanned aerial vehicle according to the embodiment of the present invention;
FIG. 4 is a schematic view of a back projection of the observation area and the area to be detected shown in FIG. 3 into a two-dimensional image according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an inclined flight attitude of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 6 is a schematic diagram of determining a safe landing point of an unmanned aerial vehicle according to a preset track according to an embodiment of the present invention;
fig. 7 is a schematic flow chart of a method for safely landing an unmanned aerial vehicle according to still another embodiment of the present invention;
fig. 8 is a schematic diagram of a return flow of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 9 is a schematic block diagram of a safety landing apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the present invention.
An unmanned aerial vehicle may lose its navigation signal during operation due to environmental interference, hardware faults, or other factors. The navigation signal of the drone may include a signal of a positioning sensor and/or a signal of a compass. The positioning sensor may comprise a Global Navigation Satellite System (GNSS) module provided on the drone. For example, an agricultural drone performing a pesticide-spraying operation may suffer environmental interference with its GNSS module (commonly described as the drone "losing satellites"), or a compass malfunction due to hardware failure, either of which causes the drone to lose its navigation signal. When the navigation signal is lost, the drone cannot obtain reliable coordinate or direction information from the GNSS module or compass, and therefore cannot execute an accurate return. To allow a drone that has lost its navigation signal to return to the return target position, or to a position near it, a visual return can be executed using the drone's binocular vision module. In some embodiments the return target position is called the Home point. The Home point may be the departure position of the drone, or a position set by the user (which need not be the departure position). When the drone executes the return, it takes the Home point as the return target position and flies toward it.
Currently, when the navigation signal is lost, the drone can take its departure position as the return target position, determine the first return direction from the most recently recorded direction toward the return target position before the signal was lost, and determine the first return path from the first return direction and the return target position. If no safe landing point is detected at the destination, a safe landing point must then be found again before landing.
With this current return method, the landing position is only checked upon reaching the return target position. If the check shows that the current position is not a safe landing point, the drone must determine a new flight route and search it for a safe landing point, which reduces the drone's efficiency in determining a safe landing point and wastes its processing resources.
In view of the above, the present application provides a safe landing method for an unmanned aerial vehicle that improves the efficiency of determining a safe landing point, effectively saves the drone's processing resources, and provides a reliable guarantee for a safe landing.
Referring to fig. 1, a schematic flow chart of a safe landing method of an unmanned aerial vehicle according to an embodiment of the present invention is shown in fig. 1, where the method may include:
s101, when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle.
In one embodiment, a positioning failure during flight may cause the drone to lose its navigation signal, for example when the drone's GNSS module fails. When the drone detects that the navigation signal is lost, the binocular vision module beneath it can be triggered to perform a visual return, so that the drone returns to a preset return target position. In one embodiment, the preset return target position is chosen from at least one preset return position; a preset return position may be, for example, a position preconfigured in the drone for battery replacement or flight operations, or the departure position of the drone. If the navigation signal is lost only because the positioning sensor's signal is lost (for example, a GNSS failure), the first return direction can still be determined with the compass, and the first return path is then determined from the return target position and that first return direction.
In one embodiment, a compass failure during flight may also cause the drone to lose its navigation signal. In that case the drone can perform a visual return using the binocular vision module, determining its direction during the return from the coordinate signal of the positioning sensor. The preset return target position may be a return position as described above. The drone can then determine its own position with a positioning sensor such as the GNSS module and determine the first return path from the determined return target position.
In another embodiment, if the navigation signal is lost because the signals of both the positioning sensor and the compass are lost (for example, both have failed), the drone can identify the surrounding environment with its vision module, locate and orient itself from that environment, and determine the first return path and the return target position from the position and direction information recorded before the navigation signal was lost.
While the drone flies from the return target position to the position where the navigation signal was lost, its positioning device operates normally and can update and record the flight position in real time. When the drone takes off, its return target position can be recorded. Thus, during normal flight, the working positioning device updates and records the drone's current position in real time, and from the current position and the return target position it updates and records, in real time, the direction from the drone toward the return target position.
For example, the drone may record its attitude in real time through its Inertial Measurement Unit (IMU) and thereby record its flight direction, obtaining the direction toward the return target position. When the drone loses the navigation signal, for example when the GNSS module encounters interference, fails, or has another problem, any position can be selected from the at least one preset return position as the return target position; the flight position last recorded before the signal loss can be taken as the return starting position; and the direction toward the return target position last recorded before the signal loss can be taken as the first return direction. The first return path can then be determined from the first return direction, the return target position, and the return starting position. The first return direction is the direction in which the return starting position points toward the return target position. For example, the drone may determine a straight-line path from the return starting position and the first return direction as the first return path toward the Home point; or it may use the original outbound flight path, determined from the return starting position, the Home point, and the flight positions recorded along the way, as the first return path; of course, the drone may also generate the first return path in other ways, which are not limited here.
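As an illustration, the straight-line option above can be sketched as follows. This is a minimal Python sketch in a local 2-D frame; the function names and waypoint spacing are assumptions for illustration, not part of the patent.

```python
import math

def first_return_direction(start, target):
    """Unit vector pointing from the return starting position toward the
    return target position (Home point), in a local 2-D frame (hypothetical)."""
    dx, dy = target[0] - start[0], target[1] - start[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)  # already at the target
    return (dx / dist, dy / dist)

def straight_line_return_path(start, target, step_m=5.0):
    """Waypoints every step_m metres along the straight first return path."""
    ux, uy = first_return_direction(start, target)
    dist = math.hypot(target[0] - start[0], target[1] - start[1])
    n = int(dist // step_m)
    waypoints = [(start[0] + ux * step_m * i, start[1] + uy * step_m * i)
                 for i in range(n + 1)]
    if waypoints[-1] != target:
        waypoints.append(target)  # always end exactly at the Home point
    return waypoints
```

The original-path option would instead replay the recorded outbound positions in reverse order.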
S102, controlling the unmanned aerial vehicle to return based on the first return route and the return target position.
In one embodiment, the drone may use any return position recorded in a positioning device such as the GNSS module as its return target position; such a recorded return position may be, for example, a position where the drone performs flight operations or battery replacement, a position where an agricultural drone performs pesticide spraying, or a position set by the user. In one embodiment, the drone can adjust the first return path in real time during the return, based on its current position, and fly toward the Home point (the return target position) as indicated by the first return path corresponding to each current position.
In some cases the first return path remains essentially unchanged, e.g. the drone flies from the return starting position directly toward the return target position. In other cases the first return path may change; for example, if the drone encounters an obstacle during the return and the obstacle-avoidance function alters the path, the drone may rejoin the first return path after bypassing the obstacle, or continue the return along the changed path.
When controlling the return based on the first return path and the return target position, the drone can also refer to position information provided by a visual odometer, from which it can determine its current position.
S103, when the current position of the unmanned aerial vehicle is within a first preset range from the return target position in the return process, safety landing point detection is carried out, and a safety landing point for the unmanned aerial vehicle to land is recorded.
While the drone performs a visual return after losing the navigation signal, accurate coordinate information cannot be obtained through GNSS, and errors of the visual odometer and/or compass during the return may prevent the drone from reaching the return target position exactly; its final position may only be near the return target position. Because a position near the return target position may not be suitable for a safe landing, the drone needs to determine a safe landing point at which it can land. To improve the efficiency of finding one, safe-landing-point detection can be performed on the drone's current position during the return, and any detected safe landing point recorded, so that the drone can land at a recorded safe landing point in step S104.
In one embodiment, the drone may start detecting safe landing points once its current position is within the first preset range of the return target position, and record (i.e. store) each detected safe landing point. Then, when the drone tries to land at or near the Home point, if that spot is detected to be unsafe, step S104 can be executed to land at a recorded safe landing point. This avoids repeatedly searching for a safe landing point, saves the drone's processing resources, and improves the success rate of landing at a safe point. For example, the drone may start detecting safe landing points when its current position is within 30 meters of the return target position and record the points detected, without landing at that time, while still flying toward the Home point.
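The record-while-returning behaviour can be sketched as below. The class, the 30 m threshold constant, and the per-tick is_safe flag are illustrative assumptions, not names from the patent.

```python
import math

RECORD_RANGE_M = 30.0  # illustrative value of the "first preset range"

class SafeLandingRecorder:
    """Stores safe landing points detected once the drone's current position
    is within RECORD_RANGE_M of the return target position (hypothetical API)."""

    def __init__(self, return_target):
        self.return_target = return_target
        self.recorded = []  # safe landing points, in detection order

    def update(self, position, is_safe):
        """Called each control tick with the current position and the
        safe-landing-point detection result for that position."""
        dx = position[0] - self.return_target[0]
        dy = position[1] - self.return_target[1]
        if math.hypot(dx, dy) <= RECORD_RANGE_M and is_safe:
            self.recorded.append(position)

    def latest(self):
        """Most recently recorded safe landing point, or None."""
        return self.recorded[-1] if self.recorded else None
```

Keeping every detected point (rather than only the latest) would also allow choosing the recorded point closest to the Home point.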
S104, landing is carried out according to the recorded safe landing points.
In one embodiment, when the drone reaches the return target position or a position near it, safe-landing-point detection can be performed on its current position. If the current position is detected to be a safe landing point, the drone lands directly, continuing the detection during descent. If during descent the position is detected to be unsafe, the drone climbs back to its pre-landing altitude and flies to a recorded safe landing point to land. If the position remains safe during descent, the drone keeps lowering its altitude until the relative height between the drone and the ground is detected to be zero, then shuts down the power system and settles on the ground.
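The landing decision at the Home point can be condensed into a small sketch; the action names returned here are assumptions made for illustration only.

```python
def landing_decision(current_is_safe, recorded_safe_point):
    """Decide the next landing action at (or near) the Home point: land in
    place if the spot is safe, otherwise divert to a recorded safe landing
    point; if none was recorded, a new search is needed (hypothetical names)."""
    if current_is_safe:
        return ("descend_here", None)
    if recorded_safe_point is not None:
        return ("fly_to_recorded_point", recorded_safe_point)
    return ("search_for_safe_point", None)
```

In a full controller this decision would be re-evaluated during descent, since the text requires aborting the descent if the spot turns out to be unsafe.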
In yet another embodiment, if the drone's current position is detected not to be a safe landing point, it lands at a safe landing point recorded within the first preset range of the return target position. Since the drone records the position information of each safe landing point, when flying to a recorded safe landing point it can determine a second return path from its current position and the recorded position information; the second return path is the path from the current position to the recorded safe landing point. In some cases the second return direction, corresponding to the second return path toward the recorded safe landing point, is opposite to the first return direction of the first return path, so the drone can directly use the direction opposite to the first return direction as the second return direction.
In the embodiments of the present invention, when the unmanned aerial vehicle loses its navigation signal, the return target position and first return path are determined, and the vehicle is controlled to return accordingly. When its current position is detected to be within the first preset range of the return target position, safe landing points are detected and recorded, so that the unmanned aerial vehicle can land based on a recorded safe landing point. This provides a reliable guarantee for a safe landing, improves the efficiency of determining a safe landing point, and effectively saves the processing resources of the unmanned aerial vehicle.
The following describes how the drone performs safe-landing-point detection on its current position in an embodiment of the present invention; as shown in fig. 2, the method includes the following steps:
s201, when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle.
S202, controlling the unmanned aerial vehicle to return based on the first return route and the return target position.
In an embodiment, the specific implementation of step S201 and step S202 may be referred to the description of step S101 and step S102, which is not repeated herein.
S203, when the current position of the drone is within the first preset range of the return target position during the return, perform plane detection on the current position according to a preset plane-detection algorithm, and perform water-surface detection on the current position according to a preset water-surface-detection algorithm.
S204, recording a safe landing point for the unmanned aerial vehicle to land.
In step S203 and step S204, the drone may perform plane detection and/or water-surface detection on its current position based on the binocular vision sensor. When performing plane detection according to the preset plane-detection algorithm, a region to be detected is first determined within the observation region corresponding to the current position. In one embodiment, the observation region of the drone's binocular vision sensor (the binocular vision module) is large, as shown by region 301 in fig. 3 (the region enclosed by P1-3D, P2-3D, P3-3D, and P4-3D), which is assumed to be the observation region of the binocular vision sensor. For plane detection at the current position, it is only necessary to determine whether the drone's maximum circumscribed circle in physical space corresponds to flat ground; it is not necessary to check whether everything the binocular vision sensor can observe is flat.
If the selected region to be detected is too large, the plane-detection result is distorted: even if the maximum circumscribed circle at the current position is flat ground, the enlarged region may include uneven ground, so the drone would conclude that the current position is not a plane, producing a safe-landing-point detection error. If the region is too small, the result is also distorted: if the ground at the current position is uneven but the small selected region happens to be flat, the drone would treat the uneven position as flat and descend onto uneven ground, creating a safety fault.
Therefore, when performing plane detection, a region to be detected 302 of suitable size must be selected from the observation region 301. In one embodiment, the selected region 302 may be a square of 2 m × 2 m in physical space, and whether the current position is a plane is determined from the plane-detection result on it. In one embodiment the observation region 301 may also be called the detection range of the drone, and the region to be detected 302 the drone's region of interest (ROI). The size of the region 302 is not limited to 2 m × 2 m; in some cases it may be determined from the size of the drone, for example taking the size of the drone's ground projection in flight as the region size, or enlarging that projection by 50%.
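The two sizing options mentioned (a fixed 2 m square, or the drone's ground-projection footprint enlarged by 50%) could be computed as follows; the function names are illustrative.

```python
def roi_side_fixed():
    """Fixed region-of-interest side length: a 2 m x 2 m square."""
    return 2.0

def roi_side_from_footprint(footprint_side_m, enlarge=0.5):
    """Region-of-interest side length taken from the drone's ground-projection
    footprint, enlarged by the given fraction (50% in the text)."""
    return footprint_side_m * (1.0 + enlarge)
```

For example, a drone with a 1.2 m footprint would use a 1.8 m ROI side under the second option.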
After the region to be detected for plane detection is determined from the observation region of the unmanned aerial vehicle, the two-dimensional projection image corresponding to the region to be detected is determined. Specifically, the observation region 301 and the region to be detected 302 in the three-dimensional space shown in fig. 3 may be mapped by a back-projection rule to an observation region image and a region-to-be-detected image in the two-dimensional image: the observation region 301 in three-dimensional space corresponds to the observation region image 401 in the two-dimensional image (i.e. the area enclosed by P1-2D, P2-2D, P3-2D and P4-2D), and the region to be detected 302 corresponds to the region-to-be-detected image 402. In one embodiment, the back-projection rule may specifically be:
s·P_2d = K·(R_ci·P_3d + T_ci)   (Equation 2.1)
where s is a scale factor, P_3d is a three-dimensional space point in the region to be detected in three-dimensional space, P_2d is the two-dimensional image point corresponding to P_3d, R_ci and T_ci are the camera extrinsic parameters, and K is the camera intrinsic matrix. The region to be detected can thus be back-projected according to Equation 2.1 on the basis of the camera extrinsics (R, T) and intrinsics K, so as to determine the two-dimensional projection image 402 corresponding to the region to be detected, the points in the two-dimensional projection image 402 being the points to be observed.
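A minimal sketch of the back-projection rule of Equation 2.1 is given below; the intrinsic matrix K and the identity extrinsics are made-up illustrative values, not parameters of any particular camera.

```python
# Sketch of the back-projection rule of Equation 2.1,
#   s * P_2d = K * (R_ci * P_3d + T_ci),
# mapping a 3-D point in the ground frame into pixel coordinates.
import numpy as np

def project_point(P_3d, K, R_ci, T_ci):
    """Project a 3-D point into the image; returns (u, v) pixel coords."""
    p_cam = R_ci @ P_3d + T_ci   # ground frame -> camera frame
    uvw = K @ p_cam              # apply intrinsics: s * [u, v, 1]
    return uvw[:2] / uvw[2]      # divide out the scale factor s

K = np.array([[500.0, 0.0, 320.0],   # fx, skew, cx (hypothetical values)
              [0.0, 500.0, 240.0],   # fy, cy
              [0.0, 0.0, 1.0]])
R_ci = np.eye(3)                     # camera aligned with the ground frame
T_ci = np.zeros(3)

# A point 2 m below the camera, 0.5 m off the optical axis:
uv = project_point(np.array([0.5, 0.0, 2.0]), K, R_ci, T_ci)
```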
After the two-dimensional projection image corresponding to the region to be detected is determined, the points in the two-dimensional projection image are converted into three-dimensional space points in the ground coordinate system; these three-dimensional space points are the real observation points, and the height of a protruding point represents the height of an object above the plane in real space. For example, if the region to be detected below the unmanned aerial vehicle is a plane, the resulting point cloud is almost entirely concentrated on the same plane; if the region to be detected contains a tree, the shape of the point cloud includes a bulge resembling the treetop.
In one embodiment, a binocular vision module may be used to convert points in the two-dimensional projection image into three-dimensional space points under a ground coordinate system, that is, a binocular vision depth map may be used to convert a physical scene of a region to be detected under the aircraft into a three-dimensional space point set, so that plane detection may be performed on a current position according to the three-dimensional space point set.
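The conversion of an image point plus binocular depth back into a three-dimensional space point can be sketched as the inverse of Equation 2.1; the intrinsics below are the same hypothetical values as above, and the camera is assumed axis-aligned with the ground frame.

```python
# Hedged sketch of lifting a pixel plus its stereo depth back to a 3-D point
# (the inverse of Equation 2.1). Intrinsics/extrinsics are illustrative.
import numpy as np

def pixel_to_3d(u, v, depth, K, R_ci, T_ci):
    """Lift pixel (u, v) with metric depth (camera-frame z) back to a
    3-D point in the ground coordinate system."""
    p_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))  # camera ray * depth
    return np.linalg.inv(R_ci) @ (p_cam - T_ci)                 # camera -> ground

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
P_3d = pixel_to_3d(445.0, 240.0, 2.0, K, np.eye(3), np.zeros(3))
```

Applying this to every pixel of the region-to-be-detected image yields the three-dimensional space point set used for plane detection.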
When the plane detection is performed on the current position according to the three-dimensional space point set, two embodiments are specifically provided:
(1) A fitting plane closest to the three-dimensional space point set is fitted using a plane equation, and the percentage of points in the set that are inliers of the fitting plane is evaluated; if the inlier percentage meets a preset threshold, the fitting plane is determined to be a plane, and the degree of inclination of the fitting plane can be determined from its normal vector. The plane equation is:
ax + by + cz + d = 0   (Equation 2.2)
In one embodiment, fitting the plane equation parameters of Equation 2.2 is similar to solving a linear system Ax = 0: a large number of points [x, y, z] in the three-dimensional space point set, together with the parameters a, b, c and d to be solved, form the overdetermined system of Equation 2.3 (one equation a·x_i + b·y_i + c·z_i + d = 0 per point), so the plane equation can be solved using a random sample consensus (RANSAC) algorithm.
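A minimal RANSAC plane fit in the spirit of Equations 2.2 and 2.3 might look as follows; the 0.05 m inlier threshold, the iteration count and the fixed random seed are assumptions, not values from the text.

```python
# Illustrative RANSAC fit of the plane ax + by + cz + d = 0 over a 3-D
# point set; thresholds and iteration count are made-up tuning values.
import random

def fit_plane_3pts(p1, p2, p3):
    """Plane (a, b, c, d) through three points via a cross product."""
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    a, b, c = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    d = -(a * p1[0] + b * p1[1] + c * p1[2])
    return a, b, c, d

def ransac_plane(points, dist_thresh=0.05, iters=200, seed=0):
    """Return the best-fitting plane and its inlier fraction."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, -1
    for _ in range(iters):
        a, b, c, d = fit_plane_3pts(*rng.sample(points, 3))
        norm = (a * a + b * b + c * c) ** 0.5
        if norm < 1e-9:
            continue  # degenerate (collinear) sample, try again
        inliers = sum(1 for x, y, z in points
                      if abs(a * x + b * y + c * z + d) / norm <= dist_thresh)
        if inliers > best_inliers:
            best_plane, best_inliers = (a, b, c, d), inliers
    return best_plane, best_inliers / len(points)
```

The inlier fraction returned here is the "inlier percentage" compared against the preset threshold, and the normal vector (a, b, c) of the returned plane gives its inclination.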
(2) Since plane detection is performed on the current position, i.e. it needs to be determined whether the region to be detected corresponding to the current position is a safe plane, and a plane determined to be safe is close to horizontal, the plane to be fitted can be forced to be a horizontal plane according to the formula cz + d = 0. The inlier percentage of the three-dimensional space point set relative to this forced horizontal plane can then be determined, and whether the current position is flat can be determined from that inlier percentage.
In a second embodiment, the unmanned aerial vehicle may first acquire a standard plane equation, calculate a distance between any one spatial three-dimensional point in the three-dimensional spatial point set and the standard plane equation, determine the number of internal points in the three-dimensional spatial point set according to the distance, where the internal points are three-dimensional spatial points whose distance is less than or equal to a preset distance threshold, and determine the current position as a plane when the number of internal points is greater than or equal to a preset number threshold.
When the second method is adopted to determine whether the current position is flat, since the forced horizontal plane equation is cz + d = 0, the parameters a = b = 0 can be set in the overdetermined system of Equation 2.3 before solving it, which makes the parameter solving process simpler. In one embodiment, the plane solved from the overdetermined system is the plane on which the point cloud of the three-dimensional space point set gathers: if the current position is flat, the plane on which the point cloud gathers can be fitted as a horizontal plane; if the current position is uneven, it cannot, i.e. no parameters can be solved that satisfy the overdetermined system of Equation 2.3.
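The forced-horizontal-plane variant (a = b = 0, c fixed to 1, so the fitted plane is z = −d) reduces to comparing each point's height against the least-squares plane height; a sketch, with an assumed 0.05 m threshold and 80% inlier requirement:

```python
# Sketch of the "forced horizontal plane" check (a = b = 0, so cz + d = 0):
# with c fixed to 1, the least-squares fit gives z_plane = mean(z), and a
# point is an inlier when |z - z_plane| is within the distance threshold.
# The 0.05 m threshold and the 80% inlier requirement are assumptions.

def horizontal_plane_check(points, dist_thresh=0.05, inlier_ratio=0.8):
    """Return (is_flat, fitted plane height) for a 3-D point set."""
    zs = [z for _, _, z in points]
    z_plane = sum(zs) / len(zs)  # least-squares solution of z + d = 0
    inliers = sum(1 for z in zs if abs(z - z_plane) <= dist_thresh)
    return inliers / len(points) >= inlier_ratio, z_plane
```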
After the overdetermined system of Equation 2.3 is solved to obtain a plane equation, each spatial point in the three-dimensional space point set can be classified as an inlier or an outlier according to its distance from the fitted plane. In one embodiment, any spatial point in the set is selected as an observation point and its distance from the plane is calculated; when the distance is less than or equal to a preset distance threshold, the observation point is classified as an inlier, and when the distance is greater than the preset distance threshold, the observation point is classified as an outlier. If the observation point coordinates are (x, y, z) and the fitted plane parameters are a, b, c and d, the distance between the observation point and the plane can be calculated according to Equation 2.4:

dist = |ax + by + cz + d| / √(a² + b² + c²)   (Equation 2.4)
It should be noted that if the unmanned aerial vehicle flies in the flight attitude shown in fig. 5 and performs safe landing point detection during such flight, there is an inclination angle between the unmanned aerial vehicle's attitude and the horizontal plane. When plane detection is performed on the current position using the above method, the flight height of the unmanned aerial vehicle therefore needs to be compensated according to the flight attitude, the current position, the inclination angle and the flight height H, and plane detection is performed on the current position of the unmanned aerial vehicle according to the compensated flight height.
In still another embodiment, when water surface detection is performed on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm, the region 302 to be detected for water surface detection may be determined from the observation region 301 corresponding to the current position according to the binocular vision sensor of the unmanned aerial vehicle, and the two-dimensional projection image corresponding to the region 302 (such as region 402 in fig. 4) may be determined according to the back-projection rule of the camera. The machine learning algorithm for water surface detection is based on a convolutional neural network (CNN) model trained on grayscale images: the two-dimensional projection image corresponding to the region to be detected is input into the CNN model, which detects whether the region corresponding to each frame of the image contains a water surface, and the detection result is taken as output in order to determine whether the current position is a water surface.
In an embodiment, whether the current position is a safe landing point is determined according to the above plane detection result and water surface detection result, and the determined safe landing point is recorded; when recording the safe landing point, its position relative to the return target position may be recorded. A position is determined to be a safe landing point when the current position is a plane and is not a water surface.
In one embodiment, other algorithms may be used for the planar detection and water surface detection of the current position, such as conventional vision-based detection methods.
S205, when the current position of the unmanned aerial vehicle is in a second preset range from the return target position in the return process, safety landing point detection is carried out.
And S206, if the detection result is that the safety landing point is not detected, executing the landing step according to the recorded safety landing point.
And S207, if the detection result is that the safe falling point is detected, falling according to the detected safe falling point.
In steps S205 to S207, when the unmanned aerial vehicle is within a first preset range from the return target position, a safe landing point may be detected and recorded for the unmanned aerial vehicle to land on. When the unmanned aerial vehicle continues to fly to within a second preset range from the return target position, the current position is in the vicinity of the return target position. The first preset range is greater than the second preset range; the first preset range may be, for example, 20 meters or 10 meters, and the second preset range may be, for example, 5 meters or 10 meters.
And when the current position of the unmanned aerial vehicle is in a second preset range from the return target position, if the detection result after the safe landing point detection is that the safe landing point is not detected, executing step S206 to carry out landing according to the recorded safe landing point.
When landing is performed according to the recorded safe landing point, a second return path can be determined according to the current position of the unmanned aerial vehicle and the recorded safe landing point; the second return path points from the current position (i.e. a position near the return target position) toward the recorded safe landing point. The unmanned aerial vehicle can then be controlled to fly to the recorded safe landing point based on the second return path, and when the binocular vision sensor determines that the current position of the unmanned aerial vehicle is the recorded safe landing point, the unmanned aerial vehicle is controlled to land.
In one embodiment, if the detection result is that a safe landing point is detected, step S207 is performed to land according to the detected safe landing point. Specifically, whether the detected safe landing point remains safe needs to be continuously checked during the landing process of the unmanned aerial vehicle: because a position within the second preset range of the return target position is usually a position where a user changes a battery, a roadside, or the like, a person may well appear there between the moment the current position is detected as a safe landing point and the moment landing is performed, and if the safe landing point were not continuously re-detected during landing, the unmanned aerial vehicle could land in an unsafe position and cause a safety fault.
When the unmanned aerial vehicle continuously detects that the detected safe landing point is safe, successfully landing to the detected safe landing point; or if the safe landing point is detected to be unsafe in the landing process, adjusting the current flight height of the unmanned aerial vehicle to be a preset flight height, wherein the preset flight height is the flight height of the unmanned aerial vehicle during the return flight (i.e. the height before landing is performed), and executing the step of landing according to the recorded safe landing point in the step S206.
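The retry logic above can be summarized in a small sketch; the function name and the checks-as-booleans abstraction are illustrative, and the 2 m preset altitude is one of the example values from the text.

```python
# Minimal sketch of the landing retry logic of steps S205-S207; the function
# name and the booleans-from-continuous-detection abstraction are hypothetical.
PRESET_ALTITUDE_M = 2.0  # example return-flight altitude from the text

def attempt_landing(still_safe_checks):
    """Consume continuous safety checks during descent; abort on the first
    failure (climb back to PRESET_ALTITUDE_M and fall back to a recorded
    safe landing point), otherwise report a successful landing."""
    for ok in still_safe_checks:
        if not ok:
            return "retry_recorded_point"  # step S206 fallback
    return "landed"                        # step S207 success
```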
The preset flight height is set for ensuring that the binocular vision module below the unmanned aerial vehicle is in a higher precision range in the course of the return flight, and can be 2 meters or 2.5 meters, for example.
In still another embodiment, when the unmanned aerial vehicle is within the second preset range from the return target position during the return process, no safe landing point is detected, and it is determined that no safe landing point was recorded within the first preset range, safe landing point detection can be performed along a preset trajectory within a third preset range from the return target position. The preset trajectory comprises one or more of a spiral trajectory, a zigzag trajectory or a linear trajectory; the zigzag trajectory may be a Z-shaped trajectory. When a safe landing point is detected along the preset trajectory, landing is performed according to the detected safe landing point.
For example, suppose the unmanned aerial vehicle returns from point B (the return start position) to point A (the return target position) as shown in fig. 6. When it reaches point D near the return target position (i.e. its current position is within the second preset range from the return target position), safe landing point detection is performed on the current position (point D). If point D is determined to be an unsafe landing point and the unmanned aerial vehicle has no safe landing point recorded within the first preset range, a safe landing point may be searched for starting from point D along a preset trajectory, for example the gray spiral trajectory shown in the figure; if a safe landing point is detected at point K of the spiral trajectory, landing is performed at point K.
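Waypoints for such a spiral search could be generated, for example, with an Archimedean spiral; the radius and angle steps below are made-up tuning values, not parameters from the text.

```python
# Illustrative generator of spiral search waypoints around point D for the
# "search along a preset trajectory" step; step sizes are assumptions.
import math

def spiral_waypoints(cx, cy, n=20, radius_step=0.5, angle_step=math.pi / 4):
    """Archimedean spiral of n waypoints centered on (cx, cy), in meters;
    the radius grows by radius_step per full turn."""
    pts = []
    for i in range(1, n + 1):
        theta = i * angle_step
        r = radius_step * theta / (2 * math.pi)
        pts.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return pts
```

The unmanned aerial vehicle would fly waypoint to waypoint, running safe landing point detection at each, and land at the first waypoint found to be safe (point K in the example).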
In still another embodiment, when the unmanned aerial vehicle does not detect the safe landing point in the second preset range from the return target position during the return process, and it is determined that the safe landing point is not recorded in the first preset range from the return target position, the unmanned aerial vehicle can be controlled to hover at the return target position to wait for an operation instruction of a user, and when the unmanned aerial vehicle has a low-power alarm, the unmanned aerial vehicle can be forced to execute landing.
In still another embodiment, when the current position of the unmanned aerial vehicle coincides with the return target position, i.e. when the unmanned aerial vehicle has completed the visual return to the return target position, safe landing point detection may be performed on the return target position. If the return target position is detected to be a safe landing point, landing is performed at the return target position; if not, the step of landing according to the recorded safe landing point in step S206 is performed.
In the embodiment of the invention, if the unmanned aerial vehicle loses the navigation signal, the unmanned aerial vehicle can be controlled to return based on the first return path and the return target position once these are determined. When the current position of the unmanned aerial vehicle is within the first preset range from the return target position, plane detection and water surface detection are performed on the current position according to a preset plane detection algorithm and a preset water surface detection algorithm, providing a safety guarantee for the return and landing process and effectively reducing the possibility of a safety fault after landing; positions determined to be planar and/or not water are recorded as safe landing points. Safe landing point detection is then performed when the unmanned aerial vehicle returns to within the second preset range from the return target position: if no safe landing point is detected there, landing is attempted according to the recorded safe landing points, and if a safe landing point is detected, landing is performed directly, improving the landing speed of the unmanned aerial vehicle.
In the embodiment of the invention, an application scene diagram of a safe landing method based on an unmanned aerial vehicle is provided, as shown in fig. 7, the unmanned aerial vehicle takes off from a point A and flies to a point B according to navigation signals provided by GNSS, a gray curve is used for marking the flight track of the unmanned aerial vehicle from the point A to the point B in fig. 7, and in the process of flying the unmanned aerial vehicle from the point A to the point B, the position information of the unmanned aerial vehicle is refreshed and recorded in real time according to reliable coordinate information provided by GNSS.
If the unmanned aerial vehicle loses the navigation signal due to a GNSS fault at point B, refer to the flow chart of the safe landing process shown in fig. 8. Upon determining that the navigation signal is lost, the unmanned aerial vehicle may trigger the binocular vision module below it and perform visual return according to position information provided by the visual odometer. During the visual return, the unmanned aerial vehicle may select any position from at least one preset return position as the return target position; assume the selected return target position is the departure position of the unmanned aerial vehicle, i.e. the position marked by point A in fig. 7. The direction toward the return target position recorded by the unmanned aerial vehicle before the loss of the navigation signal is taken as the first return direction (the direction indicated by the black curve in fig. 7), and the first return path (i.e. the path indicated by the black curve in fig. 7) can then be determined according to the first return direction and the return target position, where the first return direction points from the return start position of the unmanned aerial vehicle toward the return target position, so as to control the unmanned aerial vehicle to return from the return start position B toward the return target position A.
When the visual odometer determines that the current position of the unmanned aerial vehicle has reached the first preset range from the return target position, i.e. when the unmanned aerial vehicle is at point C shown in fig. 7, the unmanned aerial vehicle can be controlled to start safe landing point detection and to record the detected safe landing points, so that it does not need to repeatedly search for safe landing points later. In one embodiment, the detected and recorded safe landing points are assumed to be points a, b and c marked with stars in fig. 7.
Due to the return error introduced by visual return, the unmanned aerial vehicle may begin a safe landing attempt, following the direction of the first return path, from point D, which is within the second preset range from the return target position; alternatively, the unmanned aerial vehicle may return correctly to point A and make a safe landing attempt there. If the safe landing attempt at point A or point D fails, the recorded safe landing points (namely points a, b and c) are searched, the recorded safe landing point closest to point A or point D is determined to be point a, the flight height of the aircraft is adjusted to the preset flight height, and the first return path is adjusted to a second return path, so that the aircraft flies from point D to point a as indicated by the second return path and lands safely there.
If the aircraft finds no recorded safe landing point when returning to the return target position and multiple landing attempts fail, or no safe landing point is detected within the second preset range, the aircraft can hover in place and wait for the user to take over, executing a forced landing when the battery level becomes critically low. It can be appreciated that the position at which the unmanned aerial vehicle hovers as the return target position may be the actual return target position, i.e. point A, or may be point D due to the visual return error.
An embodiment of the present invention provides a safe landing apparatus applied to an unmanned aerial vehicle. Fig. 9 is a structural diagram of the safe landing apparatus; as shown in fig. 9, the safe landing apparatus 900 applied to the unmanned aerial vehicle includes a memory 901 and a processor 902, wherein the memory 901 stores program codes, the processor 902 calls the program codes in the memory, and when the program codes are executed, the processor 902 performs the following operations:
when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to return based on the first return route and the return target position;
When the current position of the unmanned aerial vehicle is in a first preset range from the return target position in the return process, detecting a safe landing point, and recording the safe landing point for the unmanned aerial vehicle to land;
landing is performed according to the recorded safe landing points.
In one embodiment, the processor 902 is further configured to perform the following operations:
when the current position of the unmanned aerial vehicle is in a second preset range from the return target position in the return process, detecting a safe landing point, wherein the first preset range is larger than the second preset range;
if the detection result is that the safe landing point is not detected, executing the landing step according to the recorded safe landing point;
and if the detection result is that the safe falling point is detected, falling according to the detected safe falling point.
In one embodiment, the processor 902 performs the following operations when landing according to the recorded safe landing point:
determining a second return path according to the current position of the unmanned aerial vehicle and the recorded safe landing point;
flying to the recorded safe landing point based on the second return path;
And when the current position of the unmanned aerial vehicle is the recorded safe landing point, controlling the unmanned aerial vehicle to land.
In one embodiment, the processor 902 performs the following operations when landing according to the detected safe landing point:
continuously detecting whether the detected safe falling point is safe or not in the falling process;
if the detected safe falling points are continuously detected to be safe, determining that the safe falling points are successfully fallen to the detected safe falling points;
if the detected safe landing point is not safe, adjusting the current flight height of the unmanned aerial vehicle to be a preset flight height, and executing the landing step according to the recorded safe landing point;
the preset flight height is the flight height of the unmanned aerial vehicle during the return voyage.
In one embodiment, the processor 902 is further configured to perform the following operations:
if no safe landing point is recorded in a first preset range from the return target position and no safe landing point is detected in a second preset range from the return target position, detecting the safe landing point according to a preset track in a third preset range from the return target position, wherein the preset track comprises a spiral track or a broken line track;
And when the safe falling point is detected in the preset track, falling according to the detected safe falling point.
In one embodiment, the processor 902 is further configured to perform the following operations:
and if the safe landing point is not recorded in the first preset range from the return target position and the safe landing point is not detected in the second preset range from the return target position, controlling the unmanned aerial vehicle to hover at the return target position.
In one embodiment, the processor 902 is further configured to perform the following operations:
when the unmanned aerial vehicle reaches the return target position, detecting a safe landing point of the return target position;
if the detection result is that the return target position is a safe landing point, landing is executed at the return target position;
and if the detection result is that the return target position is not the safe landing point, executing the landing step according to the recorded safe landing point.
In one embodiment, the processor 902, when determining the return target location and the first return path of the drone, performs the following operations:
selecting any position from at least one preset return position as a return target position of the unmanned aerial vehicle;
Determining a first return direction according to the return target position;
and determining the first return path according to the first return direction and the return target position.
In one embodiment, the processor 902, when controlling the unmanned aerial vehicle to return based on the first return path and the return target position, performs the following operations:
and controlling the unmanned aerial vehicle to return based on the first return route, the return target position and position information provided by a visual odometer.
In one embodiment, the second return direction corresponding to the second return path is opposite to the first return direction corresponding to the first return path.
In one embodiment, the safety drop point is a planar position and a non-water surface position.
In one embodiment, the processor 902 performs the following operations when performing the safe drop point detection:
and safety landing point detection is carried out based on a binocular vision sensor.
In one embodiment, the processor 902 performs the following operations when performing the safe drop point detection:
performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, and performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm.
In one embodiment, the navigation signal includes at least one of: signals of the positioning sensor and signals of the compass.
In one embodiment, the processor 902 performs the following operations when performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm:
determining a region to be detected for plane detection from an observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
determining a two-dimensional projection image corresponding to the region to be detected;
converting any pixel point in the two-dimensional projection image into a three-dimensional space point to obtain a three-dimensional space point set corresponding to the two-dimensional projection image;
and carrying out plane detection on the current position according to the three-dimensional space point set.
In one embodiment, the processor 902 performs the following operations when performing plane detection on the current position according to the three-dimensional space point set:
obtaining a standard plane equation;
calculating the distance between any three-dimensional space point in the three-dimensional space point set and the standard plane equation, and determining the number of internal points in the three-dimensional space point set according to the distance, wherein the internal points are three-dimensional space points with the distance smaller than or equal to a preset distance threshold value;
And when the number of the inner points is larger than or equal to a preset number threshold, determining the current position as a plane.
In one embodiment, the processor 902 performs the following operations when performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm:
determining a region to be detected for water surface detection from an observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
determining a two-dimensional projection image corresponding to the region to be detected;
and inputting the two-dimensional projection image into a convolutional neural network model, and determining whether the current position is a water surface according to the output of the convolutional neural network model.
The safety landing device applied to the unmanned aerial vehicle provided in the embodiment can execute the safety landing method shown in fig. 1 and fig. 2 provided in the foregoing embodiment, and the execution mode and the beneficial effects are similar, and are not repeated here.
The embodiment of the invention provides an unmanned aerial vehicle, which comprises a body, a power system and the above safe landing apparatus. The operation of the safe landing apparatus is the same as or similar to that described above and will not be repeated here. The power system of the unmanned aerial vehicle may comprise rotors, motors for driving the rotors, and their electronic speed controllers. The unmanned aerial vehicle may be a quad-rotor, hexa-rotor, octo-rotor or other multi-rotor unmanned aerial vehicle, in which case it takes off and lands vertically. It will be appreciated that the unmanned aerial vehicle may also be a fixed-wing or hybrid-wing unmanned aerial vehicle.
The unmanned aerial vehicle provided by the embodiment of the invention may further comprise sensors arranged on the body. The sensors comprise a GNSS module for providing position information for the unmanned aerial vehicle, and further comprise at least one of a binocular vision sensor or a visual odometer. In some embodiments, a binocular vision sensor may be disposed below the unmanned aerial vehicle for acquiring images and generating depth maps, semantic maps or other information for safe landing point detection. In some embodiments, a visual odometer may be provided on the front side of the unmanned aerial vehicle so as to provide the unmanned aerial vehicle with odometry information when no GNSS signal is available or the GNSS module malfunctions.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (35)

1. A method of safe landing of an unmanned aerial vehicle, comprising:
when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to return based on the first return route and the return target position;
when the current position of the unmanned aerial vehicle is in a first preset range from the return target position in the return process, detecting a first safe landing point, and recording the first safe landing point for the unmanned aerial vehicle to land;
when the current position of the unmanned aerial vehicle is in a second preset range from the return target position in the return process, detecting a second safe landing point, wherein the first preset range is larger than the second preset range;
if the detection result indicates that the second safe landing point is not detected, landing is carried out according to the recorded first safe landing point;
and if the detection result indicates that the second safe landing point is detected, landing according to the detected second safe landing point.
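The two-range fallback logic of claim 1 can be sketched in Python as follows. This is a hypothetical illustration, not the patented implementation; all names, the distance comparison, and the `"search"` fallback (cf. claims 4 and 5) are assumptions introduced for clarity:

```python
def choose_landing_action(dist_to_target, first_range, second_range,
                          recorded_first_point, detected_second_point):
    """Sketch of the claim-1 fallback: within the wider first range the
    drone records any safe landing point it sees; within the narrower
    second range it prefers a freshly detected point and falls back to
    the recorded one."""
    if dist_to_target <= second_range:
        if detected_second_point is not None:
            return ("land", detected_second_point)    # second point detected
        if recorded_first_point is not None:
            return ("land", recorded_first_point)     # fall back to record
        return ("search", None)                       # cf. claims 4 and 5
    if dist_to_target <= first_range:
        return ("record_and_continue", None)          # keep returning home
    return ("continue", None)
```

The first preset range being larger than the second is what makes the recorded first point available as a fallback by the time the second-range detection runs.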
2. The method of claim 1, wherein the landing according to the recorded first safe landing point comprises:
Flying towards the recorded first safe landing point based on a second return path, wherein the second return path is determined according to the current position of the unmanned aerial vehicle and the recorded safe landing point; or,
and when the current position of the unmanned aerial vehicle is the recorded first safe landing point, controlling the unmanned aerial vehicle to land.
3. The method of claim 1, wherein the landing based on the detected second safe landing point comprises:
continuously detecting, during the landing process, whether the detected second safe landing point remains safe;
if the detected second safe landing point remains safe throughout, landing successfully at the detected second safe landing point;
if the detected second safe landing point becomes unsafe, adjusting the current flight height of the unmanned aerial vehicle to a preset flight height, and executing the step of landing according to the recorded first safe landing point;
wherein the preset flight height is the flight height of the unmanned aerial vehicle during the return voyage.
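The continuous re-checking of claim 3 can be sketched as a simple monitor loop. This is a hypothetical helper, not the patented implementation; `safety_checks` stands in for whatever repeated detection the drone performs while descending:

```python
def descend_with_monitoring(safety_checks, return_altitude):
    """Sketch of claim 3: the second safe landing point is re-checked
    during descent. On the first unsafe reading the drone climbs back to
    the return-flight altitude (after which it would fall back to the
    recorded first safe landing point)."""
    for safe in safety_checks:
        if not safe:
            return ("climb_to", return_altitude)
    return ("landed", None)
```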
4. The method according to claim 1, wherein the method further comprises:
If the first safe landing point is not recorded in a first preset range from the return target position and the second safe landing point is not detected in a second preset range from the return target position, carrying out safe landing point detection according to a preset track in a third preset range from the return target position, wherein the preset track comprises a spiral track or a polyline track;
and when a third safe landing point is detected in the preset track, landing is carried out according to the detected third safe landing point.
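The spiral search track of claim 4 can be sketched as a sequence of waypoints around the return target position, along which safe-landing-point detection would be run. An Archimedean spiral is one assumed concrete choice; the claim does not fix the spiral's shape, and the `step` and `pts_per_turn` parameters are illustrative:

```python
import math

def spiral_waypoints(center_x, center_y, step=1.0, turns=3, pts_per_turn=12):
    """Generate waypoints on an Archimedean spiral around the return
    target: the radius grows by `step` per full turn."""
    waypoints = []
    n = turns * pts_per_turn
    for i in range(1, n + 1):
        theta = 2 * math.pi * i / pts_per_turn   # angle grows linearly
        r = step * theta / (2 * math.pi)         # radius grows with angle
        waypoints.append((center_x + r * math.cos(theta),
                          center_y + r * math.sin(theta)))
    return waypoints
```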
5. The method according to claim 1, wherein the method further comprises:
and if the first safe landing point is not recorded in a first preset range from the return target position and the second safe landing point is not detected in a second preset range from the return target position, controlling the unmanned aerial vehicle to hover at the return target position.
6. The method according to claim 1, wherein the method further comprises:
when the unmanned aerial vehicle reaches the return target position, detecting the second safe landing point of the return target position;
If the detection result is that the return target position is the second safe landing point, landing is executed at the return target position;
and if the detection result is that the return target position is not the second safe landing point, executing the step of landing according to the recorded first safe landing point.
7. The method of claim 1, wherein the determining the return target location and the first return path of the drone includes:
selecting any position from at least one preset return position as a return target position of the unmanned aerial vehicle;
determining a first return direction according to the return target position;
and determining the first return path according to the first return direction and the return target position.
8. The method of claim 1, wherein the controlling the drone to return based on the first return path and the return target location comprises:
and controlling the unmanned aerial vehicle to return based on the first return route, the return target position and position information provided by a visual odometer.
9. The method of claim 2, wherein the second return direction corresponding to the second return path is opposite to the first return direction corresponding to the first return path.
10. The method according to any one of claims 1-9, wherein the first safety drop point and/or the second safety drop point is a planar and non-water surface location.
11. The method according to any one of claims 1-9, wherein said performing a first safety drop point and/or a second safety drop point detection comprises:
and safety landing point detection is carried out based on a binocular vision sensor.
12. The method according to any one of claims 1-9, wherein said performing a first safety drop point and/or a second safety drop point detection comprises:
performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, and performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm.
13. The method according to any one of claims 1-9, wherein the navigation signal comprises at least one of: a signal of a positioning sensor or a signal of a compass.
14. The method of claim 12, wherein the performing the plane detection on the current position of the unmanned aerial vehicle according to the preset plane detection algorithm comprises:
determining a region to be detected for plane detection from an observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
Determining a two-dimensional projection image corresponding to the region to be detected;
converting any pixel point in the two-dimensional projection image into a three-dimensional space point to obtain a three-dimensional space point set corresponding to the two-dimensional projection image;
and carrying out plane detection on the current position according to the three-dimensional space point set.
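The pixel-to-3D conversion step of claim 14 can be sketched under an assumed pinhole camera model. The claim does not specify the camera model; the intrinsics `fx`, `fy`, `cx`, `cy` and the per-pixel depth input are assumptions introduced for illustration:

```python
def backproject(pixels_with_depth, fx, fy, cx, cy):
    """Convert (u, v, depth) pixels of the two-dimensional projection
    image into 3-D points in the camera frame via the pinhole model."""
    points = []
    for u, v, z in pixels_with_depth:
        x = (u - cx) * z / fx    # horizontal offset scaled by depth
        y = (v - cy) * z / fy    # vertical offset scaled by depth
        points.append((x, y, z))
    return points
```

The resulting set of 3-D points is what the plane detection of claim 15 then consumes.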
15. The method of claim 14, wherein said performing planar detection of said current location from said set of three-dimensional spatial points comprises:
obtaining a standard plane equation;
calculating the distance between any three-dimensional space point in the three-dimensional space point set and the standard plane equation, and determining the number of internal points in the three-dimensional space point set according to the distance, wherein the internal points are three-dimensional space points with the distance smaller than or equal to a preset distance threshold value;
and when the number of the inner points is larger than or equal to a preset number threshold, determining the current position as a plane.
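The inlier test of claim 15 can be sketched as follows. This is a hypothetical illustration (a RANSAC-style consensus count), not the patented implementation; the plane is assumed to be given as coefficients (a, b, c, d) of a*x + b*y + c*z + d = 0:

```python
import math

def is_plane(points, plane, dist_thresh, count_thresh):
    """Count the inliers of claim 15: a 3-D point is an inlier when its
    distance to the plane is <= dist_thresh; the position is declared a
    plane when the inlier count reaches count_thresh."""
    a, b, c, d = plane
    norm = math.sqrt(a * a + b * b + c * c)   # normalize the plane normal
    inliers = sum(
        1 for x, y, z in points
        if abs(a * x + b * y + c * z + d) / norm <= dist_thresh
    )
    return inliers >= count_thresh
```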
16. The method of claim 12, wherein the performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm comprises:
determining a region to be detected for water surface detection from an observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
Determining a two-dimensional projection image corresponding to the region to be detected;
and inputting the two-dimensional projection image into a convolutional neural network model, and determining whether the current position is a water surface according to the output of the convolutional neural network model.
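The decision step of claim 16 can be sketched as below. The claim does not specify the network architecture or how its output is interpreted; here `cnn_model` stands in for the trained convolutional neural network as any callable mapping the projection image to a water probability, and the 0.5 threshold is an assumption:

```python
def is_water_surface(projection_image, cnn_model, threshold=0.5):
    """Feed the two-dimensional projection image to the model and decide
    whether the current position is a water surface from its output."""
    water_probability = cnn_model(projection_image)
    return water_probability >= threshold
```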
17. A safety landing device applied to an unmanned aerial vehicle, which is characterized by comprising a memory and a processor;
the memory is used for storing program codes;
the processor invokes the program code, which when executed, is operable to:
when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to return based on the first return route and the return target position;
when the current position of the unmanned aerial vehicle is in a first preset range from the return target position in the return process, detecting a first safe landing point, and recording the first safe landing point for the unmanned aerial vehicle to land;
when the current position of the unmanned aerial vehicle is in a second preset range from the return target position in the return process, detecting a second safe landing point, wherein the first preset range is larger than the second preset range;
If the detection result indicates that the second safe landing point is not detected, landing is carried out according to the recorded first safe landing point;
and if the detection result indicates that the second safe landing point is detected, landing according to the detected second safe landing point.
18. The apparatus of claim 17, wherein the landing according to the recorded first safe landing point performs the following operations:
flying towards the recorded first safe landing point based on a second return path, wherein the second return path is determined according to the current position of the unmanned aerial vehicle and the recorded safe landing point; or,
and when the current position of the unmanned aerial vehicle is the recorded first safe landing point, controlling the unmanned aerial vehicle to land.
19. The apparatus of claim 17, wherein the landing based on the detected second safe landing point performs the following:
continuously detecting, during the landing process, whether the detected second safe landing point remains safe;
if the detected second safe landing point remains safe throughout, landing successfully at the detected second safe landing point;
if the detected second safe landing point becomes unsafe, adjusting the current flight height of the unmanned aerial vehicle to a preset flight height, and executing the step of landing according to the recorded first safe landing point;
wherein the preset flight height is the flight height of the unmanned aerial vehicle during the return voyage.
20. The apparatus of claim 17, wherein the apparatus is further configured to:
if the first safe landing point is not recorded in a first preset range from the return target position and the second safe landing point is not detected in a second preset range from the return target position, carrying out safe landing point detection according to a preset track in a third preset range from the return target position, wherein the preset track comprises a spiral track or a polyline track;
and when a third safe landing point is detected in the preset track, landing is carried out according to the detected third safe landing point.
21. The apparatus of claim 17, wherein the apparatus is further configured to:
and if the first safe landing point is not recorded in a first preset range from the return target position and the second safe landing point is not detected in a second preset range from the return target position, controlling the unmanned aerial vehicle to hover at the return target position.
22. The apparatus of claim 17, wherein the apparatus is further configured to:
when the unmanned aerial vehicle reaches the return target position, detecting the second safe landing point of the return target position;
if the detection result is that the return target position is the second safe landing point, landing is executed at the return target position;
and if the detection result is that the return target position is not the second safe landing point, executing the step of landing according to the recorded first safe landing point.
23. The apparatus of claim 17, wherein the determining the return target position and the first return path of the unmanned aerial vehicle performs the following operations:
selecting any position from at least one preset return position as a return target position of the unmanned aerial vehicle;
determining a first return direction according to the return target position;
and determining the first return path according to the first return direction and the return target position.
24. The apparatus of claim 17, wherein the controlling the drone performs the following operations when making a return based on the first return path and the return target location:
And controlling the unmanned aerial vehicle to return based on the first return route, the return target position and position information provided by a visual odometer.
25. The apparatus of claim 18, wherein a second return direction corresponding to the second return path is opposite to a first return direction corresponding to the first return path.
26. The apparatus of any one of claims 17-25, wherein the first safety drop point and/or the second safety drop point are planar and non-water surface locations.
27. The apparatus according to any one of claims 17-25, wherein the first and/or second safety landing point detection is performed by:
and safety landing point detection is carried out based on a binocular vision sensor.
28. The apparatus according to any one of claims 17-25, wherein the first and/or second safety landing point detection is performed by:
performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, and performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm.
29. The apparatus of any one of claims 17-25, wherein the navigation signal comprises at least one of: a signal of a positioning sensor or a signal of a compass.
30. The apparatus of claim 28, wherein when performing the plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, performing the following operations:
determining a region to be detected for plane detection from an observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
determining a two-dimensional projection image corresponding to the region to be detected;
converting any pixel point in the two-dimensional projection image into a three-dimensional space point to obtain a three-dimensional space point set corresponding to the two-dimensional projection image;
and carrying out plane detection on the current position according to the three-dimensional space point set.
31. The apparatus of claim 30, wherein the performing planar detection on the current position according to the three-dimensional space point set performs the following operations:
obtaining a standard plane equation;
calculating the distance between any three-dimensional space point in the three-dimensional space point set and the standard plane equation, and determining the number of internal points in the three-dimensional space point set according to the distance, wherein the internal points are three-dimensional space points with the distance smaller than or equal to a preset distance threshold value;
And when the number of the inner points is larger than or equal to a preset number threshold, determining the current position as a plane.
32. The apparatus of claim 28, wherein when water surface detection is performed on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm, the following operations are performed:
determining a region to be detected for water surface detection from an observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
determining a two-dimensional projection image corresponding to the region to be detected;
and inputting the two-dimensional projection image into a convolutional neural network model, and determining whether the current position is a water surface according to the output of the convolutional neural network model.
33. An unmanned aerial vehicle, comprising:
a body;
the power system is arranged on the machine body and used for providing power for the unmanned aerial vehicle;
and the safety landing device as claimed in any one of claims 17-32.
34. The unmanned aerial vehicle of claim 33, wherein the unmanned aerial vehicle further comprises:
the sensor is installed on the machine body, and the sensor at least comprises one of the following: binocular vision sensors or vision odometers;
The binocular vision sensor is used for detecting a safe landing point;
the visual odometer is used for providing position information of the unmanned aerial vehicle during return voyage.
35. A computer storage medium, characterized in that the computer storage medium stores computer program instructions which, when executed by a processor, perform the unmanned aerial vehicle safe landing method according to any one of claims 1-16.
CN201880066282.1A 2018-11-28 2018-11-28 Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium Active CN111615677B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410267172.9A CN118092500A (en) 2018-11-28 2018-11-28 Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/117820 WO2020107248A1 (en) 2018-11-28 2018-11-28 Method and device for safe landing of unmanned aerial vehicle, unmanned aerial vehicle, and medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410267172.9A Division CN118092500A (en) 2018-11-28 2018-11-28 Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium

Publications (2)

Publication Number Publication Date
CN111615677A CN111615677A (en) 2020-09-01
CN111615677B true CN111615677B (en) 2024-04-12

Family

ID=70854244

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410267172.9A Pending CN118092500A (en) 2018-11-28 2018-11-28 Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium
CN201880066282.1A Active CN111615677B (en) 2018-11-28 2018-11-28 Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410267172.9A Pending CN118092500A (en) 2018-11-28 2018-11-28 Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium

Country Status (2)

Country Link
CN (2) CN118092500A (en)
WO (1) WO2020107248A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114578855B (en) * 2022-03-03 2022-09-20 北京新科汇智科技发展有限公司 Unmanned aerial vehicle standby landing method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104049641A (en) * 2014-05-29 2014-09-17 深圳市大疆创新科技有限公司 Automatic landing method and device and air vehicle
CN105867423A (en) * 2016-06-08 2016-08-17 杨珊珊 Course reversal method and course reversal system of unmanned aerial vehicle and unmanned aerial vehicle
CN106527481A (en) * 2016-12-06 2017-03-22 重庆零度智控智能科技有限公司 Unmanned aerial vehicle flight control method, device and unmanned aerial vehicle
CN106927059A (en) * 2017-04-01 2017-07-07 成都通甲优博科技有限责任公司 A kind of unmanned plane landing method and device based on monocular vision
CN107943090A (en) * 2017-12-25 2018-04-20 广州亿航智能技术有限公司 The landing method and system of a kind of unmanned plane
CN108474658A (en) * 2017-06-16 2018-08-31 深圳市大疆创新科技有限公司 Ground Morphology observation method and system, unmanned plane landing method and unmanned plane

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101329583B1 (en) * 2013-07-09 2013-11-14 주식회사 두레텍 Air observations using the rotor structure construction method and system for terrain data
CN104881039A (en) * 2015-05-12 2015-09-02 零度智控(北京)智能科技有限公司 Method and system for returning of unmanned plane
DE102015013104A1 (en) * 2015-08-22 2017-02-23 Dania Lieselotte Reuter Method for target approach control of unmanned aerial vehicles, in particular delivery docks
CN107291099A (en) * 2017-07-06 2017-10-24 杨顺伟 Unmanned plane makes a return voyage method and device
CN107479082A (en) * 2017-09-19 2017-12-15 广东容祺智能科技有限公司 A kind of unmanned plane makes a return voyage method without GPS
CN108124471B (en) * 2017-12-11 2021-03-16 深圳市道通智能航空技术有限公司 Unmanned aerial vehicle return flight method and device, storage medium and unmanned aerial vehicle

Also Published As

Publication number Publication date
CN118092500A (en) 2024-05-28
WO2020107248A1 (en) 2020-06-04
CN111615677A (en) 2020-09-01

Similar Documents

Publication Publication Date Title
US11237572B2 (en) Collision avoidance system, depth imaging system, vehicle, map generator and methods thereof
US20220234733A1 (en) Aerial Vehicle Smart Landing
US10029804B1 (en) On-board, computerized landing zone evaluation system for aircraft
EP3876070B1 (en) Method and device for planning path of unmanned aerial vehicle, and unmanned aerial vehicle
EP3128386B1 (en) Method and device for tracking a moving target from an air vehicle
Odelga et al. Obstacle detection, tracking and avoidance for a teleoperated UAV
US20220244746A1 (en) Method of controlling an aircraft, flight control device for an aircraft, and aircraft with such flight control device
JP2015006874A (en) Systems and methods for autonomous landing using three dimensional evidence grid
US20200012296A1 (en) Unmanned aerial vehicle landing area detection
CN111338383B (en) GAAS-based autonomous flight method and system, and storage medium
US10139493B1 (en) Rotor safety system
EP3210091B1 (en) Optimal safe landing area determination
US10937325B2 (en) Collision avoidance system, depth imaging system, vehicle, obstacle map generator, and methods thereof
EP3771956B1 (en) Systems and methods for generating flight paths for navigating an aircraft
Ortiz et al. Vessel inspection: A micro-aerial vehicle-based approach
CN112789672A (en) Control and navigation system, attitude optimization, mapping and positioning technology
CN112596071A (en) Unmanned aerial vehicle autonomous positioning method and device and unmanned aerial vehicle
CN112379681A (en) Unmanned aerial vehicle obstacle avoidance flight method and device and unmanned aerial vehicle
CN112378397A (en) Unmanned aerial vehicle target tracking method and device and unmanned aerial vehicle
CN111615677B (en) Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium
CN114379802A (en) Automatic safe landing place selection for unmanned flight system
US20230095700A1 (en) Vehicle flight control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle
CN112380933A (en) Method and device for identifying target by unmanned aerial vehicle and unmanned aerial vehicle
CN114721441B (en) Multi-information-source integrated vehicle-mounted unmanned aerial vehicle autonomous landing control method and device
Rydell et al. Autonomous UAV-based forest mapping below the canopy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant