CN111615677A - Safe landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and medium - Google Patents
- Publication number: CN111615677A (application CN201880066282.1A)
- Authority: CN (China)
- Legal status: Granted (status assumed by Google Patents; not a legal conclusion)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
Abstract
A safe landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a computer storage medium are provided. The method comprises: when the unmanned aerial vehicle loses its navigation signal, determining a return target position and a first return path for the unmanned aerial vehicle (S101); controlling the unmanned aerial vehicle to return based on the first return path and the return target position (S102); during the return flight, when the current position of the unmanned aerial vehicle is within a first preset range of the return target position, detecting safe landing points and recording them for the landing of the unmanned aerial vehicle (S103); and landing according to the recorded safe landing points (S104). The method helps guarantee a safe landing for the unmanned aerial vehicle.
Description
Embodiments of the invention relate to the field of computer technology, and in particular to a safe landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a medium.
During autonomous operation, an agricultural unmanned aerial vehicle may lose its GNSS signal because of environmental interference, hardware faults, or other causes, leaving it unable to obtain position information. To allow the unmanned aerial vehicle to return home when the GNSS signal is lost, the binocular vision module below the fuselage can be used, so that the agricultural unmanned aerial vehicle returns according to the position information that module provides.
However, lighting, the environment, and other factors may prevent the binocular vision module from providing accurate positions, causing large deviations along the return route. Because of these deviations, the unmanned aerial vehicle may land on uneven ground or in water on the way back, so its landing lacks any guarantee of safety.
Disclosure of Invention
Embodiments of the invention provide a safe landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a medium, which help guarantee a safe landing for the unmanned aerial vehicle.
The first aspect of the embodiment of the invention provides a safe landing method of an unmanned aerial vehicle, which comprises the following steps:
when the unmanned aerial vehicle loses its navigation signal, determining a return target position and a first return path for the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to return based on the first return path and the return target position;
during the return flight, when the current position of the unmanned aerial vehicle is within a first preset range of the return target position, detecting a safe landing point and recording it for the landing of the unmanned aerial vehicle;
and landing according to the recorded safe landing point.
A second aspect of the embodiments of the present invention provides a safe landing device applied to an unmanned aerial vehicle, the safe landing device comprising a memory and a processor;
the memory is configured to store program code;
the processor calls the program code and, when the program code is executed, is configured to:
when the unmanned aerial vehicle loses its navigation signal, determine a return target position and a first return path for the unmanned aerial vehicle;
control the unmanned aerial vehicle to return based on the first return path and the return target position;
during the return flight, when the current position of the unmanned aerial vehicle is within a first preset range of the return target position, detect a safe landing point and record it for the landing of the unmanned aerial vehicle;
and land according to the recorded safe landing point.
A third aspect of an embodiment of the present invention is to provide an unmanned aerial vehicle, including:
a body;
a power system arranged on the body and used to provide power for the unmanned aerial vehicle;
and a safe landing device according to the second aspect.
In the embodiments of the invention, when the unmanned aerial vehicle loses its navigation signal, once the return target position and the first return path have been determined, the unmanned aerial vehicle can be controlled to return according to the first return path and the return target position. When the current position of the unmanned aerial vehicle is detected to be within a first preset range of the return target position, safe landing points are detected and recorded, so that the unmanned aerial vehicle can land based on a recorded safe landing point. This provides a reliable guarantee for the safe landing of the unmanned aerial vehicle, improves the efficiency of determining a safe landing point, and effectively saves the unmanned aerial vehicle's processing resources.
To illustrate the technical solutions of the embodiments more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a method for the safe landing of an unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 2 is a schematic flow chart of a method for the safe landing of an unmanned aerial vehicle according to another embodiment of the present invention;
Fig. 3 is a schematic diagram of the observation region and the region to be detected of an unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the observation region and the region to be detected shown in Fig. 3 back-projected into a two-dimensional image according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of an inclined flight attitude of an unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of an unmanned aerial vehicle determining a safe landing point according to a preset trajectory according to an embodiment of the present invention;
Fig. 7 is a schematic flow chart of a method for the safe landing of an unmanned aerial vehicle according to another embodiment of the present invention;
Fig. 8 is a schematic diagram of the return flight flow of an unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 9 is a schematic block diagram of a safe landing device according to an embodiment of the present invention.
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art without inventive effort based on these embodiments fall within the scope of the present invention.
An unmanned aerial vehicle may lose its navigation signal during operation due to environmental interference, hardware failure, or other factors. The navigation signal may include a signal from a positioning sensor and/or a signal from a compass. The positioning sensor may include a Global Navigation Satellite System (GNSS) module on the unmanned aerial vehicle. For example, an agricultural unmanned aerial vehicle performing pesticide spraying may lose its navigation signal because environmental interference disrupts the GNSS module (colloquially, the drone "loses its satellites"), or because a hardware fault disables the compass. When the navigation signal is lost, the unmanned aerial vehicle cannot obtain reliable coordinates or heading from the GNSS module or compass, and therefore cannot execute an accurate return flight. To allow a vehicle that has lost its navigation signal to return to the return target position, or to a position nearby, a visual return flight can be executed using the vehicle's binocular vision module. In some embodiments the return target position is called the Home point. The Home point may be the departure position of the unmanned aerial vehicle, or a position set by the user (not necessarily the departure position). During the return flight, the unmanned aerial vehicle takes the Home point as the return target position and flies toward it.
Currently, when an unmanned aerial vehicle returns home based on its binocular vision module, its departure position can be taken as the return target position at the moment the navigation signal is lost. A first return direction is then determined from the most recently recorded heading toward the return target position before the signal was lost, and a first return path is determined from the first return direction and the return target position. When the unmanned aerial vehicle detects the loss of the navigation signal, it returns toward the return target position along the first return path. Upon arriving at the return target position, it performs safe-landing-point detection and lands if a safe landing point is detected; otherwise it must search for a safe landing point anew before landing.
With this existing return method, although safe-landing-point detection is performed on the current position once the unmanned aerial vehicle reaches the return target position, if the result indicates the current position is not a safe landing point, the unmanned aerial vehicle must plan a new flight route and search for a safe landing point along it. This reduces the efficiency of determining a safe landing point and wastes the unmanned aerial vehicle's processing resources.
Accordingly, this application provides a safe landing method for an unmanned aerial vehicle that improves the efficiency of determining a safe landing point, effectively saves processing resources, and at the same time provides a reliable guarantee for the safe landing of the unmanned aerial vehicle.
Referring to Fig. 1, a schematic flow chart of a method for the safe landing of an unmanned aerial vehicle according to an embodiment of the present invention, the method may include the following steps:
S101: when the unmanned aerial vehicle loses its navigation signal, determine a return target position and a first return path for the unmanned aerial vehicle.
In one embodiment, a positioning failure during flight can cause the unmanned aerial vehicle to lose its navigation signal, for example when its GNSS fails. On detecting the loss, the unmanned aerial vehicle can trigger the binocular vision module below the fuselage to perform a visual return flight toward a preset return target position. In one embodiment, the preset return target position is determined from at least one preset return position, which may be, for example, a position preset in the unmanned aerial vehicle for battery replacement or flight operations, or its departure position. If the navigation signal is lost only because the positioning sensor's signal is lost, for example a GNSS failure, the first return direction can be determined with the compass, and the first return path determined by combining the determined return target position with that first return direction.
In one embodiment, the unmanned aerial vehicle may also lose its navigation signal if the compass fails in flight. In that case, it can perform a visual return using the binocular vision module and determine its heading from the coordinate signal of the positioning sensor during the return. The preset return target position may be a return position as described above. The unmanned aerial vehicle can then rely on the positioning sensor, such as the GNSS module, to determine its position and combine it with the determined return target position to obtain the first return path.
In another embodiment, if both the positioning sensor signal and the compass signal are lost, for example when both devices fail at once, the unmanned aerial vehicle can recognize the surrounding environment with its vision module, localize its position and heading against that environment, and determine the first return path and the return target position by combining the position and heading information recorded before the navigation signal was lost.
From the return target position to the position where the navigation signal was lost, the positioning device of the unmanned aerial vehicle works normally and can update and record the flight position in real time. The return target position can be recorded at takeoff. During normal flight, the working positioning device updates and records the current position in real time and, from the current position and the return target position, updates and records the heading toward the return target position.
For example, the unmanned aerial vehicle may record its attitude in real time with its onboard Inertial Measurement Unit (IMU), thereby recording its flight direction and the heading toward the return target position. When the navigation signal is lost, for example when the GNSS module loses its signal due to interference or a fault, any of the at least one preset return positions can be selected as the return target position, the last flight position recorded before the loss can be taken as the return starting position, and the last recorded heading toward the return target position can be taken as the first return direction. The first return path can then be determined from the first return direction, the return target position, and the return starting position; the first return direction is the direction from the return starting position toward the return target position. For example, the unmanned aerial vehicle may determine a straight path toward the Home point from the return starting position and the first return direction as the first return path; or it may reconstruct the original flight path from the return starting position, the Home point, and the flight positions recorded along the way, and return along that path to the Home point. Of course, the unmanned aerial vehicle may generate the first return path in other ways, which are not limited here.
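The straight-path option above can be sketched as follows. This is an illustrative sketch in a local planar coordinate frame (meters), not the patent's implementation; all function names are hypothetical:

```python
import math

def first_return_direction(start_xy, home_xy):
    """Unit vector from the return starting position toward the Home point.

    start_xy: last flight position recorded before the navigation signal
    was lost (return starting position). home_xy: the return target position.
    """
    dx = home_xy[0] - start_xy[0]
    dy = home_xy[1] - start_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)  # already at the Home point
    return (dx / dist, dy / dist)

def straight_first_return_path(start_xy, home_xy, step=1.0):
    """Sample a straight-line first return path at `step`-meter spacing."""
    direction = first_return_direction(start_xy, home_xy)
    dist = math.hypot(home_xy[0] - start_xy[0], home_xy[1] - start_xy[1])
    n = int(dist // step)
    path = [(start_xy[0] + direction[0] * step * i,
             start_xy[1] + direction[1] * step * i) for i in range(n + 1)]
    path.append(home_xy)  # always end exactly at the Home point
    return path
```

The original-flight-path variant would instead replay the recorded waypoints in reverse; only the straight-line case is sketched here.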
S102: control the unmanned aerial vehicle to return based on the first return path and the return target position.
In one embodiment, the unmanned aerial vehicle may take any return position recorded by a positioning device such as the GNSS module as its return target position; such a position may be, for example, a position used for flight operations or battery replacement, a position where an agricultural unmanned aerial vehicle sprays pesticide, or a position set by the user. In one embodiment, the unmanned aerial vehicle may adjust the first return path in real time based on its current position during the return, and is controlled to fly toward the Home point (i.e., the return target position) according to the first return path corresponding to each current position.
In some cases, the first return path is essentially unchanged, e.g., the unmanned aerial vehicle flies directly from the return starting position toward the return target position. In other cases the path changes: for example, the unmanned aerial vehicle encounters an obstacle and the obstacle avoidance function alters the first return path. It may then rejoin the original first return path after bypassing the obstacle, or continue returning along the altered path.
When controlling the return based on the first return path and the return target position, the unmanned aerial vehicle can also refer to position information provided by a visual odometer, and can determine its current position from that information.
S103: during the return flight, when the current position of the unmanned aerial vehicle is within a first preset range of the return target position, perform safe-landing-point detection and record safe landing points for the landing of the unmanned aerial vehicle.
During the visual return flight after the navigation signal is lost, accurate coordinates cannot be obtained through GNSS, and errors from the visual odometer and/or compass accumulate along the way, so the unmanned aerial vehicle may not return exactly to the return target position; its final position may only be near the return target position. Since a position near the return target position may not be suitable for a safe landing, the unmanned aerial vehicle must determine a safe landing point for itself. To improve the efficiency of finding one, safe-landing-point detection can be performed on the current position already during the return flight, and any safe landing point found is recorded, so that in step S104 the unmanned aerial vehicle can land according to the recorded safe landing point.
In an embodiment, the unmanned aerial vehicle may begin safe-landing-point detection once its current position comes within the first preset range of the return target position, and record (i.e., store) each detected safe landing point. Then, if the spot at or near the Home point (i.e., the return target position) turns out to be unsafe when the vehicle lands, step S104 can land it on a recorded safe landing point instead, avoiding a repeated search for a safe landing point, saving processing resources, and improving the success rate of landing on a safe point. For example, when the current position is within 30 meters of the return target position, the unmanned aerial vehicle may start safe-landing-point detection and record the detected points, while still flying toward the Home point without landing.
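The range-triggered detection described above can be sketched as follows, assuming planar coordinates in meters and treating the plane/water detection pipeline as an opaque `is_safe` callback. All names and the 30 m figure (the text's example) are illustrative:

```python
import math

FIRST_PRESET_RANGE_M = 30.0  # example threshold from the text: 30 meters

def within_first_preset_range(current_xy, home_xy, radius=FIRST_PRESET_RANGE_M):
    """True once the current position is within the preset range of Home."""
    return math.hypot(current_xy[0] - home_xy[0],
                      current_xy[1] - home_xy[1]) <= radius

def maybe_record_safe_point(current_xy, home_xy, is_safe, recorded):
    """Detection step during the return flight: inside the preset range,
    run safe-landing-point detection on the current position and store any
    hit; the drone keeps flying toward Home regardless."""
    if within_first_preset_range(current_xy, home_xy) and is_safe(current_xy):
        recorded.append(current_xy)
    return recorded
```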
S104: land according to the recorded safe landing point.
In one embodiment, when the unmanned aerial vehicle reaches the return target position or a position nearby, it performs safe-landing-point detection on its current position. If the current position is a safe landing point, it lands directly while continuing to check the spot during the descent. If the spot is found unsafe mid-descent, the vehicle climbs back to its pre-landing altitude and flies to a recorded safe landing point to land. If the spot remains safe throughout the descent, the vehicle keeps descending until its height above the ground is zero, and then shuts down the power system to settle on the ground.
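One step of this descend-and-recheck loop might look like the following sketch. The 0.5 m descent increment and the action strings are invented for illustration; the text does not specify them:

```python
def descend_step(is_safe, altitude, pre_landing_altitude, step=0.5):
    """One iteration of the landing loop described above.

    is_safe: result of safe-landing-point detection at the current spot.
    Returns (next_altitude, action): keep descending while the spot stays
    safe; if it turns unsafe mid-descent, climb back to the pre-landing
    altitude and divert to a recorded safe point; at zero height, cut power.
    """
    if not is_safe:
        return pre_landing_altitude, "divert_to_recorded_point"
    if altitude <= 0.0:
        return 0.0, "motors_off"
    return max(0.0, altitude - step), "descending"
```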
In another embodiment, if the current position is detected not to be a safe landing point, landing is performed at a safe landing point recorded within the first preset range of the return target position. Since the position information of each safe landing point was recorded, a second return path can be determined from the current position of the unmanned aerial vehicle and the recorded position information: the route from the current position to the recorded safe landing point. In some cases, the second return direction corresponding to this second return path is opposite to the first return direction of the first return path, so when flying to the safe landing point the unmanned aerial vehicle can simply take the direction opposite the first return direction as the second return direction.
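A minimal sketch of this fallback: reverse the first return direction and pick a recorded safe point to fly to. Choosing the nearest recorded point is an assumption for illustration; the text does not fix which recorded point is used:

```python
import math

def second_return_direction(first_dir):
    """As noted above, the second return direction can simply be taken
    opposite to the first return direction (unit vector negated)."""
    return (-first_dir[0], -first_dir[1])

def nearest_recorded_safe_point(current_xy, recorded_points):
    """Fallback landing target: the recorded safe landing point closest
    to the current position (an illustrative choice)."""
    if not recorded_points:
        return None
    return min(recorded_points,
               key=lambda p: math.hypot(p[0] - current_xy[0],
                                        p[1] - current_xy[1]))
```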
In the embodiments of the invention, when the unmanned aerial vehicle loses its navigation signal, once the return target position and the first return path have been determined, the unmanned aerial vehicle can be controlled to return according to the first return path and the return target position. When the current position is detected to be within a first preset range of the return target position, safe landing points are detected and recorded, so that the unmanned aerial vehicle can land based on a recorded safe landing point. This provides a reliable guarantee for the safe landing of the unmanned aerial vehicle, improves the efficiency of determining a safe landing point, and effectively saves the unmanned aerial vehicle's processing resources.
The safe-landing-point detection performed by the unmanned aerial vehicle at its current position is described in detail below. As shown in Fig. 2, the method includes the following steps:
s201, when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle.
S202, controlling the unmanned aerial vehicle to carry out return voyage based on the first return voyage path and the return voyage target position.
In an embodiment, the detailed implementation of step S201 and step S202 can refer to the description of step S101 and step S102, which is not described herein again.
S203, when the current position of the unmanned aerial vehicle is within a first preset range from the return target position in the return process, carrying out plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, and carrying out water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm.
And S204, recording a safe landing point for landing of the unmanned aerial vehicle.
In steps S203 and S204, the unmanned aerial vehicle may perform plane detection on its current position based on a binocular vision sensor, and/or perform water surface detection on the current position. When performing plane detection according to the preset plane detection algorithm, a region to be detected is first selected from the observation region corresponding to the current position. In one embodiment, the observation region of the downward-looking binocular vision sensor (i.e., the binocular vision module) is relatively large; region 301 in Fig. 3 (the area enclosed by P1-3D, P2-3D, P3-3D, and P4-3D) is assumed to be the observation region of the binocular vision sensor. When performing plane detection on the current position, only the area under the maximum circumscribed circle of the unmanned aerial vehicle in physical space needs to be checked for flatness; it is unnecessary to test whether every area observable by the binocular vision sensor is flat.
If the selected region to be detected is too large, the plane detection result for the current position will be wrong: even if the ground under the maximum circumscribed circle of the unmanned aerial vehicle is flat, an enlarged region may include uneven ground, causing the unmanned aerial vehicle to judge the current position as non-planar and mis-detect the safe landing point. If the selected region is too small, the result is also wrong: the ground at the current position may be uneven, but a too-small region may happen to cover a flat patch, causing the unmanned aerial vehicle to mistake the uneven position for a flat one, land on the uneven ground, and suffer a safety failure.
Therefore, when performing plane detection, a region 302 to be detected of appropriate size must be selected from the observation region 301. In one embodiment, the selected region 302 may be a square region of 2 meters by 2 meters in physical space, and whether the current position is a plane is determined according to the plane detection result on that region. In one embodiment, the observation region 301 may also be referred to as the detection range of the unmanned aerial vehicle, and the region 302 to be detected may be referred to as the image Region Of Interest (ROI). It is understood that the size of the selected region 302 is not limited to 2 meters by 2 meters; in some cases it may be determined according to the size of the unmanned aerial vehicle, for example using the size of the drone's projection onto the ground during flight as the size of region 302, or that projection enlarged by 50%.
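The sizing rule just described (start from the drone's ground-projection size, optionally enlarge it by 50%, and never go below the 2 m by 2 m default) can be sketched as follows; the function name and default parameters are illustrative assumptions, not part of the patent:

```python
def roi_side_length(drone_diameter_m: float,
                    margin: float = 0.5,
                    min_side_m: float = 2.0) -> float:
    """Side length (m) of the square region 302 to be detected.

    Starts from the drone's ground-projection diameter, enlarges it by
    `margin` (50% by default, as in the text), and never returns less
    than the 2 m default mentioned in the embodiment.
    """
    return max(min_side_m, drone_diameter_m * (1.0 + margin))
```

For a small drone whose projection is 1 m across, the 2 m floor applies; for a 2 m agricultural drone, the enlarged projection (3 m) is used instead.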
After the region to be detected is determined from the observation region of the unmanned aerial vehicle, the two-dimensional projection image corresponding to the region to be detected may be determined. Specifically, the observation region image and the region-to-be-detected image in the two-dimensional image may be obtained by back-projecting the observation region 301 and the region to be detected 302 in three-dimensional space shown in Fig. 3: the observation region 301 in three-dimensional space corresponds to the observation region image 401 in the two-dimensional image shown in Fig. 4 (i.e., the region enclosed by P1-2D, P2-2D, P3-2D, and P4-2D), and the region to be detected 302 corresponds to the region-to-be-detected image 402 in the two-dimensional image. In one embodiment, the back-projection rule may specifically be:
S · P2d = K · (Rci · P3d + Tci)   (Formula 2.1)
where P3d is a three-dimensional space point of the region to be detected in three-dimensional space, P2d is the two-dimensional image point corresponding to P3d, S is a scale factor, K is the camera intrinsic matrix, and (Rci, Tci) are the camera extrinsic parameters (R, T). Each three-dimensional space point of the region to be detected can be back-projected according to the rule of Formula 2.1, so that the two-dimensional projection image 402 corresponding to the region to be detected is determined; the points in the two-dimensional projection image 402 are the points to be observed and processed.
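The back-projection of Formula 2.1 can be sketched directly in code; the pinhole-camera conventions (world-to-camera extrinsics, column-vector points) are standard assumptions, and the function name is illustrative:

```python
import numpy as np

def project_to_image(P3d, K, R, T):
    """Back-project 3D points into the image per S * P2d = K * (R * P3d + T).

    P3d: (N, 3) points in the world/ground frame.
    K:   (3, 3) camera intrinsic matrix.
    R, T: camera extrinsics (world -> camera rotation and translation).
    Returns (N, 2) pixel coordinates (u, v).
    """
    P3d = np.asarray(P3d, dtype=float)
    cam = R @ P3d.T + T.reshape(3, 1)   # points in the camera frame
    pix = K @ cam                       # homogeneous pixels; scale S is the depth
    return (pix[:2] / pix[2]).T         # divide by S to obtain (u, v)
```

With identity intrinsics and extrinsics, a point at depth 4 with lateral offsets (1, 2) projects to pixel (0.25, 0.5), illustrating the division by the scale factor.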
After the two-dimensional projection image corresponding to the region to be detected is determined, the points in the two-dimensional projection image are converted into three-dimensional space points in the ground coordinate system. These three-dimensional space points are the real observation points, and the height of any object rising above the plane in actual space is represented by the protrusion height of the corresponding points; for example, if the region to be detected below the unmanned aerial vehicle contains a tree, the shape of the point cloud includes a bulge resembling the treetop.
In one embodiment, a binocular vision module may be used to convert points in the two-dimensional projection image into three-dimensional space points in a ground coordinate system, that is, a binocular vision depth map may be used to convert a physical scene of an area to be detected below an aircraft into a three-dimensional space point set, so that a plane detection may be performed on a current position according to the three-dimensional space point set.
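The conversion from the binocular depth map to a three-dimensional point set in the ground frame can be sketched as the inverse of Formula 2.1; the exact frame conventions (ground-to-camera extrinsics, depth measured along the camera z axis, zero marking invalid pixels) are assumptions for illustration:

```python
import numpy as np

def depth_to_ground_points(depth, K, R, T):
    """Convert a binocular depth map into 3D points in the ground frame.

    depth: (H, W) metric depth along the camera z axis (0 = invalid pixel).
    K: camera intrinsics; R, T: ground -> camera extrinsics, so a
    camera-frame point Pc maps back to the ground frame as R.T @ (Pc - T).
    Returns (M, 3) points for the M valid pixels.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    valid = depth > 0
    pix = np.stack([u[valid], v[valid], np.ones(valid.sum())])  # homogeneous pixels
    cam = np.linalg.inv(K) @ pix * depth[valid]                 # camera-frame points
    return (R.T @ (cam - T.reshape(3, 1))).T                    # ground-frame points
```

The resulting point set is exactly the input expected by the plane detection that follows.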
When the plane detection is performed on the current position according to the three-dimensional space point set, two specific embodiments are provided:
(1) Fit, using a plane equation, the plane closest to the three-dimensional space point set, then evaluate the percentage of interior points of the set with respect to the fitted plane; if the interior-point percentage meets a preset value, the fitted plane is determined to be a plane, and its degree of inclination is determined from its normal vector. The plane equation is:
ax + by + cz + d = 0   (Formula 2.2)
In one embodiment, fitting the plane equation parameters of Formula 2.2 amounts to solving the linear system Ax = 0: a large number of points [x, y, z] from the three-dimensional space point set, together with the parameters a, b, c, and d to be solved, form the over-determined system of Formula 2.3, each row of A being [xi, yi, zi, 1] and the unknown vector being [a, b, c, d]^T, so the plane equation can be solved with the Random Sample Consensus algorithm (RANSAC).
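A common minimal-sample RANSAC variant for this fit can be sketched as follows; the patent's exact solver for the over-determined system may differ, and the thresholds and iteration count here are illustrative assumptions:

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.05, iters=200, seed=0):
    """Fit ax + by + cz + d = 0 to a point set with RANSAC.

    Repeatedly samples 3 points, forms a candidate plane from them,
    counts inliers within dist_thresh of that plane, and keeps the
    candidate with the most inliers. Returns ((a, b, c, d), inlier_ratio),
    with (a, b, c) a unit normal vector.
    """
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best, best_inl = None, -1
    for _ in range(iters):
        p0, p1, p2 = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:          # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        d = -n @ p0
        dist = np.abs(pts @ n + d)            # point-to-plane distances
        inl = int((dist <= dist_thresh).sum())
        if inl > best_inl:
            best, best_inl = (*n, d), inl
    return best, best_inl / len(pts)
```

The returned normal (a, b, c) gives the inclination check described above: the closer |c| is to 1, the closer the fitted plane is to horizontal.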
(2) Performing plane detection on the current position means determining whether the region to be detected corresponding to the current position is a safe plane, and a safe plane should be close to horizontal. Therefore the plane to be fitted may be forced to be a horizontal plane of the form cz + d = 0; the percentage of interior points of the three-dimensional space point set with respect to this forced horizontal plane is then evaluated, and whether the current position is flat is determined from that percentage.
In a second embodiment, the unmanned aerial vehicle may first obtain a standard plane equation, calculate a distance between any spatial three-dimensional point in the three-dimensional space point set and the standard plane equation, determine the number of interior points in the three-dimensional space point set according to the distance, where the interior points are three-dimensional space points whose distance is less than or equal to a preset distance threshold, and determine that the current position is a plane when the number of interior points is greater than or equal to a preset number threshold.
When the second method is used to determine whether the current position is flat, because the forcibly fitted horizontal plane equation is cz + d = 0, the parameters a and b in the over-determined system of Formula 2.3 may be set to 0 before solving, which makes the parameter solving process simpler.
After the over-determined system of Formula 2.3 is solved to obtain a plane equation, each spatial point in the three-dimensional space point set can be classified as an interior or exterior point according to its distance from the fitted plane. In one embodiment, any spatial point in the set can be selected as an observation point and its distance D from the plane computed; when D is smaller than or equal to a preset distance threshold, the observation point is classified as an interior point, and when D is larger than the threshold, it is classified as an exterior point. If the observation point coordinates are (x, y, z), its distance from the plane of Formula 2.2 can be calculated according to:

D = |ax + by + cz + d| / √(a² + b² + c²)   (Formula 2.4)
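The interior/exterior split of Formula 2.4, and the forced-horizontal variant of the second embodiment, can be sketched as follows; the distance threshold, inlier-ratio threshold, and the use of the median height as the forced-plane offset are illustrative assumptions:

```python
import numpy as np

def split_inliers(points, plane, dist_thresh=0.05):
    """Classify points as interior/exterior to ax+by+cz+d=0 (Formula 2.4)."""
    a, b, c, d = plane
    pts = np.asarray(points, dtype=float)
    dist = np.abs(pts @ np.array([a, b, c]) + d) / np.sqrt(a*a + b*b + c*c)
    mask = dist <= dist_thresh
    return pts[mask], pts[~mask]

def is_flat_forced_horizontal(points, dist_thresh=0.05, inlier_ratio=0.9):
    """Second embodiment: force the plane to be horizontal (cz + d = 0,
    i.e. z = const) and test the interior-point percentage. Estimating
    the constant height as the median z is an assumption."""
    z = np.asarray(points, dtype=float)[:, 2]
    inliers = np.abs(z - np.median(z)) <= dist_thresh
    return inliers.mean() >= inlier_ratio
```

With the plane forced horizontal, only the z coordinates matter, which is exactly why the solve becomes simpler after setting a and b to zero.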
It should be noted that if the unmanned aerial vehicle flies in the tilted attitude shown in Fig. 5 and performs safe landing point detection in that attitude, an inclined angle exists between the flight attitude and the horizontal plane. Therefore, when the above method is used to perform plane detection on the current position, the flight height of the unmanned aerial vehicle needs to be compensated according to the flight attitude, the current position, the tilt angle, and the flight height H, and plane detection is then performed on the current position of the unmanned aerial vehicle according to the compensated flight height.
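One plausible reading of this compensation (an assumption; the patent does not give the exact formula) is that the downward sensor measures range along the tilted optical axis, so the true vertical height is the measured range scaled by the cosine of the tilt angle:

```python
import math

def compensated_height(measured_range: float, tilt_rad: float) -> float:
    """Illustrative tilt compensation (assumption, not the patent's formula):
    convert range measured along a tilted view axis into vertical height."""
    return measured_range * math.cos(tilt_rad)
```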
In another embodiment, when water surface detection is performed on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm, the region to be detected 302 may be determined from the observation region 301 corresponding to the current position according to the binocular vision sensor of the unmanned aerial vehicle, and the two-dimensional projection image corresponding to region 302 (such as region 402 in Fig. 4) is determined according to the camera's back-projection rule. The water surface detection algorithm is based on a Convolutional Neural Network (CNN) model trained on gray-scale images: the two-dimensional projection image corresponding to the region to be detected is input into the CNN model, the CNN model detects whether the region corresponding to each frame of the image contains a water surface, and the detection result is output to determine whether the current position is a water surface.
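The patent's trained CNN cannot be reproduced here, so the following is only a toy stand-in that illustrates the input/output shape of the detector (gray-scale patch in, water/not-water decision out); the Laplacian low-texture heuristic and the threshold are assumptions and are not the patent's model:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D convolution (toy stand-in for a CNN layer)."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

def looks_like_water(gray_patch, texture_thresh=0.05):
    """Toy proxy for the CNN water detector: still water tends to be
    low-texture, so score the mean magnitude of a Laplacian edge response.
    A real system would run a trained CNN on the gray-scale patch."""
    lap = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)
    response = np.abs(conv2d(np.asarray(gray_patch, dtype=float), lap))
    return response.mean() < texture_thresh
```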
In one embodiment, whether the current position is a safe landing point is determined according to the plane detection result and the water surface detection result; a determined safe landing point is a position that is a plane and is not a water surface. The determined safe landing point is recorded, and when it is recorded, its position relative to the return target position is recorded as well.
In one embodiment, other algorithms may be used to perform plane detection and water surface detection for the current position, for example conventional vision-based detection methods for plane detection and/or water surface detection.
S205, when the current position of the unmanned aerial vehicle is within a second preset range from the return target position in the return process, detecting a safe landing point.
S206, if the detection result is that no safe landing point is detected, executing the step of landing according to the recorded safe landing point.
S207, if the detection result is that a safe landing point is detected, landing according to the detected safe landing point.
In steps S205 to S207, when the unmanned aerial vehicle is within the first preset range from the return target position, safe landing points at the current position may be detected, and those at which the unmanned aerial vehicle can land may be recorded. The unmanned aerial vehicle then continues flying until it is within the second preset range from the return target position, i.e., in the vicinity of the return target position. The first preset range is larger than the second preset range: the first preset range may be, for example, 20 meters or 10 meters, and the second preset range may be, for example, 5 meters or 10 meters.
And when the current position of the unmanned aerial vehicle is within a second preset range from the return target position, if the detection result after the safety landing point detection is that the safety landing point is not detected, executing the step S206 of landing according to the recorded safety landing point.
When landing according to a recorded safe landing point, a second return route may be determined from the current position of the unmanned aerial vehicle and the recorded safe landing point; this second return route points from the current position (i.e., a position near the return target position) toward the recorded safe landing point. The unmanned aerial vehicle can then fly to the recorded safe landing point along the second return route, and when the binocular vision sensor determines that the current position of the unmanned aerial vehicle is the recorded safe landing point, the unmanned aerial vehicle is controlled to land.
In an embodiment, if the detection result is that a safe landing point is detected, step S207 is executed to land according to the detected safe landing point. Specifically, the unmanned aerial vehicle needs to continuously verify during the descent that the detected safe landing point remains safe: since the unmanned aerial vehicle is within the second preset range from the return target position, which is usually a place where the user changes batteries, a roadside, or the like, someone may well appear between the moment the current position is detected as a safe landing point and the moment of touchdown. If the safe landing point is not continuously re-checked during the descent, the unmanned aerial vehicle may land at an unsafe position and cause a safety failure.
If the unmanned aerial vehicle continuously confirms that the detected safe landing point is safe, it lands successfully on that point. Otherwise, if the safe landing point is found to be unsafe during the descent, the current flight height of the unmanned aerial vehicle is adjusted to a preset flight height, namely the flight height used during the return (i.e., the height before the descent began), and the step of landing according to the recorded safe landing point in S206 is executed.
The preset flight height is set during the return to keep the binocular vision module below the unmanned aerial vehicle within its higher-precision range, and may be, for example, 2 meters or 2.5 meters.
In another embodiment, when the unmanned aerial vehicle is within the second preset range from the return target position during the return, no safe landing point is detected, and it is determined that no safe landing point was recorded within the first preset range, safe landing point detection may be performed along a preset trajectory within a third preset range from the return target position. The preset trajectory includes one or more of a spiral trajectory, a zigzag trajectory (for example a "Z"-shaped trajectory), or a straight-line trajectory. When a safe landing point is detected on the preset trajectory, the unmanned aerial vehicle lands according to the detected safe landing point.
For example, suppose the drone returns from point B (the return starting position) to point A (the return target position) as shown in Fig. 6. When it reaches point D near the return target position (i.e., when its current position is within the second preset range from the return target position), the current position (point D) is checked as a candidate safe landing point. If point D is determined to be unsafe, and the unmanned aerial vehicle has no safe landing point recorded within the first preset range, a safe landing point may be searched for starting from point D along a preset trajectory, such as the gray spiral trajectory shown in the figure; if a safe landing point is detected at point K of the spiral trajectory, the drone lands at point K.
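A spiral search track like the one in Fig. 6 can be generated as a list of waypoints bounded by the third preset range; the Archimedean-spiral parameterization, step size, and points-per-turn below are illustrative assumptions (the patent only requires some spiral, zigzag, or straight-line trajectory):

```python
import math

def spiral_waypoints(center, max_radius, step=0.5, pts_per_turn=12):
    """Archimedean spiral waypoints around the return target position,
    bounded by the third preset range (max_radius, in meters).

    The radius grows by `step` meters per full turn, so the search
    sweeps outward from the center until the range limit is reached.
    """
    cx, cy = center
    waypoints, k = [], 0
    while True:
        theta = 2 * math.pi * k / pts_per_turn
        r = step * theta / (2 * math.pi)     # radius grows `step` per turn
        if r > max_radius:
            break
        waypoints.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
        k += 1
    return waypoints
```

The drone would fly these waypoints in order, running safe landing point detection at each, and land as soon as one passes the check.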
In another embodiment, when the unmanned aerial vehicle does not detect a safe landing point within a second preset range from the return target position in the return process, and determines that the safe landing point is not recorded within a first preset range from the return target position, the unmanned aerial vehicle can be controlled to hover at the return target position to wait for an operation instruction of a user, and when the unmanned aerial vehicle has a low power alarm, the unmanned aerial vehicle can be forced to land.
In another embodiment, when the current position of the unmanned aerial vehicle and the return target position coincide with each other, that is, when the unmanned aerial vehicle performs visual return to the return target position, the safety landing point detection may be performed on the return target position, if the return target position is detected to be the safety landing point, the landing may be performed on the return target position, and if the return target position is detected not to be the safety landing point, the landing may be performed according to the recorded safety landing point in step S206.
In the embodiment of the invention, if the unmanned aerial vehicle loses the navigation signal, then after the return target position and the first return path are determined, the unmanned aerial vehicle can be controlled to return based on the first return path and the return target position. When the current position of the unmanned aerial vehicle is within the first preset range from the return target position, plane detection and water surface detection are performed on the current position according to the preset plane detection algorithm and the preset water surface detection algorithm, which safeguards the return and landing process and effectively reduces the possibility of a safety failure after landing. Because positions determined to be planes and/or not water surfaces are recorded as safe landing points, the unmanned aerial vehicle can perform safe landing point detection and attempt a safe landing when it returns to within the second preset range from the return target position: if no safe landing point is detected at the current position within the second preset range, it lands according to a recorded safe landing point, and if one is detected, it lands directly. This speeds up the determination of a safe landing point and therefore the safe landing of the unmanned aerial vehicle.
In the embodiment of the invention, an application scene diagram of a safe landing method based on an unmanned aerial vehicle is provided, as shown in fig. 7, the unmanned aerial vehicle takes off from a point a and flies to a point B according to a navigation signal provided by a GNSS, a gray curve is used in fig. 7 to identify a flight trajectory of the unmanned aerial vehicle from the point a to the point B, and in the process of flying from the point a to the point B, the position information of the unmanned aerial vehicle is refreshed and recorded in real time according to reliable coordinate information provided by the GNSS.
If the unmanned aerial vehicle loses the navigation signal at point B due to a GNSS failure, refer to the schematic flow diagram of safe landing shown in Fig. 8: upon determining that the navigation signal is lost, the unmanned aerial vehicle may activate the binocular vision module below it and perform visual return according to the position information provided by the visual odometer. For the visual return, the unmanned aerial vehicle may select any one of at least one preset return position as its return target position; assume the selected return target position is the departure position, i.e., the position marked by point A in Fig. 7. The direction toward the return target position recorded by the unmanned aerial vehicle before losing the navigation signal is taken as the first return direction, indicated by the black curve in Fig. 7, and the first return path (i.e., the path indicated by the black curve) is determined from the first return direction and the return target position, the first return direction being the direction from the return starting position to the return target position. The unmanned aerial vehicle can thus be controlled to return from the return starting position B to the return target position A as indicated by the first return path.
When the vision odometer determines that the current position of the unmanned aerial vehicle reaches a first preset range from a return flight target position, namely when the unmanned aerial vehicle is at point C shown in fig. 7, in order to avoid the process that the unmanned aerial vehicle repeatedly detects a safe landing point, the unmanned aerial vehicle can be controlled to start to perform safe landing point detection, and record the detected safe landing point. In one embodiment, the detected and recorded security drop points are assumed to be points a, b, and c marked with stars in FIG. 7.
Due to the return error caused by visual return, the unmanned aerial vehicle may fly, as indicated by the first return path, to a point D within the second preset range from the return target position and start a safe landing attempt; it may also return correctly to point A and attempt a safe landing there. If the safe landing attempt at point A or point D fails, the recorded safe landing points (i.e., points a, b, and c) are searched, and the recorded safe landing point closest to point A or point D, namely point a, is selected. After the flight height of the aircraft is adjusted to the preset flight height, the first return path is adjusted to the second return path, and the aircraft flies from point D to point a as indicated by the second return path and performs a safe landing.
If, upon returning to the return target position, the aircraft does not find a recorded safe landing point and fails to land several times, or no safe landing point is detected within the second preset range, it can hover in place and wait for the user to take over; in case of a critically low battery, it raises an alarm and performs a forced landing. It can be understood that when hovering at the return target position, the unmanned aerial vehicle may hover at the actual return target position, i.e., point A, or at point D due to the visual return error.
An embodiment of the present invention provides a safe landing device applied to an unmanned aerial vehicle. Fig. 9 is a structural diagram of such a device: as shown in Fig. 9, the safe landing device 900 applied to an unmanned aerial vehicle includes a memory 901 and a processor 902, where the memory 901 stores program code, the processor 902 calls the program code in the memory, and when the program code is executed, the processor 902 performs the following operations:
when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to carry out return voyage based on the first return voyage path and the return voyage target position;
when the current position of the unmanned aerial vehicle is within a first preset range from the return target position in the return process, detecting a safe landing point, and recording the safe landing point for landing of the unmanned aerial vehicle;
and landing according to the recorded safe landing point.
In one embodiment, the processor 902 is further configured to perform the following operations:
when the current position of the unmanned aerial vehicle is within a second preset range from the return target position in the return process, detecting a safe landing point, wherein the first preset range is larger than the second preset range;
if the detection result is that no safe landing point is detected, executing the step of landing according to the recorded safe landing point;
and if the detection result is that a safe landing point is detected, landing according to the detected safe landing point.
In one embodiment, the processor 902, when landing according to the recorded safe landing point, performs the following operations:
determining a second return route according to the current position of the unmanned aerial vehicle and the recorded safe landing point;
flying to the recorded safe landing point based on the second return route;
and when the current position of the unmanned aerial vehicle is the recorded safe landing point, controlling the unmanned aerial vehicle to land.
In one embodiment, the processor 902, when landing according to the detected safe landing point, performs the following operations:
continuously detecting whether the detected safe landing point is safe or not in the landing process;
if the detected safe landing points are detected to be safe continuously, determining that the landing is successful;
if the detected safe landing point is detected to be unsafe, adjusting the current flying height of the unmanned aerial vehicle to be a preset flying height, and executing the step of landing according to the recorded safe landing point;
and the preset flying height is the flying height of the unmanned aerial vehicle during the back-flying.
In one embodiment, the processor 902 is further configured to perform the following operations:
if no safe landing point is recorded within a first preset range from the return target position and no safe landing point is detected within a second preset range from the return target position, performing safe landing point detection according to a preset trajectory within a third preset range from the return target position, wherein the preset trajectory includes a spiral trajectory or a zigzag trajectory;
and when a safe landing point is detected on the preset trajectory, landing according to the detected safe landing point.
In one embodiment, the processor 902 is further configured to perform the following operations:
and if no safe landing point is recorded in a first preset range from the return target position and no safe landing point is detected in a second preset range from the return target position, controlling the unmanned aerial vehicle to hover at the return target position.
In one embodiment, the processor 902 is further configured to perform the following operations:
when the unmanned aerial vehicle reaches the return target position, carrying out safe landing point detection on the return target position;
if the detection result is that the return voyage target position is a safe landing point, performing landing at the return voyage target position;
and if the detection result is that the return voyage target position is not the safe landing point, executing the step of landing according to the recorded safe landing point.
In one embodiment, the processor 902, when determining the return target position and the first return path of the drone, performs the following:
selecting any position from at least one preset return position as a return target position of the unmanned aerial vehicle;
determining a first return direction according to the return target position;
and determining the first return route according to the first return direction and the return target position.
In one embodiment, the processor 902, when controlling the drone to return based on the first return path and the return target location, performs the following operations:
and controlling the unmanned aerial vehicle to return based on the first return route, the return target position and the position information provided by the visual odometer.
In one embodiment, a second return direction corresponding to the second return path is opposite to a first return direction corresponding to the first return path.
In one embodiment, the safe landing point is a location that is a plane and is not a water surface.
In one embodiment, the processor 902, when performing safe landing point detection, performs the following operations:
performing safe landing point detection based on the binocular vision sensor.
In one embodiment, the processor 902, when performing safe landing point detection, performs the following operations:
performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, and performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm.
In one embodiment, the navigation signal includes at least one of: the signal of the positioning sensor, the signal of the compass.
In one embodiment, when performing plane detection on the current position of the drone according to a preset plane detection algorithm, the processor 902 performs the following operations:
determining a region to be detected for carrying out plane detection from an observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
determining a two-dimensional projection image corresponding to the area to be detected;
converting any pixel point in the two-dimensional projection image into a three-dimensional space point to obtain a three-dimensional space point set corresponding to the two-dimensional projection image;
and carrying out plane detection on the current position according to the three-dimensional space point set.
In one embodiment, the processor 902 performs the following operations when performing plane detection on the current position according to the three-dimensional space point set:
acquiring a standard plane equation;
calculating the distance between any three-dimensional space point in the three-dimensional space point set and the standard plane equation, and determining the number of interior points in the three-dimensional space point set according to the distance, wherein the interior points are three-dimensional space points of which the distance is smaller than or equal to a preset distance threshold value;
and when the number of the inner points is greater than or equal to a preset number threshold, determining that the current position is a plane.
In one embodiment, the processor 902 performs the following operations when performing the water surface detection on the current position of the drone according to a preset water surface detection algorithm:
determining a region to be detected for water surface detection from the observation region corresponding to the current position, wherein the region to be detected is smaller than the observation region;
determining a two-dimensional projection image corresponding to the area to be detected;
and inputting the two-dimensional projection image into a convolutional neural network model, and determining whether the current position is the water surface or not according to the output of the convolutional neural network model.
The safety landing device applied to the unmanned aerial vehicle provided by the embodiment can execute the safety landing method provided by the embodiment and shown in fig. 1 and fig. 2, and the execution mode and the beneficial effect are similar, and are not described again here.
The embodiment of the invention provides an unmanned aerial vehicle comprising a body, a power system, and the safe landing device described above. The safe landing device works the same as or similarly to the above, and is not described again here. The power system of the unmanned aerial vehicle may include rotors, motors driving the rotors, and their electronic speed controllers. The unmanned aerial vehicle may be a quadrotor, hexarotor, octorotor, or other multi-rotor unmanned aerial vehicle, in which case it takes off and lands vertically. It will be appreciated that the drone may also be a fixed-wing drone or a hybrid-wing drone.
The unmanned aerial vehicle provided by the embodiment of the invention may further comprise sensors mounted on the body. The sensors comprise a GNSS module for providing position information to the unmanned aerial vehicle, and further include at least one of a binocular vision sensor or a visual odometer. In some embodiments, the binocular vision sensor may be disposed below the drone to acquire images of the area beneath it and generate depth maps, semantic maps, or other information for safe landing point detection. In some embodiments, the visual odometer may be disposed on the front side of the drone to provide odometry information when GNSS signals are lost or the GNSS module fails.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solutions of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described therein may still be modified, or some or all of their technical features may be equivalently replaced, without such modifications or replacements departing from the scope of the technical solutions of the embodiments of the present invention.
Claims (37)
- A safe landing method for an unmanned aerial vehicle, characterized by comprising the following steps: when the unmanned aerial vehicle loses the navigation signal, determining a return target position and a first return path of the unmanned aerial vehicle; controlling the unmanned aerial vehicle to return based on the first return path and the return target position; when the current position of the unmanned aerial vehicle during the return flight is within a first preset range of the return target position, performing safe landing point detection and recording the detected safe landing point for landing of the unmanned aerial vehicle; and landing according to the recorded safe landing point.
- The method of claim 1, further comprising: when the current position of the unmanned aerial vehicle during the return flight is within a second preset range of the return target position, performing safe landing point detection, wherein the first preset range is larger than the second preset range; if no safe landing point is detected, executing the step of landing according to the recorded safe landing point; and if a safe landing point is detected, landing according to the detected safe landing point.
- The method of claim 2, wherein the landing according to the recorded safe landing point comprises: determining a second return path according to the current position of the unmanned aerial vehicle and the recorded safe landing point; flying to the recorded safe landing point based on the second return path; and when the current position of the unmanned aerial vehicle is the recorded safe landing point, controlling the unmanned aerial vehicle to land.
- The method of claim 2, wherein the landing according to the detected safe landing point comprises: continuously detecting whether the detected safe landing point remains safe during the landing process; if the detected safe landing point remains safe throughout, determining that the landing is successful; if the detected safe landing point is found to be unsafe, adjusting the current flight altitude of the unmanned aerial vehicle to a preset flight altitude and executing the step of landing according to the recorded safe landing point; wherein the preset flight altitude is the flight altitude of the unmanned aerial vehicle during the return flight.
- The method of claim 2, further comprising: if no safe landing point is recorded within the first preset range of the return target position and no safe landing point is detected within the second preset range of the return target position, performing safe landing point detection along a preset trajectory within a third preset range of the return target position, wherein the preset trajectory comprises a spiral trajectory or a zigzag trajectory; and when a safe landing point is detected along the preset trajectory, landing according to the detected safe landing point.
- The method of claim 2, further comprising: if no safe landing point is recorded within the first preset range of the return target position and no safe landing point is detected within the second preset range of the return target position, controlling the unmanned aerial vehicle to hover at the return target position.
- The method of claim 1, further comprising: when the unmanned aerial vehicle reaches the return target position, performing safe landing point detection at the return target position; if the return target position is detected to be a safe landing point, landing at the return target position; and if the return target position is detected not to be a safe landing point, executing the step of landing according to the recorded safe landing point.
- The method of claim 1, wherein determining the return target position and the first return path of the unmanned aerial vehicle comprises: selecting any position from at least one preset return position as the return target position of the unmanned aerial vehicle; determining a first return direction according to the return target position; and determining the first return path according to the first return direction and the return target position.
- The method of claim 1, wherein controlling the unmanned aerial vehicle to return based on the first return path and the return target position comprises: controlling the unmanned aerial vehicle to return based on the first return path, the return target position, and position information provided by a visual odometer.
- The method of claim 3, wherein a second return direction corresponding to the second return path is opposite to a first return direction corresponding to the first return path.
- The method of any one of claims 1-10, wherein the safe landing point is a planar, non-water-surface location.
- The method of any one of claims 1-10, wherein performing safe landing point detection comprises: performing safe landing point detection based on a binocular vision sensor.
- The method of any one of claims 1-10, wherein performing safe landing point detection comprises: performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, and performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm.
- The method of any one of claims 1-10, wherein the navigation signal comprises at least one of: a positioning sensor signal or a compass signal.
- The method of claim 13, wherein performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm comprises: determining, from an observation region corresponding to the current position, a region to be detected for plane detection, wherein the region to be detected is smaller than the observation region; determining a two-dimensional projection image corresponding to the region to be detected; converting each pixel point in the two-dimensional projection image into a three-dimensional space point to obtain a three-dimensional space point set corresponding to the two-dimensional projection image; and performing plane detection on the current position according to the three-dimensional space point set.
- The method of claim 14, wherein performing plane detection on the current position according to the three-dimensional space point set comprises: acquiring a standard plane equation; calculating the distance from each three-dimensional space point in the three-dimensional space point set to the plane defined by the standard plane equation, and determining the number of inliers in the three-dimensional space point set according to the distances, wherein an inlier is a three-dimensional space point whose distance is less than or equal to a preset distance threshold; and when the number of inliers is greater than or equal to a preset count threshold, determining that the current position is a plane.
- The method of claim 13, wherein performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm comprises: determining, from the observation region corresponding to the current position, a region to be detected for water surface detection, wherein the region to be detected is smaller than the observation region; determining a two-dimensional projection image corresponding to the region to be detected; and inputting the two-dimensional projection image into a convolutional neural network model and determining whether the current position is a water surface according to the output of the convolutional neural network model.
- A safe landing device applied to an unmanned aerial vehicle, characterized by comprising a memory and a processor; the memory is used for storing program code; the processor calls the program code and, when the program code is executed, is configured to: when the unmanned aerial vehicle loses the navigation signal, determine a return target position and a first return path of the unmanned aerial vehicle; control the unmanned aerial vehicle to return based on the first return path and the return target position; when the current position of the unmanned aerial vehicle during the return flight is within a first preset range of the return target position, perform safe landing point detection and record the detected safe landing point for landing of the unmanned aerial vehicle; and land according to the recorded safe landing point.
- The apparatus of claim 18, wherein the processor is further configured to: when the current position of the unmanned aerial vehicle during the return flight is within a second preset range of the return target position, perform safe landing point detection, wherein the first preset range is larger than the second preset range; if no safe landing point is detected, execute the step of landing according to the recorded safe landing point; and if a safe landing point is detected, land according to the detected safe landing point.
- The apparatus of claim 19, wherein, when landing according to the recorded safe landing point, the processor performs the following operations: determining a second return path according to the current position of the unmanned aerial vehicle and the recorded safe landing point; flying to the recorded safe landing point based on the second return path; and when the current position of the unmanned aerial vehicle is the recorded safe landing point, controlling the unmanned aerial vehicle to land.
- The apparatus of claim 19, wherein, when landing according to the detected safe landing point, the processor performs the following operations: continuously detecting whether the detected safe landing point remains safe during the landing process; if the detected safe landing point remains safe throughout, determining that the landing is successful; if the detected safe landing point is found to be unsafe, adjusting the current flight altitude of the unmanned aerial vehicle to a preset flight altitude and executing the step of landing according to the recorded safe landing point; wherein the preset flight altitude is the flight altitude of the unmanned aerial vehicle during the return flight.
- The apparatus of claim 19, wherein the processor is further configured to: if no safe landing point is recorded within the first preset range of the return target position and no safe landing point is detected within the second preset range of the return target position, perform safe landing point detection along a preset trajectory within a third preset range of the return target position, wherein the preset trajectory comprises a spiral trajectory or a zigzag trajectory; and when a safe landing point is detected along the preset trajectory, land according to the detected safe landing point.
- The apparatus of claim 19, wherein the processor is further configured to: if no safe landing point is recorded within the first preset range of the return target position and no safe landing point is detected within the second preset range of the return target position, control the unmanned aerial vehicle to hover at the return target position.
- The apparatus of claim 18, wherein the processor is further configured to: when the unmanned aerial vehicle reaches the return target position, perform safe landing point detection at the return target position; if the return target position is detected to be a safe landing point, land at the return target position; and if the return target position is detected not to be a safe landing point, execute the step of landing according to the recorded safe landing point.
- The apparatus of claim 18, wherein, when determining the return target position and the first return path of the unmanned aerial vehicle, the processor performs the following operations: selecting any position from at least one preset return position as the return target position of the unmanned aerial vehicle; determining a first return direction according to the return target position; and determining the first return path according to the first return direction and the return target position.
- The apparatus of claim 18, wherein, when controlling the unmanned aerial vehicle to return based on the first return path and the return target position, the processor performs the following operation: controlling the unmanned aerial vehicle to return based on the first return path, the return target position, and position information provided by a visual odometer.
- The apparatus of claim 20, wherein a second return direction corresponding to the second return path is opposite to a first return direction corresponding to the first return path.
- The apparatus of any one of claims 18-27, wherein the safe landing point is a planar, non-water-surface location.
- The apparatus of any one of claims 18-27, wherein, when performing safe landing point detection, the processor performs the following operation: performing safe landing point detection based on a binocular vision sensor.
- The apparatus of any one of claims 18-27, wherein, when performing safe landing point detection, the processor performs the following operations: performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, and performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm.
- The apparatus of any one of claims 18-27, wherein the navigation signal comprises at least one of: a positioning sensor signal or a compass signal.
- The apparatus of claim 30, wherein, when performing plane detection on the current position of the unmanned aerial vehicle according to a preset plane detection algorithm, the processor performs the following operations: determining, from an observation region corresponding to the current position, a region to be detected for plane detection, wherein the region to be detected is smaller than the observation region; determining a two-dimensional projection image corresponding to the region to be detected; converting each pixel point in the two-dimensional projection image into a three-dimensional space point to obtain a three-dimensional space point set corresponding to the two-dimensional projection image; and performing plane detection on the current position according to the three-dimensional space point set.
- The apparatus of claim 31, wherein, when performing plane detection on the current position according to the three-dimensional space point set, the processor performs the following operations: acquiring a standard plane equation; calculating the distance from each three-dimensional space point in the three-dimensional space point set to the plane defined by the standard plane equation, and determining the number of inliers in the three-dimensional space point set according to the distances, wherein an inlier is a three-dimensional space point whose distance is less than or equal to a preset distance threshold; and when the number of inliers is greater than or equal to a preset count threshold, determining that the current position is a plane.
- The apparatus of claim 30, wherein, when performing water surface detection on the current position of the unmanned aerial vehicle according to a preset water surface detection algorithm, the processor performs the following operations: determining, from the observation region corresponding to the current position, a region to be detected for water surface detection, wherein the region to be detected is smaller than the observation region; determining a two-dimensional projection image corresponding to the region to be detected; and inputting the two-dimensional projection image into a convolutional neural network model and determining whether the current position is a water surface according to the output of the convolutional neural network model.
- An unmanned aerial vehicle, characterized by comprising: a body; a power system mounted on the body and used for providing power for the unmanned aerial vehicle; and the safe landing device of any one of claims 18-34.
- The unmanned aerial vehicle of claim 35, further comprising: a sensor mounted on the body, the sensor comprising at least one of: a binocular vision sensor or a visual odometer; wherein the binocular vision sensor is used for safe landing point detection, and the visual odometer is used for providing position information of the unmanned aerial vehicle during the return flight.
- A computer storage medium having computer program instructions stored therein which, when executed by a processor, perform the safe landing method of an unmanned aerial vehicle according to any one of claims 1 to 17.
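Read as control flow, the detect-and-record logic of claims 1 and 2 can be sketched as a small controller. This is an illustrative reading only, not the patented implementation: the class name, the range values, and the detection callback are all assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

# Placeholder range values; the claims leave the actual preset ranges open,
# requiring only that the first range is larger than the second.
RANGE_1 = 100.0  # first preset range (metres)
RANGE_2 = 20.0   # second preset range (metres)

Point = Tuple[float, float]

@dataclass
class SafeLandingController:
    """Illustrative sketch of the detect-and-record flow in claims 1-2."""
    recorded_point: Optional[Point] = None

    def on_position_update(self, dist_to_target: float,
                           detect_safe_point: Callable[[], Optional[Point]]):
        """detect_safe_point() returns a safe landing point, or None."""
        if dist_to_target <= RANGE_2:
            # Second range: prefer a freshly detected point; otherwise
            # fall back to the point recorded within the first range.
            point = detect_safe_point()
            return ("land", point if point is not None else self.recorded_point)
        if dist_to_target <= RANGE_1:
            # First range: keep flying, but record any safe point seen.
            point = detect_safe_point()
            if point is not None:
                self.recorded_point = point
            return ("continue", None)
        return ("continue", None)  # still far from the return target
```

The fallback behaviours of claims 5-7 (spiral/zigzag search within a third range, or hovering at the return target) would slot in when both branches come up empty.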
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410267172.9A CN118092500A (en) | 2018-11-28 | 2018-11-28 | Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/117820 WO2020107248A1 (en) | 2018-11-28 | 2018-11-28 | Method and device for safe landing of unmanned aerial vehicle, unmanned aerial vehicle, and medium |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410267172.9A Division CN118092500A (en) | 2018-11-28 | 2018-11-28 | Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111615677A (en) | 2020-09-01 |
CN111615677B CN111615677B (en) | 2024-04-12 |
Family
ID=70854244
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410267172.9A Pending CN118092500A (en) | 2018-11-28 | 2018-11-28 | Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium |
CN201880066282.1A Active CN111615677B (en) | 2018-11-28 | 2018-11-28 | Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410267172.9A Pending CN118092500A (en) | 2018-11-28 | 2018-11-28 | Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium |
Country Status (2)
Country | Link |
---|---|
CN (2) | CN118092500A (en) |
WO (1) | WO2020107248A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114578855A (en) * | 2022-03-03 | 2022-06-03 | 北京新科汇智科技发展有限公司 | Unmanned aerial vehicle standby landing method and system |
CN114625164A (en) * | 2022-02-22 | 2022-06-14 | 上海赫千电子科技有限公司 | Unmanned aerial vehicle intelligent return method based on unmanned aerial vehicle mother vehicle |
CN115033019A (en) * | 2022-06-01 | 2022-09-09 | 天津飞眼无人机科技有限公司 | Unmanned aerial vehicle obstacle avoidance method |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022094961A1 (en) * | 2020-11-06 | 2022-05-12 | 深圳市大疆创新科技有限公司 | Non-human-controlled robot control method and apparatus, and non-human-controlled robot |
CN114415700B (en) * | 2021-12-28 | 2024-09-17 | 西北工业大学 | Unmanned aerial vehicle autonomous vision landing method based on depth hybrid camera array |
CN115683213A (en) * | 2022-10-14 | 2023-02-03 | 扬州市职业大学(扬州开放大学) | Unmanned aerial vehicle flight landing positioning and evaluating system based on Internet of things |
CN116203600B (en) * | 2023-02-22 | 2024-08-02 | 上海长弓中急管科技有限公司 | Method for tracking motion trail with power after communication signal of unmanned aerial vehicle is lost |
CN118129788B (en) * | 2024-05-07 | 2024-07-16 | 深圳市爱都科技有限公司 | Track return method, device and equipment for wearable equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104049641A (en) * | 2014-05-29 | 2014-09-17 | 深圳市大疆创新科技有限公司 | Automatic landing method and device and air vehicle |
CN105867423A (en) * | 2016-06-08 | 2016-08-17 | 杨珊珊 | Course reversal method and course reversal system of unmanned aerial vehicle and unmanned aerial vehicle |
CN106527481A (en) * | 2016-12-06 | 2017-03-22 | 重庆零度智控智能科技有限公司 | Unmanned aerial vehicle flight control method, device and unmanned aerial vehicle |
CN106927059A (en) * | 2017-04-01 | 2017-07-07 | 成都通甲优博科技有限责任公司 | A kind of unmanned plane landing method and device based on monocular vision |
CN107943090A (en) * | 2017-12-25 | 2018-04-20 | 广州亿航智能技术有限公司 | The landing method and system of a kind of unmanned plane |
CN108474658A (en) * | 2017-06-16 | 2018-08-31 | 深圳市大疆创新科技有限公司 | Ground Morphology observation method and system, unmanned plane landing method and unmanned plane |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101329583B1 (en) * | 2013-07-09 | 2013-11-14 | 주식회사 두레텍 | Air observations using the rotor structure construction method and system for terrain data |
CN104881039A (en) * | 2015-05-12 | 2015-09-02 | 零度智控(北京)智能科技有限公司 | Method and system for returning of unmanned plane |
DE102015013104A1 (en) * | 2015-08-22 | 2017-02-23 | Dania Lieselotte Reuter | Method for target approach control of unmanned aerial vehicles, in particular delivery docks |
CN107291099A (en) * | 2017-07-06 | 2017-10-24 | 杨顺伟 | Unmanned plane makes a return voyage method and device |
CN107479082A (en) * | 2017-09-19 | 2017-12-15 | 广东容祺智能科技有限公司 | A kind of unmanned plane makes a return voyage method without GPS |
WO2019113727A1 (en) * | 2017-12-11 | 2019-06-20 | 深圳市道通智能航空技术有限公司 | Unmanned aerial vehicle return method and device, storage medium, and unmanned aerial vehicle |
- 2018
- 2018-11-28 CN CN202410267172.9A patent/CN118092500A/en active Pending
- 2018-11-28 CN CN201880066282.1A patent/CN111615677B/en active Active
- 2018-11-28 WO PCT/CN2018/117820 patent/WO2020107248A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN111615677B (en) | 2024-04-12 |
CN118092500A (en) | 2024-05-28 |
WO2020107248A1 (en) | 2020-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111615677B (en) | Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium | |
US11237572B2 (en) | Collision avoidance system, depth imaging system, vehicle, map generator and methods thereof | |
Yagfarov et al. | Map comparison of lidar-based 2d slam algorithms using precise ground truth | |
Barry et al. | High‐speed autonomous obstacle avoidance with pushbroom stereo | |
EP3876070B1 (en) | Method and device for planning path of unmanned aerial vehicle, and unmanned aerial vehicle | |
US10618673B2 (en) | Systems and methods for dynamic planning and operation of autonomous systems using image observation and information theory | |
Odelga et al. | Obstacle detection, tracking and avoidance for a teleoperated UAV | |
US20220244746A1 (en) | Method of controlling an aircraft, flight control device for an aircraft, and aircraft with such flight control device | |
CN110221625B (en) | Autonomous landing guiding method for precise position of unmanned aerial vehicle | |
US11922819B2 (en) | System and method for autonomously landing a vertical take-off and landing (VTOL) aircraft | |
JP2015006874A (en) | Systems and methods for autonomous landing using three dimensional evidence grid | |
EP3210091B1 (en) | Optimal safe landing area determination | |
US10796148B2 (en) | Aircraft landing protection method and apparatus, and aircraft | |
US20210200246A1 (en) | Method and system for determining the position of a moving object | |
EP3213158B1 (en) | Space partitioning for motion planning | |
Ortiz et al. | Vessel inspection: A micro-aerial vehicle-based approach | |
Masselli et al. | A novel marker based tracking method for position and attitude control of MAVs | |
Brockers et al. | Autonomous safe landing site detection for a future mars science helicopter | |
US20230095700A1 (en) | Vehicle flight control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle | |
Saripalli et al. | An experimental study of the autonomous helicopter landing problem | |
US20210199798A1 (en) | Continuous wave radar terrain prediction method, device, system, and unmanned aerial vehicle | |
Silva et al. | Saliency-based cooperative landing of a multirotor aerial vehicle on an autonomous surface vehicle | |
CN112597946A (en) | Obstacle representation method and device, electronic equipment and readable storage medium | |
Olivares-Mendez et al. | Autonomous landing of an unmanned aerial vehicle using image-based fuzzy control | |
Spasojevic et al. | Robust localization of aerial vehicles via active control of identical ground vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||