CN113031633A - Landing method and device of unmanned aerial vehicle, unmanned aerial vehicle and medium - Google Patents
Info
- Publication number
- CN113031633A (application number CN202110292860.7A)
- Authority
- CN
- China
- Prior art keywords
- landing
- obstacle avoidance
- aerial vehicle
- unmanned aerial
- landing point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0653—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
- G05D1/0676—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The embodiments of the invention disclose a landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle and a medium. The method comprises: when the unmanned aerial vehicle is detected to have entered a forced landing mode, determining a target obstacle avoidance landing point within an area traversed during the normal or forced-landing flight of the unmanned aerial vehicle; and controlling the unmanned aerial vehicle to land at the target obstacle avoidance landing point. The target obstacle avoidance landing point is determined from terrain images acquired during the flight of the unmanned aerial vehicle. Because the emergency forced landing is carried out at a determined target obstacle avoidance landing point, the unmanned aerial vehicle can avoid obstacles and land autonomously, which improves its landing safety during a forced landing.
Description
Technical Field
The embodiments of the invention relate to unmanned aerial vehicle technology, and in particular to a landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle and a medium.
Background
Unmanned aerial vehicle products currently on the market can only trigger an emergency return-to-home command when the link is lost or the communication link fails during task execution.
However, whether the remaining endurance of the unmanned aerial vehicle is sufficient for a safe return is uncertain, and the risk is considerable. For example, when an existing fixed-wing unmanned aerial vehicle loses its link at long range, it can only hover automatically; it cannot land or make an emergency forced landing, and it has no capability to autonomously identify and avoid obstacles, so it frequently collides with an obstacle on the return journey and crashes. Nor can it distinguish a suitable nearby landing site, leading to a hard landing impact or secondary accidents. How to achieve autonomous obstacle avoidance landing and improve the landing safety of the unmanned aerial vehicle when the link is lost has therefore become an urgent technical problem to be solved.
Disclosure of Invention
The embodiments of the invention provide a landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle and a medium, which enable the unmanned aerial vehicle to avoid obstacles and land autonomously, improving its landing safety during a forced landing.
In a first aspect, an embodiment of the present invention provides a landing method for an unmanned aerial vehicle, including:
when the unmanned aerial vehicle is detected to enter a forced landing mode, determining a target obstacle avoidance landing point in an area traversed during the normal or forced-landing flight of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to land to the target obstacle avoidance landing point;
the target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle.
In a second aspect, an embodiment of the present invention further provides a landing apparatus for an unmanned aerial vehicle, including:
a landing point determining module, configured to determine a target obstacle avoidance landing point in an area traversed during the normal or forced-landing flight of the unmanned aerial vehicle when the unmanned aerial vehicle is detected to enter a forced landing mode;
the landing control module is used for controlling the unmanned aerial vehicle to land to the target obstacle avoidance landing point;
the target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle.
In a third aspect, an embodiment of the present invention further provides an unmanned aerial vehicle, where the unmanned aerial vehicle includes:
one or more processors;
a memory for storing one or more programs,
when executed by the one or more processors, the one or more programs cause the one or more processors to implement a method for landing a drone as provided by any embodiment of the invention.
In a fourth aspect, embodiments of the present invention further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a landing method for a drone, as provided in any of the embodiments of the present invention.
According to the embodiments of the invention, when the unmanned aerial vehicle is detected to enter the forced landing mode, a target obstacle avoidance landing point is determined in an area traversed during the normal or forced-landing flight of the unmanned aerial vehicle, and the unmanned aerial vehicle is controlled to land at that point. Because the emergency forced landing is carried out at the determined target obstacle avoidance landing point, the unmanned aerial vehicle can avoid obstacles and land autonomously, which improves its landing safety during a forced landing.
Drawings
Fig. 1 is a flowchart of a landing method of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a flowchart of another method for landing an unmanned aerial vehicle according to a second embodiment of the present invention;
fig. 3 is a flowchart illustrating that an unmanned aerial vehicle according to a second embodiment of the present invention determines and stores an obstacle avoidance landing point according to a terrain image acquired in real time during a normal flight process;
FIG. 4 is a schematic diagram of an emergency-landing zigzag route according to a second embodiment of the present invention;
fig. 5 is a flowchart of a specific method for processing data of an aerial survey topographic image by an unmanned aerial vehicle according to a second embodiment of the present invention;
fig. 6 is a flowchart of a specific method for controlling the unmanned aerial vehicle to land to the target obstacle avoidance and landing point according to the second embodiment of the present invention;
fig. 7 is a schematic view of takeoff and landing of an unmanned aerial vehicle according to a second embodiment of the present invention;
fig. 8 is a flowchart of a landing method of a further unmanned aerial vehicle according to a third embodiment of the present invention;
fig. 9 is a flowchart of a specific method for determining whether there are obstacle-avoidance and landing points meeting a return flight condition according to the stored types of the obstacle-avoidance and landing points, and acquiring a target obstacle-avoidance and landing point from the obstacle-avoidance and landing points meeting the return flight condition, according to the third embodiment of the present invention;
fig. 10 is a flowchart of a landing method of a further drone according to the fourth embodiment of the present invention;
fig. 11 is a schematic structural diagram of a landing device of an unmanned aerial vehicle according to a fifth embodiment of the present invention;
fig. 12 is a schematic structural diagram of an unmanned aerial vehicle according to a sixth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a landing method of an unmanned aerial vehicle according to an embodiment of the present invention. The present embodiment is applicable to situations in which the unmanned aerial vehicle must avoid obstacles while making a forced landing. The method may be executed by a landing device of the unmanned aerial vehicle, and the device can be configured in the unmanned aerial vehicle. As shown in fig. 1, the method includes:
and S110, when the unmanned aerial vehicle is detected to enter the forced landing mode, determining a target obstacle avoidance landing point in a passing area in the normal or forced landing flight process of the unmanned aerial vehicle.
The forced landing mode instructs the unmanned aerial vehicle to carry out a forced landing. For example, when the unmanned aerial vehicle loses its link, develops a fault, or encounters a similar situation, it can enter the forced landing mode and land in an emergency. In a specific embodiment, during the flight of the unmanned aerial vehicle, the connection state between the unmanned aerial vehicle and the controller is detected in real time, and when the unmanned aerial vehicle is determined to be disconnected from the controller, it enters the forced landing mode. The controller may be a device used to communicate with the unmanned aerial vehicle. When communication between the unmanned aerial vehicle and the controller is normal, their connection state is normal: the controller can send control commands to the unmanned aerial vehicle through the communication link, and the unmanned aerial vehicle can transmit data back to the controller over the same link. When the unmanned aerial vehicle must make a forced landing or the communication is broken, the connection state is abnormal, and the flight control end of the unmanned aerial vehicle can trigger it to enter the intelligent forced landing mode.
The target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle. For example, the unmanned aerial vehicle can collect a terrain image in a normal flight process, determine an obstacle avoidance and landing point based on the terrain image, and store position information of the obstacle avoidance and landing point, so as to determine a target obstacle avoidance and landing point in the obstacle avoidance and landing point meeting a return flight condition after the unmanned aerial vehicle is forced to land. Or if the obstacle avoidance landing point is detected to not meet the return flight condition after the forced landing of the unmanned aerial vehicle, the terrain image is collected again after the forced landing, and the target obstacle avoidance landing point is determined based on the newly collected terrain image. Or, if the unmanned aerial vehicle does not determine the obstacle avoidance and landing point in the normal flight process, the unmanned aerial vehicle can directly acquire a terrain image after forced landing, and determine the target obstacle avoidance and landing point based on the terrain image. Or other target obstacle avoidance and landing point determination modes. The terrain image can be a terrain photograph or a terrain video. The terrain video can be regarded as being formed by combining a plurality of terrain photos according to a certain sequence.
Illustratively, the unmanned aerial vehicle can collect a terrain image in a flight process, generate point cloud data corresponding to the terrain image, identify an obstacle according to the point cloud data, acquire obstacle description information of the obstacle, and determine an obstacle avoidance and drop point or a target obstacle avoidance and drop point according to the point cloud data and the obstacle description information of the obstacle. And the obstacle avoidance and landing point and the target obstacle avoidance and landing point are both landing points avoiding the obstacle.
And S120, controlling the unmanned aerial vehicle to land to a target obstacle avoidance landing point.
Illustratively, after the target obstacle avoidance landing point is determined, the unmanned aerial vehicle is controlled to fly above the target obstacle avoidance landing point and then to descend in a hover; once a safe path has been planned, the unmanned aerial vehicle flies to the target obstacle avoidance landing point along that path. Alternatively, if no safe path has been generated by the time the unmanned aerial vehicle has descended to a preset height, any path from the current position to the target obstacle avoidance landing point can be regarded as safe, and the unmanned aerial vehicle can be switched from fixed-wing mode to rotor mode at the current position and landed at the target obstacle avoidance landing point. Alternatively, a safe path can be generated directly from the obstacle description information of the obstacles between the current position of the unmanned aerial vehicle and the target obstacle avoidance landing point, and the unmanned aerial vehicle controlled to land at the target obstacle avoidance landing point along that path.
According to this embodiment, when the unmanned aerial vehicle is detected to enter the forced landing mode, a target obstacle avoidance landing point is determined in an area traversed during the normal or forced-landing flight of the unmanned aerial vehicle, and the unmanned aerial vehicle is controlled to land at that point. Because the emergency forced landing is carried out at the determined target obstacle avoidance landing point, the unmanned aerial vehicle can avoid obstacles and land autonomously, which improves its landing safety during a forced landing.
Example two
Fig. 2 is a flowchart of another landing method of an unmanned aerial vehicle according to a second embodiment of the present invention, where the second embodiment is optimized based on the foregoing embodiment, and as shown in fig. 2, the method includes:
and S210, detecting the operation mode of the unmanned aerial vehicle in real time.
The unmanned aerial vehicle may comprise at least a camera end, a vision CPU (central processing unit) processing center, and a flight control end. The camera end is used to collect terrain images in real time. The vision CPU processing center is used to process data. The flight control end is used to control the flight of the unmanned aerial vehicle.
The operating modes may include, but are not limited to, a normal flight mode, a forced landing mode, a mission execution mode, a takeoff mode, and the like. The normal flight mode is used for instructing the unmanned aerial vehicle to fly normally. The forced landing mode is used for indicating the unmanned aerial vehicle to carry out emergency forced landing. And the execution route task mode is used for indicating the unmanned aerial vehicle to execute the route task issued by the controller. And the takeoff mode is used for indicating the unmanned aerial vehicle to take off according to the control instruction of the controller.
Alternatively, step S210 may be implemented by: and detecting the running state of the unmanned aerial vehicle in real time to determine the running mode of the unmanned aerial vehicle. For example, the operation state of the drone is detected in real time, and when the drone is detected to be in an abnormal communication state (for example, the drone is out of connection), or when the drone is detected to be in an abnormal hardware state (for example, a battery is abnormal, a radar is abnormal, or a motor is abnormal), or when the drone is detected to be in a severe environment flight state (for example, the drone encounters magnetic field interference exceeding a preset threshold value or the drone is in a flight abnormal state due to severe weather), the drone can be controlled to enter the forced landing mode.
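The detection logic described above (loss of link, abnormal hardware state, or severe-environment flight) can be sketched as a simple status check. This is an illustrative sketch only: the field names and the magnetic-interference threshold are assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class DroneStatus:
    link_ok: bool                 # communication link with the controller
    battery_ok: bool              # hardware states
    radar_ok: bool
    motor_ok: bool
    magnetic_interference: float  # measured interference, arbitrary units

MAG_THRESHOLD = 1.0  # assumed preset threshold for magnetic interference

def should_enter_forced_landing(s: DroneStatus) -> bool:
    """Return True when any abnormal state described in the text is detected."""
    if not s.link_ok:                                     # abnormal communication
        return True
    if not (s.battery_ok and s.radar_ok and s.motor_ok):  # abnormal hardware
        return True
    return s.magnetic_interference > MAG_THRESHOLD        # severe environment
```

Each check maps to one of the triggers named in the text; a real flight controller would debounce these signals over time rather than react to a single sample.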
Step S220, when the fact that the unmanned aerial vehicle enters the forced landing mode is detected, whether an obstacle avoidance landing point meeting a return flight condition is stored or not is judged, if yes, step S230 is executed, and if not, step S240 is executed.
The return flight condition can be determined based on the type of the obstacle avoidance landing point, the current cruising electric quantity of the unmanned aerial vehicle and/or the distance value between the obstacle avoidance landing point and the current positioning point of the unmanned aerial vehicle.
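A minimal sketch of such a return-flight check, combining the three criteria named above (landing-point type, remaining cruising battery, distance to the current positioning point). The linear energy-per-metre model, the dictionary field names, and the safety margin are illustrative assumptions, not part of the patent.

```python
import math

def meets_return_condition(point, drone, safety_margin=1.2):
    """Return True if a stored obstacle avoidance landing point qualifies.

    `point` has assumed keys "pos" and "type"; `drone` has assumed keys
    "pos", "energy_per_m" (Wh per metre of flight) and "battery_wh".
    """
    distance = math.dist(drone["pos"], point["pos"])
    energy_needed = distance * drone["energy_per_m"] * safety_margin
    return point["type"] != "unsafe" and energy_needed <= drone["battery_wh"]
```

For example, with 10 Wh remaining and 0.01 Wh/m consumption, a point 500 m away qualifies (6 Wh needed at a 1.2 margin), while one 5 km away does not.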
The obstacle avoidance and landing point is determined and stored by the unmanned aerial vehicle according to the real-time collected terrain image in the normal flight process.
Optionally, fig. 3 is a flowchart illustrating how the unmanned aerial vehicle determines and stores obstacle avoidance landing points from terrain images acquired in real time during normal flight, according to the second embodiment of the present invention. As shown in fig. 3, this process may be implemented through steps S221 to S222:
and S221, controlling the unmanned aerial vehicle to acquire the terrain image in the normal flight process in real time in the normal flight process of the unmanned aerial vehicle.
Exemplarily, in the normal flight process of the unmanned aerial vehicle, the unmanned aerial vehicle flies along a normal flight route, a camera end of the unmanned aerial vehicle collects terrain images in the flight process in real time, and transmits the terrain images collected in real time to a vision CPU processing center of the unmanned aerial vehicle for processing.
Step S222, determining at least one obstacle avoidance and landing point according to the terrain image, and storing each determined obstacle avoidance and landing point.
Illustratively, a vision CPU processing center of the unmanned aerial vehicle processes the terrain image, determines at least one obstacle avoidance and landing point, and stores position information and the like of each determined obstacle avoidance and landing point.
With this arrangement, selectable obstacle avoidance landing points are determined and stored during normal flight, so that when a forced landing occurs later the target obstacle avoidance landing point is preferentially chosen from the stored points. This speeds up determination of the target obstacle avoidance landing point and therefore the forced landing itself.
And step S230, obtaining target obstacle avoidance and landing points from the obstacle avoidance and landing points meeting the return flight conditions, and continuing to execute the step S270.
Specifically, if any of the stored obstacle avoidance landing points satisfy the return flight condition, the target obstacle avoidance landing point is acquired from those points. For example, the obstacle avoidance landing point closest to the current positioning point of the unmanned aerial vehicle is selected as the target. Alternatively, the obstacle avoidance landing point with the highest flatness or the largest area is selected, so as to obtain the optimal target obstacle avoidance landing point.
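The selection step can be sketched as a small helper over candidate points that already satisfy the return flight condition. The candidate representation (dicts with "pos", "flatness" and "area" keys) is an assumption for illustration.

```python
import math

def pick_target_landing_point(candidates, drone_pos, criterion="nearest"):
    """Select a target obstacle avoidance landing point among candidates.

    criterion: "nearest" (closest to the current positioning point),
    "flattest" (highest flatness), or "largest" (largest occupied area).
    """
    if not candidates:
        return None
    if criterion == "nearest":
        return min(candidates, key=lambda c: math.dist(drone_pos, c["pos"]))
    if criterion == "flattest":
        return max(candidates, key=lambda c: c["flatness"])
    if criterion == "largest":
        return max(candidates, key=lambda c: c["area"])
    raise ValueError(f"unknown criterion: {criterion}")
```

In practice the criteria could also be combined into a weighted score; the text leaves the exact policy open.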
And S240, controlling the unmanned aerial vehicle to collect terrain images in real time in the forced landing flight process of the unmanned aerial vehicle.
Exemplarily, during the forced-landing flight of the unmanned aerial vehicle, terrain images are collected in real time through its camera end. The camera end may be the payload camera equipment the unmanned aerial vehicle carries while executing its task. For example, it may include an orthographic visible-light camera, a pan-tilt visible-light camera, an APS-C (Advanced Photo System type-C) camera, a full-frame camera and/or a 5-lens camera. The payload camera devices carried by the unmanned aerial vehicle in the embodiments of the invention are not limited to the above types.
Optionally, in step S240, controlling the unmanned aerial vehicle to collect terrain images in real time may include:
generating a preset landing point identification route according to the current positioning point of the unmanned aerial vehicle and the route type; and controlling the unmanned aerial vehicle to fly along a preset landing point identification air line, and acquiring a terrain image in real time in the flying process.
The route type can be used for high-precision three-dimensional aerial surveying. For example, route types may include a spiral route, a zigzag (boustrophedon) route, a grid route, or a 5-sided route. The route types used by the embodiments of the invention are not limited to any particular pattern for unmanned aerial vehicle geographic-information surveying. The preset landing point identification route is the route flown to identify an obstacle avoidance landing point when the unmanned aerial vehicle needs to land.
Illustratively, the route type is a zigzag route. The route type is preset and stored in the flight control end in advance, because in an emergency the unmanned aerial vehicle needs to quickly acquire three-dimensional data of the environment around its current positioning point. A preset zigzag route is then generated from the current positioning point of the unmanned aerial vehicle and the route type. For a given mapping area, the zigzag route has a short flight path and a short flight time; here the mapping area of the route is set to 500m × 500m, with forward and side overlap of 40% and 60% respectively. Fig. 4 is a schematic diagram of the emergency-landing zigzag route according to the second embodiment of the present invention. As shown in fig. 4, the horizontal accuracy of the three-dimensional point cloud in the mapping area is less than or equal to 5cm and the elevation accuracy is less than or equal to 10cm, which meets the accuracy requirements for fixed-wing unmanned aerial vehicle obstacle and landing-site identification. The current positioning point of the unmanned aerial vehicle can be located at the center or at the starting point of the zigzag route. The long legs of the zigzag route are aligned with the heading of the unmanned aerial vehicle, and the starting point of the route lies to the right of the nose direction. For example, the current positioning point of the unmanned aerial vehicle may be located at point viii with the nose oriented downwards.
As shown in fig. 4, the unmanned aerial vehicle can start the aerial photography task at the route entry point and finish it at the route end point, thereby completing the acquisition of the terrain image. As the on-board computing power of the unmanned aerial vehicle increases in the future, the three-dimensional reconstruction capability will be enhanced and the route range can be enlarged.
After the terrain images are collected, the unmanned aerial vehicle can connect to a network or a ground real-time kinematic (RTK) base station, and the positioning can be resolved through post-processed kinematic (PPK) processing. The aerial terrain images may be stored on the SD (Secure Digital) card of the unmanned aerial vehicle's payload camera device.
In this embodiment, a preset landing point identification route is generated from the current positioning point and the route type of the unmanned aerial vehicle, and the unmanned aerial vehicle is controlled to collect terrain images in real time along that route, which improves the speed, efficiency and completeness of terrain image acquisition.
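The zigzag route described above can be sketched as waypoint generation over a square survey area. The 500m × 500m area and 60% side overlap come from the text; the across-track image footprint is an assumed parameter used to derive the spacing between adjacent long legs.

```python
def zigzag_waypoints(origin, size=500.0, footprint=100.0, side_overlap=0.6):
    """Generate a boustrophedon ("zigzag") survey route over a square area.

    origin: (x, y) of the area's corner; size: side length in metres;
    footprint: assumed across-track image footprint in metres.
    Returns the list of (x, y) leg endpoints, alternating direction.
    """
    ox, oy = origin
    spacing = footprint * (1.0 - side_overlap)  # distance between long legs
    waypoints, y, leg = [], 0.0, 0
    while y <= size:
        x_start, x_end = (0.0, size) if leg % 2 == 0 else (size, 0.0)
        waypoints.append((ox + x_start, oy + y))  # enter the leg
        waypoints.append((ox + x_end, oy + y))    # exit the leg
        y += spacing
        leg += 1
    return waypoints
```

With the defaults this yields 13 legs spaced 40 m apart, flown in alternating directions so the aircraft never dead-heads back to the same side.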
And S250, determining at least one obstacle avoidance and landing point according to the terrain image.
During the aerial photography task, starting from the second terrain image captured, the vision CPU processing center of the unmanned aerial vehicle enters an image-stitching process, so that the terrain image data are processed while the task is being executed; the processing time for a 500m × 500m area is less than or equal to 15min, close to the time needed to fly the route task.
Optionally, step S250 may include:
generating point cloud data and a DSM (Digital Surface Model) from the terrain images; preprocessing the point cloud data; segmenting the preprocessed point cloud data with a preset segmentation algorithm to obtain obstacle point cloud data; clustering the obstacle point cloud data to determine the obstacles in it; marking the obstacle description information of each obstacle according to the DSM; and generating a DOM (Digital Orthophoto Map) and determining at least one obstacle avoidance landing point in the DOM according to the obstacle description information.
The preprocessing operation may include at least one of: a range-clipping operation on the point cloud, a point cloud denoising operation, a point cloud grouping operation, and the like. The preset segmentation algorithm may be a conventional plane-fitting ground point cloud segmentation algorithm or a 3D point cloud segmentation algorithm based on DL (deep learning) methods, among others. The clustering may use a density clustering algorithm or a hierarchical clustering algorithm, among others. The obstacle description information includes, but is not limited to, position, size, shape and/or category. Once an obstacle has been detected together with its description information, the result is returned.
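As a rough illustration of the clustering step, here is a minimal density-based clustering routine in the spirit of DBSCAN (the text names density clustering as one option). The parameters and the O(n²) neighbour search are simplifications for readability, not an efficient or faithful implementation.

```python
from collections import deque

def cluster_points(points, eps=1.0, min_pts=3):
    """Density-cluster 2D/3D points; returns a cluster label per point (-1 = noise).

    A point is a "core" point if it has at least min_pts points (itself
    included) within radius eps; clusters grow outward from core points.
    """
    n = len(points)
    labels = [-1] * n

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def neighbours(i):
        return [j for j in range(n)
                if j != i and dist2(points[i], points[j]) <= eps * eps]

    cluster = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        nbrs = neighbours(i)
        if len(nbrs) + 1 < min_pts:  # not a core point (for now)
            continue
        labels[i] = cluster
        queue = deque(nbrs)
        while queue:                 # breadth-first cluster expansion
            j = queue.popleft()
            if labels[j] == -1:
                labels[j] = cluster
                jn = neighbours(j)
                if len(jn) + 1 >= min_pts:
                    queue.extend(k for k in jn if labels[k] == -1)
        cluster += 1
    return labels
```

Applied to obstacle point cloud data, each resulting cluster would correspond to one candidate obstacle, whose extent can then be annotated from the DSM.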
Further, fig. 5 is a flowchart of a specific method for processing data of the aerial survey topographic image by the unmanned aerial vehicle according to the second embodiment of the present invention. As shown in fig. 5, generating point cloud data and DSM from a terrain image may include:
performing three-dimensional reconstruction on the terrain images: generating sparse point cloud data and the pose of each frame of terrain image using an SFM (Structure-from-Motion) method; generating dense point cloud data using an MVS (Multi-View Stereo) method based on the sparse point cloud data and the per-frame poses; the sparse point cloud data and/or the dense point cloud data form the point cloud data; and generating the DSM corresponding to the point cloud data based on an interpolation algorithm.
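The final interpolation step (point cloud → DSM grid) can be sketched with inverse-distance weighting. This is a toy stand-in under stated assumptions: the text does not specify the interpolation algorithm, and production pipelines typically use TIN or kriging interpolation instead.

```python
def idw_dsm(points, nx, ny, cell=1.0, power=2.0):
    """Rasterise (x, y, z) points into an ny-by-nx DSM grid via
    inverse-distance weighting of all points (brute force)."""
    dsm = [[0.0] * nx for _ in range(ny)]
    for gy in range(ny):
        for gx in range(nx):
            cx, cy = (gx + 0.5) * cell, (gy + 0.5) * cell  # cell centre
            num = den = 0.0
            for x, y, z in points:
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                if d2 < 1e-12:          # sample sits exactly on the centre
                    num, den = z, 1.0
                    break
                w = 1.0 / d2 ** (power / 2)  # weight falls off with distance
                num += w * z
                den += w
            dsm[gy][gx] = num / den
    return dsm
```

A single cell equidistant from two samples at heights 0 and 10 interpolates to 5, as expected for equal weights.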
Illustratively, performing three-dimensional reconstruction processing on the terrain image and generating sparse point cloud data and the posture of each frame of terrain image by using the SFM method can be implemented through the following steps: serializing the terrain images according to the GPS position corresponding to each frame of terrain image, and recording the adjacent frames of each frame of terrain image; extracting feature points of each frame of terrain image, and matching the feature points across frames to obtain feature point matching pairs; the feature point extraction method includes, but is not limited to, SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features), ORB (Oriented FAST and Rotated BRIEF), and the like, and the feature point matching method includes, but is not limited to, ANN L2 (approximate nearest neighbor matching under the L2 distance), Brute Force L2, Cascade Hashing L2, and the like; for each terrain image pair, solving the posture of each frame of terrain image through iterative BA (Bundle Adjustment) optimization according to the feature point matching pairs contained in the terrain image pair and the GPS (Global Positioning System) position of each frame of terrain image; and calculating sparse point cloud data of the whole scene by triangulation according to the posture of each frame of terrain image and each feature point matching pair.
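The triangulation step at the end can be illustrated with a standard DLT (direct linear transform) solve for a single feature point match. The projection matrices and coordinates below are a toy identity-camera setup for illustration only, not the disclosure's camera model.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """DLT triangulation of one feature-point match.
    P1, P2: 3x4 camera projection matrices of two terrain-image frames.
    x1, x2: the matched point's (u, v) coordinates in each frame.
    Returns the 3-D scene point as a length-3 array."""
    # Each observation contributes two homogeneous linear constraints.
    A = np.stack([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    # The scene point is the null vector of A (last right singular vector).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Running this over every matching pair, with poses from bundle adjustment, yields the sparse point cloud of the scene.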
Illustratively, generating dense point cloud data by using the MVS method based on the sparse point cloud data and the posture of each frame of terrain image can be implemented as follows: for every two adjacent frames of terrain images, performing stereo rectification according to the postures of the current frame and the next frame so that the two adjacent frames become coplanar and horizontally aligned; calculating a depth map of each frame of terrain image by using a binocular stereo matching algorithm, where the binocular stereo matching algorithm comprises a binocular local, semi-global, or global stereo matching algorithm; and fusing the depth maps of the frames to eliminate abnormal points and redundant points, obtaining a dense point cloud of the whole scene.
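The fusion step can be sketched as a per-pixel consistency vote: a depth is kept only where a majority of frames agree around the median, which rejects abnormal points and collapses redundant ones. The relative tolerance and majority rule are illustrative assumptions, not from the disclosure.

```python
import numpy as np

def fuse_depths(depth_maps, tol=0.1):
    """Fuse per-frame depth maps (same H x W grid for simplicity).
    A pixel survives only if a majority of frames lie within a relative
    tolerance of the per-pixel median; rejected pixels become NaN."""
    d = np.stack(depth_maps)                 # shape (n_frames, H, W)
    med = np.median(d, axis=0)
    agree = np.abs(d - med) <= tol * med     # per-frame consistency mask
    valid = agree.sum(axis=0) >= (d.shape[0] // 2 + 1)
    return np.where(valid, med, np.nan)
```

In a full MVS pipeline each depth map would first be reprojected into a common frame before voting; that registration step is omitted here.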
Illustratively, a DSM corresponding to the point cloud data is generated based on an interpolation algorithm. The point cloud data may comprise sparse point cloud data and/or dense point cloud data. The interpolation algorithm may include an IDW (Inverse Distance Weighting) interpolation algorithm, a nearest-neighbor interpolation algorithm, a Delaunay triangulation interpolation algorithm, or the like.
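A minimal IDW sketch for one DSM grid node follows; the power parameter and the exact-hit shortcut are conventional choices, assumed here rather than taken from the disclosure.

```python
import math

def idw(samples, gx, gy, power=2.0, eps=1e-9):
    """Interpolate a DSM height at grid node (gx, gy) from scattered
    point-cloud samples (x, y, z) by inverse distance weighting."""
    num = den = 0.0
    for x, y, z in samples:
        d = math.hypot(x - gx, y - gy)
        if d < eps:              # node coincides with a sample point
            return z
        w = 1.0 / d ** power     # closer samples weigh more
        num += w * z
        den += w
    return num / den
```

Evaluating this at every node of a regular grid produces the DSM raster.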
Illustratively, generating the DOM may be implemented by: for each grid point on the DSM, calculating an included angle between a connecting line between the three-dimensional coordinates of the grid point and the three-dimensional coordinates of each frame of topographic image and a ground perpendicular line, and taking a target topographic image with the minimum included angle as a projection image of the grid point; for each grid point on the DSM, reversely projecting the three-dimensional coordinates of the grid point to a projection image corresponding to the grid point, and acquiring a target pixel value of the grid point on the projection image; and generating the DOM according to the target pixel values corresponding to the grid points on the DSM.
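The projection-image selection in the first step can be sketched as follows: for a grid point, choose the frame whose view ray makes the smallest angle with the ground perpendicular (a z-up axis is assumed here for illustration).

```python
import math

def pick_projection_image(grid_pt, cam_centers):
    """For one DSM grid point, return the index of the terrain image
    whose camera centre gives the smallest angle between the view ray
    (grid point to camera) and the vertical."""
    gx, gy, gz = grid_pt

    def angle(c):
        vx, vy, vz = c[0] - gx, c[1] - gy, c[2] - gz
        return math.acos(vz / math.sqrt(vx * vx + vy * vy + vz * vz))

    return min(range(len(cam_centers)), key=lambda i: angle(cam_centers[i]))
```

The grid point is then back-projected into the chosen image to sample its target pixel value, and the sampled values over all grid points form the DOM.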
Exemplarily, determining at least one obstacle avoidance landing point in the DOM according to the obstacle description information may be implemented as follows: acquiring preset landing site conditions of the unmanned aerial vehicle, where the preset landing site conditions indicate the requirements of the unmanned aerial vehicle on the area, flatness, and/or site attribute of a site for landing; determining, in the DOM, a landing area meeting the preset landing site conditions; and determining position information of at least one obstacle avoidance landing point according to the obstacle description information and the DOM within the landing area. The range of the landing area can be determined or preset according to actual conditions; for example, a radius of 50 m may be preset.
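The landing-site check can be sketched as a simple predicate over a candidate region; all field names, thresholds, and the banned-attribute list are illustrative assumptions, not values from this disclosure.

```python
def meets_landing_conditions(region, min_area=80.0, max_roughness=0.3,
                             banned=frozenset({"crowd", "road", "power_line"})):
    """Preset landing-site condition sketch: area, flatness (height
    standard deviation as a roughness proxy), and site attribute."""
    return (region["area_m2"] >= min_area
            and region["height_std_m"] <= max_roughness
            and region["attribute"] not in banned)
```

Regions of the DOM passing this predicate would then be intersected with the obstacle description information to yield obstacle avoidance landing points.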
And step S260, acquiring target obstacle avoidance and landing points from the determined obstacle avoidance and landing points.
And S270, controlling the unmanned aerial vehicle to land to a target obstacle avoidance landing point.
Optionally, fig. 6 is a flowchart of a specific method for controlling the unmanned aerial vehicle to land to the target obstacle avoidance and landing point according to the second embodiment of the present invention. As shown in fig. 6, step S270 may include:
and S271, controlling the unmanned aerial vehicle to fly to the overhead position of the target obstacle avoidance and landing point, and controlling the unmanned aerial vehicle to hover at the overhead position for height reduction.
Exemplarily, fig. 7 is a schematic diagram of takeoff and landing of an unmanned aerial vehicle according to the second embodiment of the present invention. As shown in fig. 7, if the unmanned aerial vehicle enters the forced landing mode when taking off to point A, a bow-shaped air route is generated, and the flight control end controls the unmanned aerial vehicle to fly to the starting point of the bow-shaped air route, namely point B. After adjusting the nose direction, the unmanned aerial vehicle enters the bow-shaped air route and flies to the end point of the route, namely point C. Terrain images are acquired during this flight. After the vision CPU processing center determines the target obstacle avoidance landing point from the terrain images, it sends the positioning point and the spiral radius of the target obstacle avoidance landing point to the flight control end. The spiral radius may be between 80 m and 120 m. After adjusting the nose direction at point C, the flight control end controls the unmanned aerial vehicle to fly to the position overhead of the target obstacle avoidance landing point, namely point D, and controls the unmanned aerial vehicle to spiral down at the overhead position according to the spiral radius.
Optionally, when the unmanned aerial vehicle is controlled to spiral down at the overhead position, the method may further include:
detecting position information of obstacles on the spiral path in real time, and, when an obstacle is detected, generating an obstacle avoidance track according to the position information of the obstacle so as to fly around the obstacle.
Illustratively, the flight control end monitors the obstacle position information uploaded by the vision CPU processing center, and if an obstacle is detected on the spiral path, an obstacle avoidance track is planned and generated in advance according to the position information of the obstacle so as to fly around the obstacle.
By detecting obstacles during the spiral descent and generating an obstacle avoidance track, this embodiment improves the safety of the unmanned aerial vehicle during the data processing process.
Step S272, when the unmanned aerial vehicle is detected to descend to a preset height, whether a safe path flying to a target obstacle avoidance landing point is generated or not is detected; if yes, go to step S273, otherwise go to step S274.
The safe path is determined based on the position information of the obstacle between the unmanned aerial vehicle and the target obstacle avoiding and landing point after the target obstacle avoiding and landing point is located. For example, a safe path can be planned jointly according to flight parameters of the unmanned aerial vehicle and position information of the obstacles.
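As a rough illustration of checking a candidate path against obstacle position information, the sketch below tests whether a straight 2-D segment stays clear of circular obstacles; the clearance margin and circular obstacle model are simplifying assumptions, not the disclosure's planner.

```python
import math

def straight_path_is_safe(start, goal, obstacles, clearance=5.0):
    """Return True if the straight segment start->goal keeps at least
    `clearance` metres from every obstacle, given as (x, y, radius)."""
    (x1, y1), (x2, y2) = start, goal
    dx, dy = x2 - x1, y2 - y1
    for ox, oy, r in obstacles:
        # Parameter of the closest point on the segment, clamped to [0, 1].
        t = max(0.0, min(1.0, ((ox - x1) * dx + (oy - y1) * dy)
                              / (dx * dx + dy * dy)))
        dist = math.hypot(ox - (x1 + t * dx), oy - (y1 + t * dy))
        if dist < r + clearance:
            return False
    return True
```

A real planner would also account for the flight parameters mentioned above (turn radius, descent rate) and search among multiple candidate paths.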
Exemplarily, as shown in fig. 7, when it is detected that the unmanned aerial vehicle descends to a preset height, that is, descends to a point E, the flight control end detects whether the vision CPU processing center uploads a safe path that flies to the target obstacle avoidance landing point.
And step S273, controlling the unmanned aerial vehicle to fly to a target obstacle avoidance and landing point according to the safe path.
Specifically, if the safe path is generated, the unmanned aerial vehicle is controlled to fly to the target obstacle avoidance and landing point from the spiral path according to the safe path. For example, to point F in fig. 7.
And step S274, controlling the unmanned aerial vehicle to switch from the fixed wing mode to the rotor wing mode at the current position and land to a target obstacle avoidance landing point.
Specifically, if the safe path is not generated, any path from the current position of the unmanned aerial vehicle to the target obstacle avoidance and landing point can be considered to be safe, and the unmanned aerial vehicle can be controlled to be switched from the fixed wing mode to the rotor mode at the current position and land to the target obstacle avoidance and landing point. For example, to point F in fig. 7.
In this embodiment, after the target obstacle avoidance landing point is determined, the unmanned aerial vehicle is controlled to fly to the position overhead of the target obstacle avoidance landing point and spiral down, which reserves time for the generation of a safe path; meanwhile, the spiral descent proceeds during data processing, shortening the landing time. Upon descending to the preset height, this embodiment detects whether a safe path to the target obstacle avoidance landing point has been generated: if so, the unmanned aerial vehicle lands along the safe path; if not, it switches flight mode and lands directly. Automatic planning of a landing route and autonomous safe forced landing are thus realized, further improving the landing safety of the unmanned aerial vehicle during forced landing.
And step S280, sending alarm forced landing information through cloud communication.
Wherein the forced landing alarm information comprises at least one of the following: the current position information of the unmanned aerial vehicle, geographic information of the landing environment, and information on the shortest route between the position of the controller and the current position of the unmanned aerial vehicle. For example, the current position information of the unmanned aerial vehicle may include GPS coordinates or the like. The cloud communication may be 4G cloud communication or the like.
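A forced-landing alarm payload of this shape could be serialized for cloud transmission as sketched below; all field names are illustrative assumptions, as the disclosure does not specify a message format.

```python
import json

def build_forced_landing_alarm(gps, env_geo, route):
    """Assemble the alarm message: current drone position, landing
    environment geographic information, and the shortest route from the
    controller's position to the drone."""
    msg = {
        "type": "forced_landing_alarm",
        "drone_position": {"lat": gps[0], "lon": gps[1]},
        "landing_environment": env_geo,
        "shortest_route_to_drone": route,
    }
    return json.dumps(msg)
```

The resulting JSON string would then be sent over the 4G cloud link to the user's terminal.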
According to the embodiment of the invention, when forced landing of the unmanned aerial vehicle is determined, it is judged whether obstacle avoidance landing points meeting the return flight conditions are stored. If so, the target obstacle avoidance landing point is obtained from the obstacle avoidance landing points meeting the return flight conditions, which increases the determination rate of the target obstacle avoidance landing point and accelerates forced landing. Otherwise, terrain images are acquired in real time, and the target obstacle avoidance landing point is determined based on the terrain images before landing. The embodiment identifies and analyzes obstacles affecting flight safety and sites suitable for safe landing according to current environmental conditions, so that the target obstacle avoidance landing point is intelligently selected in real time, improving the landing safety of the unmanned aerial vehicle during forced landing. In addition, the alarm forced landing information is sent through cloud communication, so that a user can quickly find the unmanned aerial vehicle.
EXAMPLE III
Fig. 8 is a flowchart of a landing method of another unmanned aerial vehicle according to a third embodiment of the present invention, where the third embodiment is optimized based on the foregoing embodiment, and as shown in fig. 8, the method includes:
and S3010, detecting the operation mode of the unmanned aerial vehicle in real time in the flight process of the unmanned aerial vehicle.
And S3020, controlling the unmanned aerial vehicle to acquire the terrain image in the normal flight process in real time in the normal flight process of the unmanned aerial vehicle.
Step S3030, determining at least one obstacle avoidance and landing point according to the terrain image, and storing each determined obstacle avoidance and landing point.
The type of obstacle avoidance landing point is determined based on preset landing site conditions. The landing site conditions comprise an optimal landing condition and a sub-optimal landing condition, and the types of obstacle avoidance landing points comprise the optimal obstacle avoidance landing point and the sub-optimal obstacle avoidance landing point. The landing site conditions may be determined based on the area, flatness, and/or site attributes of the site. The optimal landing condition is used for determining optimal obstacle avoidance landing points, and the sub-optimal landing condition is used for determining sub-optimal obstacle avoidance landing points. For example, an optimal obstacle avoidance landing point may be one whose flatness satisfies the optimal flatness condition. A sub-optimal obstacle avoidance landing point avoids dangerous areas such as crowds, high-voltage wires, and highways, and may be a roof, shrubland, or a field.
Step S3040, when the fact that the unmanned aerial vehicle enters the forced landing mode is detected, whether obstacle avoidance landing points meeting the return flight conditions exist or not is judged according to the stored types of the obstacle avoidance landing points, if yes, step S3050 is executed, and if not, step S3060 is executed.
The return flight condition can be determined based on the type of the obstacle avoidance landing point, the current cruising electric quantity of the unmanned aerial vehicle and/or the distance value between the obstacle avoidance landing point and the current positioning point of the unmanned aerial vehicle.
Illustratively, whether the obstacle avoidance and landing points meeting the return flight condition exist in the stored obstacle avoidance and landing points is judged according to the type of the stored obstacle avoidance and landing points, the current cruising electric quantity of the unmanned aerial vehicle and the distance value between the obstacle avoidance and landing points and the current positioning point of the unmanned aerial vehicle.
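The return flight condition above combines point type, remaining battery, and distance. A minimal energy-feasibility sketch follows; the energy model and every constant (consumption per metre, reserve, capacity) are illustrative assumptions, not from the disclosure.

```python
import math

def meets_return_condition(point, drone_pos, battery_pct,
                           wh_per_m=0.01, reserve_pct=15.0,
                           capacity_wh=100.0):
    """A stored obstacle avoidance landing point satisfies the return
    condition if the remaining battery covers the distance back to it
    while keeping a safety reserve."""
    dist = math.hypot(point[0] - drone_pos[0], point[1] - drone_pos[1])
    needed_pct = dist * wh_per_m / capacity_wh * 100.0
    return battery_pct - needed_pct >= reserve_pct
```

In the full scheme this check would run per stored point, with optimal points considered before sub-optimal ones.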
In the embodiment, the obstacle avoidance and landing points meeting the return flight conditions are determined from the stored at least one obstacle avoidance and landing point, and then the target obstacle avoidance and landing points are obtained from the obstacle avoidance and landing points meeting the return flight conditions, so that the selection rate of the target obstacle avoidance and landing points is improved.
And S3050, acquiring a target obstacle avoidance and landing point from the obstacle avoidance and landing points meeting the return flight conditions, and continuously executing the step S3130.
And S3060, controlling the unmanned aerial vehicle to collect terrain images in real time in the forced landing and flying process of the unmanned aerial vehicle.
Specifically, when the stored obstacle avoidance and landing points do not have obstacle avoidance and landing points meeting the return flight conditions, the unmanned aerial vehicle is controlled to collect terrain images in real time in the forced landing flight process so as to determine the target obstacle avoidance and landing points again.
And S3070, determining at least one obstacle avoidance and landing point according to the terrain image.
The determined at least one obstacle avoidance and landing point may include an optimal obstacle avoidance and landing point and a sub-optimal obstacle avoidance and landing point, or may also include only an optimal obstacle avoidance and landing point or a sub-optimal obstacle avoidance and landing point.
And S3080, judging whether the determined obstacle avoidance and landing points comprise at least one optimal obstacle avoidance and landing point, if so, executing the step S3090, otherwise, executing the step S3100.
Specifically, it is determined whether each determined obstacle avoidance and landing point includes at least one optimal obstacle avoidance and landing point, for example, whether only the optimal obstacle avoidance and landing point is included, or whether the optimal obstacle avoidance and landing point and the sub-optimal obstacle avoidance and landing point are included.
And S3090, determining a target obstacle avoidance and landing point in the optimal obstacle avoidance and landing point, and continuing to execute the step S3130.
Specifically, if each determined obstacle avoidance and drop point comprises at least one optimal obstacle avoidance and drop point, determining a target obstacle avoidance and drop point in the optimal obstacle avoidance and drop points. For example, the target optimal obstacle avoidance and landing point which is closest to the unmanned aerial vehicle, has the highest flatness, or has the largest occupied area is selected as the target obstacle avoidance and landing point from the optimal obstacle avoidance and landing points.
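The selection among candidate points can be sketched as a lexicographic preference; the ordering of criteria (nearest first, then flattest, then largest area) is one reasonable reading of the example above, assumed here for illustration.

```python
import math

def pick_target_point(candidates, drone_pos):
    """Pick the target obstacle avoidance landing point: prefer the
    nearest candidate, break ties by higher flatness, then larger area."""
    def key(p):
        d = math.hypot(p["x"] - drone_pos[0], p["y"] - drone_pos[1])
        return (d, -p["flatness"], -p["area"])
    return min(candidates, key=key)
```

The same selection applies later to sub-optimal candidates when no optimal obstacle avoidance landing point is available.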
Step S3100, determining whether the current cruising power of the unmanned aerial vehicle is less than or equal to a preset power threshold, if so, executing step S3110, otherwise, executing step S3120.
Specifically, if only the sub-optimal obstacle avoidance and landing point is included in each determined obstacle avoidance and landing point, whether the current cruising electric quantity of the unmanned aerial vehicle is smaller than or equal to a preset electric quantity threshold value is judged. Wherein, the electric quantity threshold value can be preset according to the computing power of unmanned aerial vehicle in advance, for example, the electric quantity threshold value can be 35%.
And S3110, determining a target obstacle avoidance and drop point from the sub-optimal obstacle avoidance and drop points, and continuing to execute S3130.
For example, the target sub-optimal obstacle avoidance and landing point which is closest to the unmanned aerial vehicle, has the highest flatness, or has the largest occupied area is selected as the target obstacle avoidance and landing point from the sub-optimal obstacle avoidance and landing points.
And S3120, controlling the unmanned aerial vehicle to fly forward for a preset distance, controlling the unmanned aerial vehicle to acquire the terrain image in real time again, and returning to execute the step S3070.
Specifically, if each determined obstacle avoidance landing point includes only sub-optimal obstacle avoidance landing points, but the current cruising electric quantity of the unmanned aerial vehicle is greater than the preset electric quantity threshold, the unmanned aerial vehicle has not found a suitable target obstacle avoidance landing point, yet its current cruising electric quantity is sufficient to search for obstacle avoidance landing points again. The unmanned aerial vehicle can therefore be controlled to fly forward a preset distance and to acquire terrain images in real time again so as to search for obstacle avoidance landing points anew. The preset distance may be preset, for example, 500 m.
Illustratively, as shown in fig. 4, when the unmanned aerial vehicle has flown to a given point but still no suitable target obstacle avoidance landing point has been determined, it may fly forward 500 m from that point along the direction of the preset landing point identification route, regenerate the preset landing point identification route upon reaching the target position 500 m ahead, and acquire terrain images along the regenerated preset landing point identification route to re-determine obstacle avoidance landing points.
And S3130, controlling the unmanned aerial vehicle to land to a target obstacle avoidance landing point.
According to the embodiment of the invention, during normal flight of the unmanned aerial vehicle, the unmanned aerial vehicle is controlled to determine and store obstacle avoidance landing points according to terrain images acquired in real time. When forced landing is determined, whether obstacle avoidance landing points meeting the return flight conditions exist is judged according to the types of the stored obstacle avoidance landing points; if so, the target obstacle avoidance landing point is obtained from the obstacle avoidance landing points meeting the return flight conditions. If not, the unmanned aerial vehicle is controlled to determine obstacle avoidance landing points according to newly acquired terrain images, and then, according to the types of the obstacle avoidance landing points and the current cruising electric quantity of the unmanned aerial vehicle, it is decided whether to obtain the target obstacle avoidance landing point from the determined obstacle avoidance landing points or to control the unmanned aerial vehicle to fly forward a preset distance so as to determine obstacle avoidance landing points again. After the target obstacle avoidance landing point is determined, the unmanned aerial vehicle is controlled to land at the target obstacle avoidance landing point. The embodiment of the invention preferentially determines the target obstacle avoidance landing point from the stored obstacle avoidance landing points, improving the positioning rate of the target obstacle avoidance landing point.
According to the embodiment of the invention, the obstacle avoidance and landing points are divided into the optimal obstacle avoidance and landing point and the sub-optimal obstacle avoidance and landing point, the target landing point is positioned according to the cruising ability of the unmanned aerial vehicle and the type of the obstacle avoidance and landing point, the selection accuracy of the target obstacle avoidance and landing point is improved, and the forced landing success rate is improved.
Further, fig. 9 is a flowchart of a specific method for determining whether there is an obstacle-avoiding and landing point meeting the return flight condition according to the stored types of the obstacle-avoiding and landing points, and acquiring a target obstacle-avoiding and landing point from among the obstacle-avoiding and landing points meeting the return flight condition, according to the third embodiment of the present invention. Specifically, step S3040 to step S3060 are optimized. That is, steps S3040 to S3060 may be replaced with steps S3041 to S3046, steps S3051 to S3052, and step S3060 shown in fig. 9. As shown in fig. 9, steps S3040 to S3060 are optimized as:
step S3041, when it is detected that the unmanned aerial vehicle enters the forced landing mode, determining whether each of the stored obstacle avoidance and landing points includes at least one optimal obstacle avoidance and landing point, if so, executing step S3042, otherwise, executing step S3045.
Each stored obstacle avoidance and landing point may include an optimal obstacle avoidance and landing point and a sub-optimal obstacle avoidance and landing point.
Step S3042, calculating a distance value between each optimal obstacle avoidance landing point and the current positioning point of the unmanned aerial vehicle.
Specifically, if the stored obstacle avoidance landing points include at least one optimal obstacle avoidance landing point, that is, only optimal obstacle avoidance landing points, or both optimal and sub-optimal obstacle avoidance landing points, a distance value between each optimal obstacle avoidance landing point and the current positioning point of the unmanned aerial vehicle is calculated.
Step S3043, determining whether there is at least one target optimal obstacle avoidance and landing point whose distance value is less than or equal to a preset threshold, if so, executing step S3044, otherwise, executing step S3045.
And S3044, determining the optimal obstacle avoidance and landing point of each target as an obstacle avoidance and landing point meeting the return flight condition, and continuing to execute the step S3051.
Specifically, if it is determined that there exists at least one target optimal obstacle avoidance landing point whose distance value is less than or equal to the preset threshold, indicating that the distance between the target optimal obstacle avoidance landing point and the unmanned aerial vehicle is appropriate, each target optimal obstacle avoidance landing point is determined as an obstacle avoidance landing point meeting the return flight condition.
Step S3045, it is determined whether the current cruising power of the unmanned aerial vehicle is less than or equal to a preset power threshold, if so, step S3046 is executed, otherwise, step S3060 is executed.
Specifically, if at least one optimal obstacle avoidance and landing point is not included in the stored obstacle avoidance and landing points, that is, only at least one sub-optimal obstacle avoidance and landing point is included, or if it is determined that there is no target optimal obstacle avoidance and landing point with a distance value smaller than or equal to a preset threshold, it is determined whether the current cruising electric quantity of the unmanned aerial vehicle is smaller than or equal to a preset electric quantity threshold. For example, whether the current endurance capacity of the unmanned aerial vehicle is less than or equal to 35% is determined.
And S3046, determining each sub-optimal obstacle avoidance landing point as an obstacle avoidance landing point meeting the return flight condition, and continuing to execute step S3052.
Specifically, if the current cruising electric quantity of the unmanned aerial vehicle is less than or equal to the preset electric quantity threshold, indicating that the current cruising electric quantity is not enough to support repositioning obstacle avoidance landing points, each sub-optimal obstacle avoidance landing point can be directly determined as an obstacle avoidance landing point meeting the return flight condition, so as to obtain the target obstacle avoidance landing point from the sub-optimal obstacle avoidance landing points.
And S3051, determining a target obstacle avoidance and landing point from the target optimal obstacle avoidance and landing points, and continuing to execute the step S3130.
Illustratively, the target optimal obstacle avoidance and landing point which is closest to the unmanned aerial vehicle and has the highest flatness or the largest occupied area is selected from the target optimal obstacle avoidance and landing points as the target obstacle avoidance and landing points.
And S3052, determining a target obstacle avoidance and landing point from the sub-optimal obstacle avoidance and landing points, and continuing to execute the step S3130.
Illustratively, the target sub-optimal obstacle avoidance and landing point which is closest to the unmanned aerial vehicle and has the highest flatness or the largest occupied area is selected from the target sub-optimal obstacle avoidance and landing points as the target obstacle avoidance and landing points.
And S3060, controlling the unmanned aerial vehicle to collect terrain images in real time in the forced landing and flying process of the unmanned aerial vehicle.
According to the embodiment of the invention, when the unmanned aerial vehicle is detected to enter the forced landing mode, if the obstacle avoidance and landing points comprise at least one optimal obstacle avoidance and landing point, the determination mode of the target obstacle avoidance and landing points is determined based on two factors of distance and cruising electric quantity. And if the obstacle avoidance and landing points only comprise at least one sub-optimal obstacle avoidance and landing point, determining a determination mode of the target obstacle avoidance and landing point directly based on one factor of the cruising electric quantity. According to the embodiment of the invention, the target obstacle avoidance and landing point is comprehensively determined according to different types of the obstacle avoidance and landing points and the current actual situation of the unmanned aerial vehicle, so that the selection accuracy and speed of the target obstacle avoidance and landing point are improved, and the success rate and speed of forced landing are further improved.
EXAMPLE IV
Fig. 10 is a flowchart of a landing method of another unmanned aerial vehicle according to a fourth embodiment of the present invention, where the present embodiment is optimized based on the foregoing embodiment, and as shown in fig. 10, the method includes:
and S410, detecting the abnormal condition of the unmanned aerial vehicle in real time.
And step S420, judging the emergency degree of the current forced landing according to the abnormal condition of the unmanned aerial vehicle.
The emergency degree of forced landing can be preset based on the reason of abnormal situations of the unmanned aerial vehicle and/or the air route task execution situation of the unmanned aerial vehicle.
And S430, determining a target obstacle avoidance and landing point in a passing area in the normal or forced landing flight process of the unmanned aerial vehicle according to the emergency degree.
For example, when the unmanned aerial vehicle is in a hardware abnormal state (for example, battery abnormality, radar abnormality, motor abnormality, or the like) or in a severe environment flight state, in order to ensure safety of the unmanned aerial vehicle, the unmanned aerial vehicle needs to be forced to land as soon as possible, and at this time, a target obstacle avoidance landing point can be preferentially determined in obstacle avoidance landing points determined in a normal flight process.
Exemplarily, when the unmanned aerial vehicle is in an abnormal communication state (for example, the unmanned aerial vehicle is out of link), under the condition of full autonomous flight operation of the unmanned aerial vehicle, the requirement on timeliness of forced landing of the unmanned aerial vehicle is low, at this time, the obstacle avoidance landing point meeting the return flight condition can be determined in the normal flight process of the unmanned aerial vehicle, and/or the target obstacle avoidance landing point is determined in the obstacle avoidance landing point determined according to the terrain image acquired in real time in the forced landing flight process of the unmanned aerial vehicle.
Further, when the unmanned aerial vehicle is in an abnormal communication state, its current route task execution status can be checked. If the task completion degree exceeds a preset completion threshold, the unmanned aerial vehicle can be controlled, based on a task-priority principle, to continue flying and complete the remaining route tasks despite the communication abnormality; after the entire route task is completed, the optimal obstacle avoidance landing point is selected as the target obstacle avoidance landing point. The preset completion threshold may be set to, for example, 85%.
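The mapping above (hardware fault or severe environment: land immediately; communication loss only: low urgency, optionally finishing a nearly complete route first) can be sketched as follows. The fault names, the `Urgency` enum, and the function signature are illustrative assumptions, not part of the patent:

```python
from enum import Enum

class Urgency(Enum):
    HIGH = "land_immediately"   # hardware fault or severe-environment flight
    LOW = "return_or_search"    # e.g. communication loss only

# Hypothetical fault labels standing in for the patent's examples.
HARDWARE_FAULTS = {"battery_abnormal", "radar_abnormal", "motor_abnormal"}

def classify_urgency(faults, task_completion, completion_threshold=0.85):
    """Map detected abnormal conditions to a forced-landing urgency level.

    Returns (urgency, finish_route_first): with only a communication fault
    and a nearly finished route, the drone may complete the remaining
    waypoints before selecting the optimal stored landing point.
    """
    if faults & HARDWARE_FAULTS or "severe_environment" in faults:
        return Urgency.HIGH, False          # forced landing as soon as possible
    if "communication_abnormal" in faults:
        finish_route_first = task_completion >= completion_threshold
        return Urgency.LOW, finish_route_first
    return Urgency.LOW, False
```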
S440, controlling the unmanned aerial vehicle to land at the target obstacle avoidance landing point.
In this embodiment of the invention, the abnormal condition of the unmanned aerial vehicle is detected in real time, the emergency degree of the current forced landing is judged from that condition, a target obstacle avoidance landing point is determined according to the emergency degree in a passing area during normal or forced landing flight, and the unmanned aerial vehicle is controlled to land at that point. By assessing the emergency degree of the current forced landing and selecting the landing point accordingly, the embodiment improves the adaptability of forced landing and makes the selection of the target obstacle avoidance landing point intelligent.
EXAMPLE five
Fig. 11 is a schematic structural diagram of a landing device of an unmanned aerial vehicle according to a fifth embodiment of the present invention. The device may be implemented in software and/or hardware and is generally integrated into the unmanned aerial vehicle. By executing the landing method of the unmanned aerial vehicle, the device enables autonomous obstacle avoidance landing and improves landing safety during a forced landing. As shown in Fig. 11, the apparatus includes:
a landing point determining module 510, configured to determine a target obstacle avoidance landing point in a passing area of the unmanned aerial vehicle in a normal or forced landing flight process when it is detected that the unmanned aerial vehicle enters a forced landing mode;
a landing control module 520, configured to control the unmanned aerial vehicle to land to the target obstacle avoidance landing point;
the target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle.
The embodiment of the invention provides a landing device for an unmanned aerial vehicle that, when detecting that the unmanned aerial vehicle enters a forced landing mode, determines a target obstacle avoidance landing point during normal or forced landing flight and controls the unmanned aerial vehicle to land at that point. Because the emergency forced landing proceeds to a pre-determined target obstacle avoidance landing point, the unmanned aerial vehicle can autonomously avoid obstacles while landing, and landing safety during a forced landing is improved.
Optionally, the landing point determining module 510 is specifically configured to:
judging whether an obstacle avoidance and landing point meeting a return flight condition is stored at present, wherein the obstacle avoidance and landing point is determined and stored according to a terrain image acquired in real time in the normal flight process of the unmanned aerial vehicle;
if so, acquiring a target obstacle avoidance and landing point from the obstacle avoidance and landing points meeting the return flight conditions;
otherwise, in the forced landing flight process of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to collect terrain images in real time, determining at least one obstacle avoidance and landing point according to the terrain images, and acquiring a target obstacle avoidance and landing point in each determined obstacle avoidance and landing point.
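The decision flow in this module, reuse a stored point that meets the return flight condition and otherwise scan terrain in real time during the forced landing, might be sketched like this; all callables are hypothetical hooks standing in for the drone's perception and selection logic:

```python
def determine_target_landing_point(stored_points, meets_return_condition,
                                   acquire_realtime_points, pick_best):
    """Sketch of the landing-point decision flow described above.

    stored_points: points saved during normal flight.
    meets_return_condition: predicate deciding whether a stored point
        qualifies for a return flight.
    acquire_realtime_points: callable that scans terrain during the forced
        landing and returns candidate points (hypothetical hook).
    pick_best: callable selecting one target from the candidates.
    """
    candidates = [p for p in stored_points if meets_return_condition(p)]
    if candidates:
        return pick_best(candidates)             # reuse points from normal flight
    return pick_best(acquire_realtime_points())  # fall back to a real-time scan
```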
Optionally, the landing point determining module 510 is further specifically configured to:
controlling the unmanned aerial vehicle to acquire a terrain image in the normal flight process in real time in the normal flight process of the unmanned aerial vehicle;
and determining at least one obstacle avoidance and landing point according to the terrain image, and storing each determined obstacle avoidance and landing point.
Optionally, the landing point determining module 510 is specifically configured to:
generating a preset landing point identification route according to the current positioning point of the unmanned aerial vehicle and the route type;
and controlling the unmanned aerial vehicle to fly along the preset landing point identification air line, and acquiring a terrain image in real time in the flying process.
Optionally, the type of the obstacle avoidance and landing point is determined based on a preset landing site condition, the landing site condition includes an optimal landing condition and a sub-optimal landing condition, and the type of the obstacle avoidance and landing point includes an optimal obstacle avoidance and landing point and a sub-optimal obstacle avoidance and landing point.
Optionally, the landing point determining module 510 is specifically configured to:
if each determined obstacle avoidance and drop point comprises at least one optimal obstacle avoidance and drop point, determining a target obstacle avoidance and drop point in the optimal obstacle avoidance and drop points;
if the determined obstacle avoidance and landing points only comprise the sub-optimal obstacle avoidance and landing point, judging whether the current cruising electric quantity of the unmanned aerial vehicle is smaller than or equal to a preset electric quantity threshold value;
if so, determining a target obstacle avoidance and landing point in the sub-optimal obstacle avoidance and landing points;
otherwise, controlling the unmanned aerial vehicle to fly forward for a preset distance, controlling the unmanned aerial vehicle to acquire the terrain image in real time again, and returning to execute the step of determining at least one obstacle avoidance and landing point according to the terrain image.
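The selection among freshly detected points (prefer an optimal point; accept a sub-optimal one only when the remaining battery is at or below the threshold; otherwise fly forward a preset distance and rescan) could look roughly like this simplified sketch, which ignores battery drain during rescans and caps the number of forward hops:

```python
def select_among_detected(points, battery, battery_threshold,
                          fly_forward_and_rescan, max_hops=5):
    """Select a target among detected points, per the steps above.

    points: list of (quality, point) pairs, quality being "optimal" or
    "sub_optimal". fly_forward_and_rescan is a hypothetical callable that
    moves the drone forward a preset distance and returns new candidates.
    """
    for _ in range(max_hops):
        optimal = [p for q, p in points if q == "optimal"]
        if optimal:
            return optimal[0]               # an optimal point is always preferred
        sub = [p for q, p in points if q == "sub_optimal"]
        if battery <= battery_threshold:
            return sub[0] if sub else None  # low battery: take a sub-optimal point
        points = fly_forward_and_rescan()   # keep searching while battery allows
    sub = [p for q, p in points if q == "sub_optimal"]
    return sub[0] if sub else None          # search budget exhausted
```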
Optionally, the landing point determining module 510 is specifically configured to:
and judging whether the obstacle avoidance and landing points meeting the return flight conditions exist or not according to the stored types of the obstacle avoidance and landing points.
Optionally, the landing point determining module 510 is specifically configured to:
if each stored obstacle avoidance landing point comprises at least one optimal obstacle avoidance landing point, calculating a distance value between each optimal obstacle avoidance landing point and the current positioning point of the unmanned aerial vehicle;
if at least one target optimal obstacle avoidance and landing point with the distance value smaller than or equal to a preset threshold is determined to exist, determining each target optimal obstacle avoidance and landing point as an obstacle avoidance and landing point meeting a return flight condition;
and determining a target obstacle avoidance and drop point in the target optimal obstacle avoidance and drop point.
Optionally, the landing point determining module 510 is further specifically configured to:
after calculating the distance value between each optimal obstacle avoidance landing point and the current positioning point of the unmanned aerial vehicle, if it is determined that no target optimal obstacle avoidance landing point with the distance value smaller than or equal to the preset threshold exists, judging whether the current cruising electric quantity of the unmanned aerial vehicle is smaller than or equal to the preset electric quantity threshold;
if so, determining each sub-optimal obstacle avoidance and landing point as an obstacle avoidance and landing point meeting the return flight condition;
and determining a target obstacle avoidance and landing point in the sub-optimal obstacle avoidance and landing points.
Optionally, the landing point determining module 510 is specifically configured to:
if the stored obstacle avoidance and landing points only comprise at least one sub-optimal obstacle avoidance and landing point, judging whether the current cruising electric quantity of the unmanned aerial vehicle is smaller than or equal to a preset electric quantity threshold value;
if so, determining each sub-optimal obstacle avoidance and landing point as an obstacle avoidance and landing point meeting the return flight condition;
and determining a target obstacle avoidance and landing point in the sub-optimal obstacle avoidance and landing points.
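The return flight condition described in the last few blocks, where stored optimal points qualify when their distance to the current positioning point is within a preset threshold and sub-optimal points qualify only under a low battery, can be condensed into one function. Representing points as planar coordinates is a simplifying assumption; the patent only speaks of distance values:

```python
import math

def points_meeting_return_condition(optimal_points, sub_optimal_points,
                                    current_fix, distance_threshold,
                                    battery, battery_threshold):
    """Return the stored points that satisfy the return flight condition.

    Points and current_fix are (x, y) tuples in a local metric frame.
    """
    near = [p for p in optimal_points
            if math.dist(p, current_fix) <= distance_threshold]
    if near:
        return near                      # reachable optimal points qualify
    if battery <= battery_threshold:
        return list(sub_optimal_points)  # low battery: accept sub-optimal points
    return []                            # no qualifying point: keep flying/rescan
```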
Optionally, the landing control module 520 is specifically configured to:
controlling the unmanned aerial vehicle to fly to the overhead position of the target obstacle avoidance and landing point, and controlling the unmanned aerial vehicle to hover and descend at the overhead position;
when the unmanned aerial vehicle is detected to have descended to a preset height, detecting whether a safe path to the target obstacle avoidance landing point has been generated; wherein the safe path is determined, after the target obstacle avoidance landing point is located, based on position information of obstacles between the unmanned aerial vehicle and the target obstacle avoidance landing point;
if the safe path is generated, controlling the unmanned aerial vehicle to fly to the target obstacle avoidance and landing point according to the safe path;
and if the safe path is not generated, controlling the unmanned aerial vehicle to be switched from a fixed wing mode to a rotor wing mode at the current position and to land to the target obstacle avoidance landing point.
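The landing control sequence (fly overhead, hover and descend to a preset height, then either follow a generated safe path or switch from fixed-wing to rotor mode for a vertical descent) might be orchestrated as below. The `drone` interface and all of its method names are assumptions for illustration, not a real autopilot API:

```python
def land_at_target(drone, target, check_height):
    """Sketch of the landing control sequence described above.

    Returns which branch was taken: "safe_path" or "rotor_descent".
    """
    drone.fly_to_overhead(target)
    while drone.altitude() > check_height:
        drone.hover_descend_step()       # spiral down over the landing point
    path = drone.plan_safe_path(target)  # based on detected obstacle positions
    if path is not None:
        drone.follow_path(path)          # an obstacle-free approach exists
        return "safe_path"
    drone.switch_to_rotor_mode()         # fixed-wing -> rotor fallback
    drone.vertical_land(target)          # descend vertically at current position
    return "rotor_descent"
```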
Optionally, the landing control module 520 is further specifically configured to:
when the unmanned aerial vehicle is controlled to hover and descend at the overhead position, position information of obstacles on the hovering path is detected in real time, and when an obstacle is detected, an obstacle avoidance track is generated according to the position information of the obstacle so as to fly around it.
Optionally, the apparatus further comprises:
the information warning module is used for sending forced landing alarm information through cloud communication after the unmanned aerial vehicle is controlled to land at the target obstacle avoidance landing point; wherein the forced landing alarm information comprises at least one of the following: the current position information of the unmanned aerial vehicle, geographic information of the landing environment, and information on the shortest route between the controller's position and the drone's current position.
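A minimal sketch of the alarm payload assembled by the information warning module; the dictionary keys are illustrative assumptions, since the patent only names the three categories of information:

```python
def build_forced_landing_alert(drone_position, landing_geo_info, shortest_route):
    """Assemble the forced-landing alarm payload sent via cloud communication."""
    return {
        "drone_position": drone_position,         # current position of the drone
        "landing_environment": landing_geo_info,  # geography of the landing site
        "route_to_drone": shortest_route,         # shortest route from controller
    }
```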
The landing device of the unmanned aerial vehicle provided by the embodiment of the invention can execute the landing method of the unmanned aerial vehicle provided by any embodiment of the invention, and has the functional modules corresponding to the executed method along with its beneficial effects.
EXAMPLE six
Fig. 12 is a schematic structural diagram of a drone according to a sixth embodiment of the present invention, as shown in fig. 12, the drone includes a processor 600, a memory 610, an input device 620, and an output device 630; the number of processors 600 in the drone may be one or more, and one processor 600 is taken as an example in fig. 12; the processor 600, memory 610, input device 620, and output device 630 in the drone may be connected by a bus or other means, as exemplified by the bus connection in fig. 12.
The memory 610 may be used as a computer-readable storage medium for storing software programs, computer-executable programs, and modules, such as program instructions and/or modules corresponding to the landing method of the drone (e.g., the landing point determination module 510 and the landing control module 520 in the landing device of the drone) in the embodiments of the present invention. The processor 600 executes various functional applications and data processing of the drone by running software programs, instructions and modules stored in the memory 610, that is, the landing method of the drone is realized. The method comprises the following steps:
when the fact that the unmanned aerial vehicle enters the forced landing mode is detected, a target obstacle avoidance landing point is determined in a passing area in the normal or forced landing flying process of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to land to the target obstacle avoidance landing point;
the target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle.
The memory 610 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 610 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 610 may further include memory remotely located from the processor 600, which may be connected to the drone over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 620 may be used to receive entered numerical or character information and generate key signal inputs related to user settings and function control of the drone. The output device 630 may include a display device such as a display screen.
EXAMPLE seven
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a method for landing a drone, the method including:
when the fact that the unmanned aerial vehicle enters the forced landing mode is detected, a target obstacle avoidance landing point is determined in a passing area in the normal or forced landing flying process of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to land to the target obstacle avoidance landing point;
the target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle.
Of course, the storage medium provided in the embodiment of the present invention includes computer-executable instructions, and the computer-executable instructions are not limited to the method operations described above, and may also perform related operations in the landing method of the drone provided in any embodiment of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the embodiment of the landing device of the unmanned aerial vehicle, each unit and each module included in the landing device are only divided according to functional logic, but are not limited to the above division, as long as the corresponding function can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (16)
1. A landing method of an unmanned aerial vehicle, comprising:
when the fact that the unmanned aerial vehicle enters the forced landing mode is detected, a target obstacle avoidance landing point is determined in a passing area in the normal or forced landing flying process of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to land to the target obstacle avoidance landing point;
the target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle.
2. The method of claim 1, wherein determining a target obstacle avoidance landing point in a transit area during normal or forced landing flight of the drone comprises:
judging whether an obstacle avoidance and landing point meeting a return flight condition is stored at present, wherein the obstacle avoidance and landing point is determined and stored according to a terrain image acquired in real time in the normal flight process of the unmanned aerial vehicle;
if so, acquiring a target obstacle avoidance and landing point from the obstacle avoidance and landing points meeting the return flight conditions;
otherwise, in the forced landing flight process of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to collect terrain images in real time, determining at least one obstacle avoidance and landing point according to the terrain images, and acquiring a target obstacle avoidance and landing point in each determined obstacle avoidance and landing point.
3. The method of claim 2, further comprising:
controlling the unmanned aerial vehicle to acquire a terrain image in the normal flight process in real time in the normal flight process of the unmanned aerial vehicle;
and determining at least one obstacle avoidance and landing point according to the terrain image, and storing each determined obstacle avoidance and landing point.
4. The method of claim 2, wherein the controlling the drone to acquire the terrain image in real-time comprises:
generating a preset landing point identification route according to the current positioning point of the unmanned aerial vehicle and the route type;
and controlling the unmanned aerial vehicle to fly along the preset landing point identification air line, and acquiring a terrain image in real time in the flying process.
5. The method according to claim 2 or 3, wherein the type of the obstacle avoidance and landing point is determined based on preset landing site conditions, the landing site conditions include an optimal landing condition and a sub-optimal landing condition, and the type of the obstacle avoidance and landing point includes an optimal obstacle avoidance and landing point and a sub-optimal obstacle avoidance and landing point.
6. The method according to claim 5, wherein the obtaining a target obstacle avoidance and landing point from the determined obstacle avoidance and landing points comprises:
if each determined obstacle avoidance and drop point comprises at least one optimal obstacle avoidance and drop point, determining a target obstacle avoidance and drop point in the optimal obstacle avoidance and drop points;
if the determined obstacle avoidance and landing points only comprise the sub-optimal obstacle avoidance and landing point, judging whether the current cruising electric quantity of the unmanned aerial vehicle is smaller than or equal to a preset electric quantity threshold value;
if so, determining a target obstacle avoidance and landing point in the sub-optimal obstacle avoidance and landing points;
otherwise, controlling the unmanned aerial vehicle to fly forward for a preset distance, controlling the unmanned aerial vehicle to acquire the terrain image in real time again, and returning to execute the step of determining at least one obstacle avoidance and landing point according to the terrain image.
7. The method as claimed in claim 5, wherein the determining whether there is currently an obstacle avoidance and landing point satisfying a return flight condition stored includes:
and judging whether the obstacle avoidance and landing points meeting the return flight conditions exist or not according to the stored types of the obstacle avoidance and landing points.
8. The method according to claim 7, wherein the determining whether there is an obstacle avoidance and landing point satisfying a return flight condition according to the stored type of each obstacle avoidance and landing point comprises:
if each stored obstacle avoidance landing point comprises at least one optimal obstacle avoidance landing point, calculating a distance value between each optimal obstacle avoidance landing point and the current positioning point of the unmanned aerial vehicle;
if at least one target optimal obstacle avoidance and landing point with the distance value smaller than or equal to a preset threshold is determined to exist, determining each target optimal obstacle avoidance and landing point as an obstacle avoidance and landing point meeting a return flight condition;
the acquiring target obstacle avoidance and landing points from the obstacle avoidance and landing points meeting the return flight conditions comprises the following steps:
and determining a target obstacle avoidance and drop point in the target optimal obstacle avoidance and drop point.
9. The method of claim 8, wherein after calculating the distance value between each optimal obstacle-avoidance landing point and the current location point of the drone, further comprising:
if the target optimal obstacle avoidance and landing point with the distance value smaller than or equal to the preset threshold value does not exist, judging whether the current endurance electric quantity of the unmanned aerial vehicle is smaller than or equal to a preset electric quantity threshold value;
if so, determining each sub-optimal obstacle avoidance and landing point as an obstacle avoidance and landing point meeting the return flight condition;
the acquiring target obstacle avoidance and landing points from the obstacle avoidance and landing points meeting the return flight conditions comprises the following steps:
and determining a target obstacle avoidance and landing point in the sub-optimal obstacle avoidance and landing points.
10. The method according to claim 7, wherein the determining whether there is an obstacle avoidance and landing point satisfying a return flight condition according to the stored type of each obstacle avoidance and landing point comprises:
if the stored obstacle avoidance and landing points only comprise at least one sub-optimal obstacle avoidance and landing point, judging whether the current cruising electric quantity of the unmanned aerial vehicle is smaller than or equal to a preset electric quantity threshold value;
if so, determining each sub-optimal obstacle avoidance and landing point as an obstacle avoidance and landing point meeting the return flight condition;
the acquiring target obstacle avoidance and landing points from the obstacle avoidance and landing points meeting the return flight conditions comprises the following steps:
and determining a target obstacle avoidance and landing point in the sub-optimal obstacle avoidance and landing points.
11. The method of claim 1, wherein the controlling the drone to land to the target obstacle avoidance landing point comprises:
controlling the unmanned aerial vehicle to fly to the overhead position of the target obstacle avoidance and landing point, and controlling the unmanned aerial vehicle to hover and descend at the overhead position;
when the unmanned aerial vehicle is detected to descend to a preset height, whether a safe path flying to the target obstacle avoidance and landing point is generated or not is detected; wherein the safe path is determined based on position information of an obstacle between the unmanned aerial vehicle and the target obstacle avoidance and landing point after the target obstacle avoidance and landing point is located;
if the safe path is generated, controlling the unmanned aerial vehicle to fly to the target obstacle avoidance and landing point according to the safe path;
and if the safe path is not generated, controlling the unmanned aerial vehicle to be switched from a fixed wing mode to a rotor wing mode at the current position and to land to the target obstacle avoidance landing point.
12. The method of claim 11, further comprising, while controlling the drone to hover and descend at the overhead position:
the method comprises the steps of detecting position information of an obstacle on a spiral path in real time, and generating an obstacle avoidance track according to the position information of the obstacle when the obstacle is detected to be present so as to avoid the obstacle to fly.
13. The method of claim 1, further comprising, after controlling the drone to land to the target obstacle avoidance landing point:
sending forced landing alarm information through cloud communication; wherein the forced landing alarm information comprises at least one of the following: the current position information of the unmanned aerial vehicle, geographic information of the landing environment, and information on the shortest route between the controller's position and the drone's current position.
14. A landing device of an unmanned aerial vehicle, characterized by comprising:
the system comprises a landing point determining module, a target obstacle avoidance landing point determining module and a target obstacle avoidance landing point determining module, wherein the landing point determining module is used for determining a target obstacle avoidance landing point in a passing area in the normal or forced landing flying process of the unmanned aerial vehicle when the unmanned aerial vehicle is detected to enter a forced landing mode;
the landing control module is used for controlling the unmanned aerial vehicle to land to the target obstacle avoidance landing point;
the target obstacle avoidance and landing point is determined according to a terrain image acquired in the flight process of the unmanned aerial vehicle.
15. A drone, characterized in that it comprises:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the landing method of a drone according to any one of claims 1-13.
16. A storage medium containing computer-executable instructions, which when executed by a computer processor, operate to perform a method of landing a drone according to any one of claims 1-13.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110292860.7A CN113031633B (en) | 2021-03-18 | 2021-03-18 | Unmanned aerial vehicle landing method and device, unmanned aerial vehicle and medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110292860.7A CN113031633B (en) | 2021-03-18 | 2021-03-18 | Unmanned aerial vehicle landing method and device, unmanned aerial vehicle and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113031633A true CN113031633A (en) | 2021-06-25 |
CN113031633B CN113031633B (en) | 2024-06-14 |
Family
ID=76471552
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110292860.7A Active CN113031633B (en) | 2021-03-18 | 2021-03-18 | Unmanned aerial vehicle landing method and device, unmanned aerial vehicle and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113031633B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113428371A (en) * | 2021-07-23 | 2021-09-24 | 中国科学院地理科学与资源研究所 | Unmanned aerial vehicle capable of performing emergency treatment and unmanned aerial vehicle emergency treatment method |
CN113485440A (en) * | 2021-07-31 | 2021-10-08 | 武夷科技信息(北京)有限公司 | Direction control method for landing flight of unmanned aerial vehicle |
CN113625759A (en) * | 2021-08-25 | 2021-11-09 | 深圳市道通智能航空技术股份有限公司 | Unmanned aerial vehicle forced landing control method and device, remote control device and storage medium |
CN114415701A (en) * | 2022-01-20 | 2022-04-29 | 安胜(天津)飞行模拟系统有限公司 | Emergency automatic landing method, device, equipment and storage medium |
CN114527800A (en) * | 2022-02-25 | 2022-05-24 | 广州极飞科技股份有限公司 | Unmanned aerial vehicle landing field selection method, device and system and readable storage medium |
CN114924585A (en) * | 2022-05-19 | 2022-08-19 | 广东工业大学 | Safe landing method and system of rotor unmanned aerial vehicle on rugged ground surface based on vision |
CN115167500A (en) * | 2022-05-19 | 2022-10-11 | 北京远度互联科技有限公司 | Unmanned aerial vehicle flight control method, device and system and readable storage medium |
CN115657731A (en) * | 2022-10-14 | 2023-01-31 | 重庆德明尚品电子商务有限公司 | Unmanned aerial vehicle flight control method based on power supply monitoring |
CN117201565A (en) * | 2023-10-11 | 2023-12-08 | 西安月之峰电子科技有限公司 | Internet-connected unmanned aerial vehicle management cloud platform based on 5G transmission |
WO2024046261A1 (en) * | 2022-08-30 | 2024-03-07 | 峰飞航空科技(昆山)有限公司 | Transport method, ground station, transport plane, and transport system |
- 2021-03-18 CN CN202110292860.7A patent/CN113031633B/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106054903A (en) * | 2016-07-27 | 2016-10-26 | 中南大学 | Multi-rotor unmanned aerial vehicle self-adaptive landing method and system |
CN106325299A (en) * | 2016-09-13 | 2017-01-11 | 上海顺砾智能科技有限公司 | Vision-based unmanned aerial vehicle return-flight landing method |
CN107291093A (en) * | 2017-07-04 | 2017-10-24 | 西北工业大学 | Unmanned aerial vehicle autonomous landing region selection method in complex environments based on visual SLAM |
CN109508029A (en) * | 2017-09-14 | 2019-03-22 | 比亚迪股份有限公司 | Unmanned aerial vehicle and its control method, device, system and computer-readable storage medium |
CN110515390A (en) * | 2018-05-22 | 2019-11-29 | 北京京东尚科信息技术有限公司 | Aircraft autonomous landing method and device, electronic device, and storage medium |
US20210012666A1 (en) * | 2019-07-09 | 2021-01-14 | The Boeing Company | Landing an unmanned aerial vehicle in a contingency scenario |
CN112306084A (en) * | 2019-07-30 | 2021-02-02 | 广州极飞科技有限公司 | Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium |
Non-Patent Citations (1)
Title |
---|
DANIEL FITZGERALD et al.: "A Vision Based Forced Landing Site Selection System for an Autonomous UAV", 2005 International Conference on Intelligent Sensors, Sensor Networks and Information Processing, pages 1-7 * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113428371A (en) * | 2021-07-23 | 2021-09-24 | 中国科学院地理科学与资源研究所 | Unmanned aerial vehicle capable of performing emergency treatment and unmanned aerial vehicle emergency treatment method |
CN113485440A (en) * | 2021-07-31 | 2021-10-08 | 武夷科技信息(北京)有限公司 | Direction control method for landing flight of unmanned aerial vehicle |
CN113625759A (en) * | 2021-08-25 | 2021-11-09 | 深圳市道通智能航空技术股份有限公司 | Unmanned aerial vehicle forced landing control method and device, remote control device and storage medium |
CN114415701A (en) * | 2022-01-20 | 2022-04-29 | 安胜(天津)飞行模拟系统有限公司 | Emergency automatic landing method, device, equipment and storage medium |
CN114527800A (en) * | 2022-02-25 | 2022-05-24 | 广州极飞科技股份有限公司 | Unmanned aerial vehicle landing field selection method, device and system and readable storage medium |
CN114924585A (en) * | 2022-05-19 | 2022-08-19 | 广东工业大学 | Safe landing method and system of rotor unmanned aerial vehicle on rugged ground surface based on vision |
CN115167500A (en) * | 2022-05-19 | 2022-10-11 | 北京远度互联科技有限公司 | Unmanned aerial vehicle flight control method, device and system and readable storage medium |
WO2024046261A1 (en) * | 2022-08-30 | 2024-03-07 | 峰飞航空科技(昆山)有限公司 | Transport method, ground station, transport plane, and transport system |
CN115657731A (en) * | 2022-10-14 | 2023-01-31 | 重庆德明尚品电子商务有限公司 | Unmanned aerial vehicle flight control method based on power supply monitoring |
CN115657731B (en) * | 2022-10-14 | 2023-12-19 | 国网甘肃省电力公司金昌供电公司 | Unmanned aerial vehicle flight control method based on power supply monitoring |
CN117201565A (en) * | 2023-10-11 | 2023-12-08 | 西安月之峰电子科技有限公司 | Internet-connected unmanned aerial vehicle management cloud platform based on 5G transmission |
CN117201565B (en) * | 2023-10-11 | 2024-08-20 | 合肥耀安科技有限公司 | Internet-connected unmanned aerial vehicle management cloud platform based on 5G transmission |
Also Published As
Publication number | Publication date |
---|---|
CN113031633B (en) | 2024-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113031633B (en) | Unmanned aerial vehicle landing method and device, unmanned aerial vehicle and medium | |
CN113494913A (en) | Unmanned aerial vehicle power inspection planning method and device, computer equipment and storage medium | |
JP6210522B2 (en) | Unmanned aircraft flight control method, flight data processing method, unmanned aircraft, and server | |
US9177481B2 (en) | Semantics based safe landing area detection for an unmanned vehicle | |
CN108508916B (en) | Control method, device and equipment for unmanned aerial vehicle formation and storage medium | |
CN107563373B (en) | Unmanned aerial vehicle landing area active safety detection method based on stereoscopic vision and application | |
CN109960269B (en) | Simplified environment modeling method serving autonomous navigation of unmanned aerial vehicle | |
CN108509820A (en) | Method for obstacle segmentation and device, computer equipment and readable medium | |
US10766616B2 (en) | Environment map automatic creation device | |
CN109816780B (en) | Power transmission line three-dimensional point cloud generation method and device of binocular sequence image | |
CN112650218A (en) | Transformer substation inspection route planning method and device based on collision detection | |
CN112106006A (en) | Control method and device for unmanned aerial vehicle and computer readable storage medium | |
CN112180955A (en) | Visual feedback-based secondary rechecking method and system for automatically polling unmanned aerial vehicle | |
CN112166394A (en) | Air route planning method, equipment, system and storage medium | |
CN110515390A (en) | Aircraft autonomous landing method and device, electronic device, storage medium | |
WO2021056139A1 (en) | Method and device for acquiring landing position, unmanned aerial vehicle, system, and storage medium | |
CN115100379A (en) | Fan blade transportation supervision method, system, equipment and medium | |
CN113566825A (en) | Unmanned aerial vehicle navigation method and system based on vision and storage medium | |
CN115752474A (en) | Robot navigation planning method and device under non-flat ground environment and robot | |
CN116661497A (en) | Intelligent aerocar | |
CN114387533A (en) | Method and device for identifying road violation, electronic equipment and storage medium | |
CN115599119A (en) | Unmanned aerial vehicle obstacle avoidance system | |
CN117649785B (en) | Unmanned aerial vehicle multi-operator distributed cooperative conflict resolving method and system | |
CN107609468B (en) | Class optimization aggregation analysis method for active safety detection of unmanned aerial vehicle landing area and application | |
CN113674343B (en) | Unmanned aerial vehicle inspection method, unmanned aerial vehicle inspection device, unmanned aerial vehicle inspection equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||