CN112306084A - Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium - Google Patents

Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium

Info

Publication number
CN112306084A
CN112306084A (application CN201910697458.XA); granted as CN112306084B
Authority
CN
China
Prior art keywords
area
unmanned aerial
aerial vehicle
forced landing
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910697458.XA
Other languages
Chinese (zh)
Other versions
CN112306084B (en)
Inventor
陈有生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN201910697458.XA priority Critical patent/CN112306084B/en
Publication of CN112306084A publication Critical patent/CN112306084A/en
Application granted granted Critical
Publication of CN112306084B publication Critical patent/CN112306084B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Embodiments of the invention provide a forced landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium. The method comprises the following steps: determining forced landing areas during the flight of the unmanned aerial vehicle; acquiring position information of each forced landing area; storing the position information to a memory; when a forced landing event is detected, reading the position information of each forced landing area from the memory; determining a target forced landing area according to the position information of each forced landing area; and controlling the unmanned aerial vehicle to perform a forced landing in the target forced landing area. By identifying and storing the position information of suitable forced landing areas throughout the flight, the unmanned aerial vehicle avoids having to spend a large amount of time searching for a landing area when a forced landing becomes necessary, a delay that could otherwise prevent it from landing in time. This shortens the time the unmanned aerial vehicle stays in the air during an emergency, allows it to perform a forced landing promptly in a suitable area, ensures a safe forced landing, and reduces damage during the landing.

Description

Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium
Technical Field
The embodiment of the invention relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle forced landing method, an unmanned aerial vehicle forced landing device, an unmanned aerial vehicle and a storage medium.
Background
As unmanned aerial vehicle technology has matured, unmanned aerial vehicles have gained wide application in many fields; for example, they can carry out flight operations such as surveying and mapping, plant protection, exploration, and aerial photography along a pre-planned route.
In the prior art, before the unmanned aerial vehicle takes off, a landing point is set in the planned route in advance, and after completing its flight task the unmanned aerial vehicle either returns to that landing point to land or lands on the spot.
However, during flight the unmanned aerial vehicle may encounter an emergency that requires a forced landing, for example insufficient battery power, a mechanical failure, or a sensor fault, so that it cannot reach the pre-set landing point. In that case, the unmanned aerial vehicle starts searching at random from its current position for an area where it can land, which takes a long time; it may even fail to find a suitable area before it can no longer stay airborne, causing it to crash and be damaged.
Disclosure of Invention
The invention provides a forced landing method and device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium, aiming to solve the problem that, in an emergency, an unmanned aerial vehicle needs a large amount of time to find an area where it can safely perform a forced landing.
In a first aspect, an embodiment of the present invention provides an unmanned aerial vehicle forced landing method, including:
determining a forced landing area in the flight process of the unmanned aerial vehicle;
acquiring the position information of the forced landing area;
storing the location information to a memory;
when a forced landing event is detected, reading the position information of each forced landing area from the memory;
determining a target forced landing area according to the position information of each forced landing area;
and controlling the unmanned aerial vehicle to force to land in the target forced landing area.
Optionally, the determining a target forced landing area according to the location information of each forced landing area includes:
determining a location of the drone;
determining the position of each forced landing area based on the position information;
calculating the distance from each forced landing area to the unmanned aerial vehicle by adopting the position of each forced landing area and the position of the unmanned aerial vehicle;
and determining the forced landing area corresponding to the minimum distance as a target forced landing area.
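The nearest-area selection described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the flat (x, y, z) coordinate representation and the function name are assumptions.

```python
import math

def nearest_landing_area(drone_pos, areas):
    """Pick the stored forced-landing area closest to the drone.

    drone_pos: (x, y, z) position in a local world frame (metres).
    areas: list of (x, y, z) positions previously saved to memory.
    Returns (index, distance) of the nearest area.
    """
    best_i, best_d = None, float("inf")
    for i, area in enumerate(areas):
        # Straight-line distance from the drone to the candidate area.
        d = math.dist(drone_pos, area)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

# Example: drone at 30 m altitude, three stored areas.
idx, dist = nearest_landing_area(
    (0.0, 0.0, 30.0),
    [(100.0, 0.0, 0.0), (10.0, 10.0, 0.0), (50.0, 50.0, 0.0)])
```

In a real system the stored positions would be geodetic coordinates, so a geodesic or projected-plane distance would replace `math.dist`.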
Optionally, the determining a target forced landing area according to the location information of each forced landing area includes:
determining a location of the drone;
determining the position of each forced landing area based on the position information;
planning a transition path from the unmanned aerial vehicle to each forced landing area according to the historical flight path;
and determining the forced landing area corresponding to the shortest transition path as a target forced landing area.
Optionally, planning a transition path from the drone to each forced landing area according to the historical flight path, including:
determining areas corresponding to a plurality of paths parallel to each other in the historical flight paths as safe areas;
and planning the transition path according to the safety area.
Optionally, planning a transition path from the drone to each forced landing area according to the historical flight path, including:
widening the historical flight path according to the flight width of the unmanned aerial vehicle to form a safety zone;
and planning the transition path according to the safety area.
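The widened-path safety zone can be illustrated with a simple corridor-membership check: a point is inside the zone if it lies within half the flight width of some segment of the historical path. The 2-D geometry and the function name are assumptions for illustration; the patent does not prescribe this representation.

```python
import math

def in_safety_corridor(point, path, half_width):
    """Check whether a point lies inside the corridor formed by widening
    the historical flight path by the drone's flight width.

    point: (x, y); path: list of (x, y) waypoints; half_width: half the
    corridor width in metres.
    """
    px, py = point
    for (x1, y1), (x2, y2) in zip(path, path[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        # Project the point onto the segment, clamped to its endpoints.
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / seg_len2))
        cx, cy = x1 + t * dx, y1 + t * dy
        if math.hypot(px - cx, py - cy) <= half_width:
            return True
    return False

inside = in_safety_corridor((5.0, 1.0), [(0.0, 0.0), (10.0, 0.0)], half_width=2.0)
outside = in_safety_corridor((5.0, 5.0), [(0.0, 0.0), (10.0, 0.0)], half_width=2.0)
```

A transition-path planner would then restrict its search to points for which this check succeeds.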
Optionally, the drone is configured with a depth of field sensor and a radar;
in the flight process of the unmanned aerial vehicle, the forced landing area is determined, and the method comprises the following steps:
when a forced landing event is detected, controlling a depth sensor to acquire a depth image of a coverage area of the depth sensor;
controlling a radar to acquire radar data of the unmanned aerial vehicle to the coverage area;
determining a candidate area with flatness meeting preset landing conditions from the coverage area according to the depth image and the radar data;
and determining a landing area from the candidate area according to the radar data, wherein the landing area is an area in which the penetration thickness of the radar signal is smaller than a preset thickness and the signal intensity of the reflected radar signal is larger than a preset threshold value.
Optionally, the controlling of the radar to acquire radar data of the drone with respect to the coverage area includes:
Acquiring the landing area of the unmanned aerial vehicle;
dividing the coverage area into a plurality of sub-areas based on the landing area and the depth image;
controlling the radar to transmit radar signals to a current sub-area and receiving echo signals reflected by the radar signals by the current sub-area, wherein the current sub-area is a sub-area right below the unmanned aerial vehicle;
and acquiring the radar data of the current sub-area based on the radar signal and the echo signal.
Optionally, the determining, from the coverage area according to the depth image and the radar data, a candidate area with flatness meeting a preset landing condition includes:
aiming at the current sub-area, acquiring depth values of all object points in the current sub-area based on the depth image;
determining whether the current subarea meets a preset landing condition or not based on the depth value of each object point and the first distance;
and if so, determining the current sub-region as a candidate region.
Optionally, the determining whether the current sub-region meets a preset landing condition based on the depth value of each object point and the first distance includes:
calculating the mean value and the variance of the depth values of all object points in the sub-area;
determining the maximum value and the minimum value of the depth values of all object points in the subarea;
calculating a difference value between the average value and the first distance to obtain a first difference value;
when the first difference value, the variance, the maximum value and the minimum value accord with a preset landing condition, determining that the sub-region accords with the preset landing condition;
when the first difference value, the variance, the maximum value and the minimum value do not accord with a preset landing condition, determining that the sub-region does not accord with the preset landing condition;
wherein the preset landing condition is at least one of the following conditions: the first difference is smaller than a first preset difference threshold, the variance is smaller than a preset variance threshold, the maximum value is smaller than a preset maximum value threshold, and the minimum value is larger than a preset minimum threshold.
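The statistical flatness test above (mean, variance, maximum, and minimum of the depth values, compared against thresholds) can be sketched as follows. All threshold values and names are illustrative assumptions, and this sketch takes the conservative reading that all four conditions must hold.

```python
import statistics

def meets_landing_condition(depths, radar_distance,
                            max_diff=0.5, max_var=0.05,
                            max_depth=31.0, min_depth=29.0):
    """Check whether a sub-area is flat enough to land on.

    depths: depth values (metres) of the object points in the sub-area.
    radar_distance: radar-measured range straight down (the "first
    distance" in the text). Threshold values are illustrative only.
    """
    mean = statistics.mean(depths)
    var = statistics.pvariance(depths)        # spread of the surface
    first_diff = abs(mean - radar_distance)   # depth image vs radar agreement
    return (first_diff < max_diff and var < max_var
            and max(depths) < max_depth and min(depths) > min_depth)

flat = meets_landing_condition([29.9, 30.0, 30.1, 30.0], radar_distance=30.0)
bumpy = meets_landing_condition([25.0, 30.0, 35.0, 30.0], radar_distance=30.0)
```

Intuitively, a small variance and a narrow max-min spread indicate a level surface, while a small first difference indicates that the depth image and the radar agree on the height of that surface.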
Optionally, the radar data includes a first distance and a second distance from the drone to a first surface and a second surface of the candidate area, respectively, and a signal strength of an echo signal of the first surface;
the determining a landing area from the candidate area according to the radar data includes:
calculating a difference value between the first distance and the second distance to obtain a second difference value;
judging whether the second difference is larger than a second preset difference threshold value or not;
if so, determining that the candidate area is not an area with the penetration thickness of the radar signal smaller than the preset thickness, and the candidate area is a non-landing area;
if not, determining that the candidate area is an area with the penetration thickness of the radar signal smaller than the preset thickness, and judging whether the signal intensity is smaller than a preset intensity threshold value or not;
when the signal intensity is greater than a preset intensity threshold value, determining that the candidate area is an area with the signal intensity of the reflected radar signal greater than the preset threshold value, wherein the candidate area is a landing area;
and when the signal intensity is smaller than a preset intensity threshold value, determining that the candidate area is not an area with the signal intensity of the reflected radar signal larger than the preset threshold value, wherein the candidate area is a non-landing area.
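The two-stage decision above (penetration depth, then echo strength) can be sketched as follows. The thresholds, the normalised echo-strength scale, and the function name are assumptions for illustration.

```python
def classify_candidate(first_distance, second_distance, echo_strength,
                       max_penetration=0.2, min_strength=0.6):
    """Decide whether a candidate area is safe to land on.

    first_distance / second_distance: radar ranges (metres) to the first
    and second reflecting surfaces; their difference approximates how far
    the radar signal penetrates the surface (large over water, vegetation,
    or other soft cover). echo_strength: strength of the echo from the
    first surface, here normalised to [0, 1]. Thresholds are placeholders.
    """
    penetration = second_distance - first_distance
    if penetration > max_penetration:
        return False   # signal penetrates too deep: soft or unstable surface
    if echo_strength <= min_strength:
        return False   # weak echo: surface absorbs or scatters the signal
    return True        # solid, reflective surface: usable landing area

solid = classify_candidate(30.0, 30.05, 0.9)   # firm ground
water = classify_candidate(30.0, 31.5, 0.9)    # deep penetration
```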
In a second aspect, an embodiment of the present invention provides an unmanned aerial vehicle forced landing apparatus, including:
the forced landing area determining module is used for determining a forced landing area in the flight process of the unmanned aerial vehicle;
the position information acquisition module is used for acquiring the position information of the forced landing area;
the position information storage module is used for storing the position information to a memory;
the position information reading module is used for reading the position information of each forced landing area from the memory when a forced landing event is detected;
the target forced landing area determining module is used for determining a target forced landing area according to the position information of each forced landing area;
and the forced landing execution module is used for controlling the unmanned aerial vehicle to carry out forced landing in the target forced landing area.
Optionally, the target forced landing area determining module includes:
a location determination submodule for determining a location of the drone;
the forced landing area position determining module is used for determining the position of each forced landing area based on the position information;
the distance calculation submodule is used for calculating the distance from each forced landing area to the unmanned aerial vehicle by adopting the position of each forced landing area and the position of the unmanned aerial vehicle;
and the first target forced landing area determining submodule is used for determining the forced landing area corresponding to the minimum distance as the target forced landing area.
Optionally, the target forced landing area determining module includes:
a location determination submodule for determining a location of the drone;
the forced landing area position determining module is used for determining the position of each forced landing area based on the position information;
the transition path planning module is used for planning a transition path from the unmanned aerial vehicle to each forced landing area according to the historical flight path;
and the second target forced landing area determining submodule is used for determining the forced landing area corresponding to the shortest transition path as the target forced landing area.
Optionally, the transition path planning module includes:
a safety region determining unit, configured to determine, as a safety region, a region corresponding to a plurality of paths parallel to each other in the historical flight path;
and the transition path planning unit is used for planning the transition path according to the safety area.
Optionally, the transition path planning module includes:
the safety region expanding unit is used for widening the historical flight path according to the flight width of the unmanned aerial vehicle to form a strip-shaped safety region;
and the transition path planning unit is used for planning the transition path according to the safety area.
Optionally, the forced landing area determining module includes:
the depth image acquisition sub-module is used for controlling the depth sensor to acquire a depth image of a coverage area of the depth sensor when the forced landing event is detected;
the radar data acquisition submodule is used for controlling a radar to acquire radar data of the unmanned aerial vehicle to the coverage area;
the candidate area determining submodule is used for determining a candidate area with the flatness meeting a preset landing condition from the coverage area according to the depth image and the radar data;
and the landing area determining submodule is used for determining a landing area from the candidate area according to the radar data, wherein the landing area is an area in which the penetration thickness of the radar signal is smaller than the preset thickness and the signal intensity of the reflected radar signal is larger than the preset threshold value.
Optionally, the radar data acquisition sub-module further includes:
the landing area acquisition unit is used for acquiring the landing area of the unmanned aerial vehicle;
a sub-region dividing unit configured to divide the coverage region into a plurality of sub-regions based on the landing area and the depth image;
the radar signal receiving and transmitting unit is used for controlling the radar to transmit radar signals to a current sub-area and receiving echo signals reflected by the radar signals by the current sub-area, and the current sub-area is a sub-area right below the unmanned aerial vehicle;
and the radar data acquisition unit is used for acquiring radar data based on the radar signal and the echo signal.
Optionally, the candidate region determining sub-module includes:
the depth value acquisition unit is used for acquiring the depth value of each object point in the current sub-area based on the depth image aiming at the current sub-area;
the landing condition judging unit is used for determining whether the current subarea meets the preset landing condition or not based on the depth value of each object point and the first distance;
and the candidate region determining unit is used for determining the current sub-region as a candidate region.
Optionally, the landing condition determining unit includes:
the mean value and variance calculating subunit is used for calculating the mean value and the variance of the depth values of all the object points in the sub-area;
the maximum value and minimum value determining subunit is used for determining the maximum value and the minimum value of the depth values of all object points in the subarea;
the first searching and calculating subunit is used for calculating a difference value between the mean value and the first distance to obtain a first difference value;
a landing condition conforming determining subunit, configured to determine that the sub-region conforms to a preset landing condition when the first difference, the variance, the maximum value, and the minimum value conform to the preset landing condition;
the non-landing-condition determining subunit is used for determining that the sub-region does not meet the preset landing condition when the first difference value, the variance, the maximum value and the minimum value do not meet the preset landing condition;
wherein the preset landing condition is at least one of the following conditions: the first difference is smaller than a first preset difference threshold, the variance is smaller than a preset variance threshold, the maximum value is smaller than a preset maximum value threshold, and the minimum value is larger than a preset minimum threshold.
Optionally, the landing zone determination sub-module includes:
the second difference calculation unit is used for calculating the difference between the first distance and the second distance to obtain a second difference;
a difference value judging unit, configured to judge whether the second difference value is greater than a second preset difference value threshold;
the first non-landing area determining unit is used for determining that the candidate area is not an area with the penetration thickness of the radar signal smaller than the preset thickness, and the candidate area is a non-landing area;
the signal intensity judging unit is used for judging whether the signal intensity is smaller than a preset intensity threshold value or not;
the landing area determining unit is used for determining that the candidate area is an area which cannot be penetrated by radar signals when the signal intensity is greater than a preset intensity threshold value, and the candidate area is a landing area;
and the second non-landing area determining unit is used for determining that the candidate area is not an area which cannot be penetrated by the radar signal when the signal intensity is smaller than a preset intensity threshold value, and the candidate area is a non-landing area.
In a third aspect, an embodiment of the present invention provides an unmanned aerial vehicle, where the unmanned aerial vehicle includes:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the forced landing method for an unmanned aerial vehicle as described in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the forced landing method for an unmanned aerial vehicle according to any one of the first aspect.
According to the forced landing method for an unmanned aerial vehicle provided by the embodiments of the invention, during flight the depth-of-field sensor and the radar are controlled to acquire ground data to determine forced landing areas; the position information of each forced landing area is acquired; and the position information is stored to a memory. By identifying and storing the position information of suitable forced landing areas throughout the flight, the unmanned aerial vehicle avoids having to spend a large amount of time searching for a landing area when a forced landing becomes necessary, a delay that could otherwise prevent it from landing in time. This shortens the time the unmanned aerial vehicle stays in the air during an emergency, allows it to perform a forced landing promptly in a suitable area, ensures a safe forced landing, and reduces damage during the landing.
Drawings
Fig. 1 is a flowchart illustrating steps of a method for forcing an unmanned aerial vehicle to land according to an embodiment of the present invention;
fig. 2A is a flowchart illustrating steps of a forced landing method for an unmanned aerial vehicle according to a second embodiment of the present invention;
FIG. 2B is a schematic view of a depth sensor according to an embodiment of the present invention;
FIG. 2C is a diagram illustrating radar ranging according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an unmanned aerial vehicle forced landing device provided in the fourth embodiment of the present invention.
Detailed Description
In order to make the technical problems solved, technical solutions adopted and technical effects achieved by the present invention clearer, the technical solutions of the embodiments of the present invention will be described in further detail below with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Fig. 1 is a flowchart of the steps of a forced landing method for an unmanned aerial vehicle according to the first embodiment of the present invention. This embodiment is applicable to the emergency forced landing of an unmanned aerial vehicle, and the method may be executed by the unmanned aerial vehicle forced landing device provided by the embodiments of the present invention, which may be implemented in software and/or hardware and integrated into an unmanned aerial vehicle. As shown in Fig. 1, the method specifically includes the following steps:
s101, determining a forced landing area in the flight process of the unmanned aerial vehicle.
In practical applications, the unmanned aerial vehicle is equipped with a depth-of-field sensor and a radar, both of which may face downward. The depth-of-field sensor has a certain field of view and can acquire a depth image of the area it covers on the ground.
Optionally, the depth-of-field sensor may be a camera, which may be one of a binocular camera, a monocular camera, or a multi-view camera; the camera captures an image, calculates the depth of field, and generates a depth image. The depth-of-field sensor may also be an array TOF (time-of-flight) sensor, a light-radar system that transmits light pulses from a transmitter toward its coverage area; the receiver determines the distance from each object point in the coverage area to the array TOF by measuring the time a light pulse takes to travel from the transmitter to the coverage area and back to the receiver, and thereby generates the depth image. The depth-of-field sensor may also be any other device capable of measuring distance. In addition, the radar may be a ranging radar for measuring the distance from the unmanned aerial vehicle to the ground directly below it, for example an ultrasonic radar or a millimeter-wave radar.
S102, acquiring the position information of the forced landing area.
In the embodiment of the present invention, the position coordinates of the forced landing area are its coordinates in the world coordinate system. They may be read from a positioning system arranged on the unmanned aerial vehicle; for example, the longitude, latitude, and height of the unmanned aerial vehicle are read as the position coordinates of the forced landing area, and these coordinates are then used as its position information.
In practical applications, a world coordinate system is constructed comprising an X axis, a Y axis, and a Z axis, where the plane formed by the X and Y axes is parallel to the ground and the Z axis is perpendicular to that plane. When the unmanned aerial vehicle flies to a certain position, its position over the ground is (x, y) and its flying height is z. If, while the position of the unmanned aerial vehicle is recorded as (x, y, z), it is determined that the area below it can serve as a forced landing area, then (x, y, z') is used as the position information of that area. The value of z' depends on the ground height determined by the radar: if the forced landing area is at ground level, z' is 0; if it is a hilltop or a valley, z' is derived from z.
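Deriving z' from the radar range and saving the area position can be sketched as follows. The helper name and the in-memory list standing in for the patent's "memory" are assumptions for illustration.

```python
from typing import List, Tuple

def record_landing_area(drone_xyz: Tuple[float, float, float],
                        radar_range: float,
                        memory: List[Tuple[float, float, float]]) -> None:
    """Save the position of a forced-landing area found below the drone.

    drone_xyz: drone position (x, y, z) in the world frame; z is altitude.
    radar_range: radar-measured distance from the drone straight down,
    used to derive the ground height z' = z - radar_range.
    memory: the in-flight store of landing-area positions.
    """
    x, y, z = drone_xyz
    ground_height = z - radar_range   # z' of the area under the drone
    memory.append((x, y, ground_height))

store: list = []
record_landing_area((120.0, 45.0, 30.0), radar_range=30.0, memory=store)  # flat ground
record_landing_area((200.0, 80.0, 30.0), radar_range=25.0, memory=store)  # rise 5 m above datum
```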
And S103, storing the position information to a memory.
After determining the location information of the forced landing area, the location information is stored to the memory, for example, the coordinate information of the forced landing area is stored to the memory.
And S104, when the forced landing event is detected, reading the position information of each forced landing area from the memory.
In the embodiment of the invention, a forced landing event may be an event that prevents the unmanned aerial vehicle from continuing normal flight, for example a mechanical fault, insufficient battery power, or a sensor fault. When such an event is detected, the position information of each forced landing area is read from the memory.
S105, determining a target forced landing area according to the position information of each forced landing area.
In one implementation, the position of the drone is determined; the position of each forced landing area is determined based on its position information; the distance from each forced landing area to the drone is calculated from those positions; and the forced landing area with the minimum distance is determined as the target forced landing area.
And S106, controlling the unmanned aerial vehicle to forcedly land in the target forced landing area.
According to the forced landing method for an unmanned aerial vehicle provided by this embodiment, during flight the depth-of-field sensor and the radar are controlled to acquire ground data to determine forced landing areas; the position information of each forced landing area is acquired; and the position information is stored to a memory. By identifying and storing the position information of suitable forced landing areas throughout the flight, the unmanned aerial vehicle avoids having to spend a large amount of time searching for a landing area when a forced landing becomes necessary, a delay that could otherwise prevent it from landing in time. This shortens the time the unmanned aerial vehicle stays in the air during an emergency, allows it to perform a forced landing promptly in a suitable area, ensures a safe forced landing, and reduces damage during the landing.
Example two
Fig. 2A is a flowchart of the steps of a forced landing method for an unmanned aerial vehicle according to the second embodiment of the present invention. This embodiment is optimized on the basis of the first embodiment and provides an exemplary implementation of determining a forced landing area and controlling the unmanned aerial vehicle to perform a forced landing in it. As shown in Fig. 2A, the method of the second embodiment may include the following steps:
s201, when a forced landing event is detected, controlling a depth sensor to acquire a depth image of a coverage area of the depth sensor.
In the embodiment of the invention, the forced landing event can be an event which causes that the unmanned aerial vehicle cannot continue to fly normally, for example, the forced landing event can be a mechanical fault, insufficient electric quantity or a sensor fault and the like.
In practical applications, the unmanned aerial vehicle is equipped with a depth-of-field sensor and a radar, both of which may face downward. The depth-of-field sensor has a certain field of view and can acquire a depth image of the area it covers on the ground.
Optionally, the depth-of-field sensor may be a camera, such as a monocular, binocular, or multi-view camera, and the depth image is generated by shooting an image with the camera and calculating the depth of field. For example, for a monocular camera with a known focal length, the coordinates of the unmanned aerial vehicle and the coordinates of a pixel point in the shot image can be combined to calculate, by the triangulation ranging principle, the distance from the corresponding object point to the unmanned aerial vehicle, that is, the depth value of the object point relative to the unmanned aerial vehicle; for a binocular camera, the depth value from the object point to the unmanned aerial vehicle can be calculated by the binocular (stereo) ranging principle. The depth image is then generated from these depth values.
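The binocular ranging principle mentioned above can be sketched as follows; the focal length, baseline, and disparity values are illustrative placeholders, not figures from the patent.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Stereo depth: an object point seen with disparity d by two
    cameras with focal length f (pixels) and baseline B (metres)
    lies at depth Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 50 px disparity with an 800 px focal length and a 0.2 m baseline
# puts the object point 3.2 m below the camera pair.
depth = depth_from_disparity(50.0, 800.0, 0.2)
```

Running this per pixel of the rectified image pair yields the depth image used in the following steps.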
The depth-of-field sensor may also be an array TOF (Time of Flight) sensor, which is a light-based radar system: the transmitter emits light pulses toward the coverage area, and the receiver determines the distance from an object point in the coverage area to the array TOF by measuring the round-trip travel time of the pulse, so that a depth image can be generated. The depth-of-field sensor may also be any other device capable of measuring distance.
S202, controlling the radar to acquire radar data from the unmanned aerial vehicle to the coverage area, and determining, from the coverage area according to the depth image and the first distance, a candidate area whose flatness meets a preset forced landing condition.
In the embodiment of the present invention, the radar may be a ranging radar used to measure the distance from the unmanned aerial vehicle to the ground directly below it, for example an ultrasonic radar or a millimeter-wave radar; the type of radar is not limited in the embodiment of the present invention.
The radar data may be a distance or a signal strength. To obtain it, the radar is controlled to transmit a radar signal toward the coverage area and to receive the echo signal formed when the coverage area reflects the radar signal, and the distance is calculated from the radar signal and the echo signal. Optionally, the radar signal and the echo signal may be mixed to generate a difference-frequency (beat) signal, from which the distance value is calculated; of course, in practical applications the distance may also be estimated from the time difference between transmitting the radar signal and receiving the echo.
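For a frequency-modulated continuous-wave radar, the difference-frequency signal maps linearly to range. A minimal sketch of that relationship, with illustrative sweep parameters not taken from the patent:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_beat(f_beat_hz, sweep_time_s, bandwidth_hz):
    """FMCW ranging: the beat (difference) frequency is proportional
    to the round-trip delay, so R = c * f_beat * T_sweep / (2 * B)."""
    return C * f_beat_hz * sweep_time_s / (2.0 * bandwidth_hz)

# A 100 kHz beat over a 1 ms, 250 MHz sweep corresponds to about 60 m.
r = range_from_beat(100e3, 1e-3, 250e6)
```

The same mapping is what turns the frequency points found after the FFT and detection processing below into the first and second distances.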
In practical applications, for an area covered by plants, both the plant surface and the ground reflect the radar signal and form echo signals, so the radar data may include a first distance from the unmanned aerial vehicle to a first surface of the current sub-area below it, a second distance from the unmanned aerial vehicle to a second surface of that sub-area, and the signal strength of the radar signal reflected by the first surface. Specifically, after the difference-frequency signal is generated from the radar signal and the echo signal, it is sampled, transformed by FFT, and then subjected to detection processing to determine the signal frequencies and signal amplitudes of the frequency points corresponding to the first surface and the second surface; the first distance and the second distance are calculated from the corresponding signal frequencies, and the signal amplitude at the frequency point of the first surface is taken as the signal strength.
For an area not covered by plants, a single distance, i.e. the first distance, is acquired from the radar signal and the echo signal, together with the signal strength of the echo signal.
In a possible implementation, step S202 may include the following sub-steps:
S2021, obtaining the landing area of the unmanned aerial vehicle.
The landing area of the unmanned aerial vehicle is related to its overall dimensions: the outline of the unmanned aerial vehicle is expanded outward by a safety distance to obtain the required landing region, and the area of this region is the landing area of the unmanned aerial vehicle.
S2022, dividing the coverage area into a plurality of sub-areas based on the landing area and the depth image.
Specifically, the area of the coverage area is calculated based on the depth image. When the area of the coverage area is larger than the landing area, the coverage area is divided into a plurality of contiguous sub-areas each equal in area to the landing area; when the area of the coverage area is smaller than the landing area, the whole coverage area is treated as a single sub-area.
As shown in Fig. 2B, when the depth-of-field sensor is a camera with view angle a and the height of the unmanned aerial vehicle above the coverage area is h1 (which can be obtained by measuring the distance from the unmanned aerial vehicle to the coverage area with the radar), the side length of the coverage area can be determined from the trigonometric relationship, and hence its area. The coverage area can then be divided into a plurality of contiguous sub-areas whose area equals the landing area of the unmanned aerial vehicle; if the height h1 is so low that the area of the coverage area is smaller than the landing area, the whole coverage area is directly taken as one sub-area.
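The trigonometric relationship of Fig. 2B can be sketched as follows, under the simplifying assumption of a square footprint; the heights, view angle, and landing area used are illustrative:

```python
import math

def coverage_side(height_m, view_angle_deg):
    """Side length of the (assumed square) ground footprint of the
    depth-of-field sensor: s = 2 * h * tan(a / 2)."""
    return 2.0 * height_m * math.tan(math.radians(view_angle_deg) / 2.0)

def count_subareas(height_m, view_angle_deg, landing_area_m2):
    """Number of landing-sized sub-areas the footprint is split into;
    below one landing area the whole footprint is a single sub-area."""
    footprint = coverage_side(height_m, view_angle_deg) ** 2
    return max(1, int(footprint // landing_area_m2))

# At 10 m height with a 90-degree view angle the footprint is ~20 m wide.
side = coverage_side(10.0, 90.0)
```

When `count_subareas` returns 1, the whole footprint is used as the current sub-area, matching the low-altitude case above.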
S2023, controlling the radar to transmit a radar signal toward the current sub-area and to receive the echo signal formed when the current sub-area reflects the radar signal, the current sub-area being the sub-area directly below the unmanned aerial vehicle.
Specifically, the sub-area directly below the unmanned aerial vehicle is the current sub-area; the unmanned aerial vehicle transmits a radar signal toward it and receives the echo signal formed when the current sub-area reflects the radar signal.
S2024, acquiring radar data of the current sub-area based on the radar signal and the echo signal, wherein the radar data comprises a first distance.
In the embodiment of the present invention, a first distance and a second distance from the drone to the current sub-area may be determined based on the echo signal and the radar signal, where the first distance and the second distance are distances from the drone to a first surface and a second surface of the current sub-area, respectively; and acquiring the signal strength of the echo signal reflected by the first surface based on the echo signal and the radar signal.
Specifically, after the difference-frequency signal is generated from the radar signal and the echo signal, it is sampled, transformed by FFT (Fast Fourier Transform), and then subjected to detection processing. If the current sub-area is not covered by plants, the distance from the first surface of the current sub-area to the unmanned aerial vehicle is determined as the first distance; if the current sub-area is covered by plants, the distance from the second surface to the unmanned aerial vehicle is additionally determined as the second distance. As shown in Fig. 2C, for such a sub-area a first distance h1 and a second distance h2 from the unmanned aerial vehicle can be obtained with the radar. In practical applications the radar signal is a continuous periodic signal: if the sub-area is covered by a plant, part of the radar signal is reflected once by the top surface of the plant (the first surface) to form one echo, while part penetrates the plant layer, reaches the ground, and is reflected by the ground (the second surface) to form another echo; the radar receives both parts as the echo signal. A difference-frequency signal is then generated from the radar signal and the received echo signal, sampled, and FFT-transformed into a swept-frequency signal containing a number of frequency points, each with a signal frequency and a signal amplitude. After detection processing, the frequency points corresponding to the first surface and the second surface can be identified in the swept-frequency signal and the distances calculated from their signal frequencies; since the signal amplitude also represents signal strength, the signal strength of the radar signal reflected by the first surface is obtained as well.
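A sketch of reading the two surfaces off the detected swept-frequency signal. Peak detection is reduced to picking the two strongest frequency points, and the frequency values, amplitudes, and sweep parameters are illustrative:

```python
def two_surface_ranges(points, sweep_time_s, bandwidth_hz, c=299_792_458.0):
    """points: (frequency_hz, amplitude) pairs of the detected
    swept-frequency signal. The two strongest points are kept; the
    lower beat frequency is the nearer first surface (e.g. canopy top),
    the higher one the second surface (e.g. ground). Returns
    (h1, h2, strength of the first-surface echo)."""
    top = sorted(points, key=lambda p: p[1], reverse=True)[:2]
    top.sort(key=lambda p: p[0])  # nearer surface first

    def to_range(f_hz):
        return c * f_hz * sweep_time_s / (2.0 * bandwidth_hz)

    h1, h2 = to_range(top[0][0]), to_range(top[-1][0])
    return h1, h2, top[0][1]
```

When only one strong frequency point exists (no plant layer), `top` holds a single point and the two distances coincide, matching the uncovered-ground case described above.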
S203, determining, from the coverage area according to the depth image and the radar data, a candidate area whose flatness meets the preset landing condition, and controlling the radar to acquire the second distance and the third distance from the unmanned aerial vehicle to the candidate area and the signal strength of the radar signal reflected by the candidate area.
Specifically, each pixel point of the depth image is associated with a depth value, which is the distance from the object point in the coverage area corresponding to that pixel to the unmanned aerial vehicle; thus the depth value from any object point in the coverage area to the unmanned aerial vehicle can be read from the corresponding pixel of the depth image.
For the coverage area of the depth-of-field sensor, an undetermined area may be chosen as the current sub-area according to the landing area required by the unmanned aerial vehicle; for example, the area directly below the unmanned aerial vehicle is taken as the current sub-area. The radar is then controlled to acquire radar data from the unmanned aerial vehicle to the current sub-area, which may include the first distance from the unmanned aerial vehicle to the surface of the current sub-area, and the flatness of the current sub-area is determined from the depth value of each object point in it together with that radar-measured first distance. The flatness expresses how much the distances from the object points in the current sub-area to the unmanned aerial vehicle deviate from one another.
Optionally, the flatness may be expressed by a number of parameters: the difference between the mean of the depth values and the first distance, the variance of the depth values, the maximum and minimum of the depth values, or a combination of two or more of these. When the flatness meets the preset landing condition, the current sub-area is determined to be a candidate area; otherwise the unmanned aerial vehicle flies on to the next undetermined area, takes it as the current sub-area, obtains its first distance, and again checks the preset landing condition, until a sub-area whose flatness meets the condition is determined as the candidate area.
In a possible implementation, step S203 may include the following sub-steps:
S2031, for the current sub-area, obtaining the depth value of each object point in it based on the depth image.
In the embodiment of the present invention, the sub-area below the unmanned aerial vehicle is taken as the current sub-area, and the pixel points corresponding to it are located in the depth image, so that the depth values of the object points in the current sub-area can be read from those pixel points.
S2032, determining whether the current sub-area meets the preset landing condition based on the depth values of the object points and the first distance. If yes, executing step S2033; if not, executing step S2034.
In the embodiment of the present invention, for the current sub-area, the mean and variance of the depth values of all object points in it are calculated, the difference between the mean and the first distance is calculated to obtain a first difference, and the maximum and minimum of the depth values of the object points are determined.
When the unmanned aerial vehicle is directly above the current sub-area, the first distance is the radar-measured distance from the unmanned aerial vehicle to the current sub-area, the mean is the arithmetic mean of the depth values of all object points in the sub-area, the variance expresses how much those depth values fluctuate, the maximum depth value expresses the distance from the lowest point of the sub-area to the unmanned aerial vehicle, and the minimum depth value expresses the distance from the highest point of the sub-area to the unmanned aerial vehicle.
In an embodiment of the present invention, the preset landing condition is at least one of the following conditions: the first difference is smaller than a first preset difference threshold, the variance is smaller than a preset variance threshold, the maximum is smaller than a preset maximum threshold, and the minimum is larger than a preset minimum threshold.
Specifically:
judging whether the difference between the mean and the radar-measured first distance is smaller than the first preset difference threshold: if so, the current sub-area as a whole is relatively flat; if not, the current sub-area is uneven;
judging whether the variance is smaller than the preset variance threshold: if so, the depth values of the current sub-area fluctuate little; if not, they fluctuate strongly;
judging whether the minimum depth value is larger than the preset minimum threshold: if so, there is no tall protruding object in the current sub-area; if not, the current sub-area may contain a tall protruding object, such as a telegraph pole standing on flat ground or a bare tree trunk;
judging whether the maximum depth value is smaller than the preset maximum threshold: if so, there is no deep pothole in the current sub-area; if not, the current sub-area may contain a large, deep pothole.
Among these conditions, the mean and the variance assess the flatness of the whole sub-area globally, while the maximum and minimum depth values assess it locally. In practical applications, whether the flatness of the current sub-area meets the preset landing condition can be determined by at least one of the mean, the variance, the maximum, and the minimum; if it does, S2033 is executed, otherwise S2034.
It should be noted that the preset difference threshold, the preset variance threshold, the preset maximum threshold, and the preset minimum threshold depend on the physical characteristics of the unmanned aerial vehicle itself: if its landing places only low demands on ground flatness, the preset difference threshold and the preset variance threshold may be larger; otherwise they should be smaller.
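The four judgments of S2032 can be sketched as a single check; all threshold values below are illustrative placeholders, to be tuned to the physical characteristics of the unmanned aerial vehicle as noted above:

```python
from statistics import mean, pvariance

def flatness_ok(depths, first_distance,
                diff_threshold=0.15, variance_threshold=0.05,
                max_threshold=None, min_threshold=None):
    """Preset landing condition of S2032: mean depth close to the
    radar-measured first distance, low variance, no deep pothole
    (max depth), no tall protrusion (min depth)."""
    if abs(mean(depths) - first_distance) >= diff_threshold:
        return False  # sub-area as a whole is not level
    if pvariance(depths) >= variance_threshold:
        return False  # depth values fluctuate too much
    if max_threshold is not None and max(depths) >= max_threshold:
        return False  # a deep pothole somewhere in the sub-area
    if min_threshold is not None and min(depths) <= min_threshold:
        return False  # a protruding object somewhere in the sub-area
    return True
```

A sub-area passing this check becomes a candidate area (S2033); a failing one is skipped and the next sub-area is examined (S2034).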
S2033, determining the current sub-region as a candidate region.
If the flatness of the current sub-area is determined, through at least one of the mean, the variance, the maximum, and the minimum, to meet the preset landing condition, the current sub-area is relatively flat and can be determined to be a candidate area, and S2041 is executed to further confirm whether the candidate area can serve as a landing area.
S2034, determining that the current sub-region is not a candidate region, and returning to S2023.
If the flatness of the current sub-area is determined, through at least one of the mean, the variance, the maximum, and the minimum, not to meet the preset landing condition, the current sub-area is unsuitable for landing and might damage the unmanned aerial vehicle if it landed there, so it cannot serve as a candidate area. The method returns to S2023 and traverses the coverage area: the unmanned aerial vehicle is controlled to fly to the next sub-area, which is taken as the current sub-area, and S2023-S2032 are repeated.
In the embodiment of the present invention, after the coverage area is divided into a plurality of sub-areas according to the landing area of the unmanned aerial vehicle, the depth data of each sub-area is determined from the depth image; the sub-areas are then traversed, and during traversal it is determined from the depth values of the object points in each sub-area and the first distance from the unmanned aerial vehicle to that sub-area whether its flatness meets the preset landing condition. A candidate area whose flatness meets the landing condition is thereby determined, the unmanned aerial vehicle is prevented from landing in an uneven area, safe landing is ensured, and damage to the unmanned aerial vehicle is reduced.
S204, determining a landing area from the candidate areas according to the radar data, and determining, from the candidate areas according to the second distance, the third distance, and the signal strength, an area whose surface is a preset surface as the forced landing area.
Specifically, for a candidate area whose flatness meets the preset landing condition, the first distance expresses the closest distance from the candidate area to the drone, for example the distance from the top surface of a plant in the candidate area to the drone. If the radar data also includes a second distance, the candidate area is covered by plants, and the second distance expresses the farthest distance from the candidate area to the drone, for example the distance from the ground of the candidate area to the drone. Whether the candidate area is covered by plants, i.e. whether it contains a layer penetrated by the radar signal, can therefore be determined from the first distance and the second distance. Optionally, the difference between the first distance and the second distance is calculated: if the difference is smaller than a preset value, the penetrated layer is thin or absent, meaning the plants in the candidate area are short or there is no plant cover at all (the difference being 0); if the difference is larger than the preset value, the candidate area is covered by taller plants, i.e. the penetrated layer is thick, and the area cannot serve as a landing area.
For a candidate area with short plants or no plant cover, that is, one whose penetrated layer is thin or absent, whether the surface is hard enough for landing, such as firm ground, can be determined from the signal strength of the echo signal. Specifically, when the signal strength is smaller than the preset strength threshold, the candidate area is determined to be a surface that reflects the radar signal only weakly, such as a water surface or a soft marsh; it is then not an area whose reflected signal strength exceeds the preset threshold and cannot serve as a landing area. Otherwise it is an area whose reflected signal strength exceeds the preset threshold, has a hard surface suitable for landing, and can serve as a landing area.
After the landing area is determined, the unmanned aerial vehicle can be controlled to land in the landing area, otherwise, the unmanned aerial vehicle is controlled to fly to the next sub-area in the coverage area, and the radar is controlled to acquire radar data of the sub-area until the landing area is determined.
In a possible implementation, step S204 may include the following sub-steps:
S2041, calculating the difference between the first distance and the second distance to obtain a second difference.
As shown in Fig. 2C, the difference between the second distance h2 and the first distance h1 is the height of the plants; that is, the second difference is the plant height, which equals the penetration thickness of the radar signal.
Of course, when the radar data includes only the first distance, it is determined that the candidate area contains no layer penetrated by the radar signal, i.e. the penetration thickness is 0; the candidate area is then an area whose penetration thickness is smaller than the preset thickness, and S2043 is executed directly without performing S2041.
S211, judging whether the second difference is larger than a second preset difference threshold; if so, executing S2042, and if not, executing S2043.
In practical applications, the second difference expresses the height of the plants in the candidate area, i.e. the penetration thickness of the radar signal. Because the plants in such an area are of fairly uniform height, the depth-of-field sensor and the radar may judge the area to be flat, so candidate areas that are in fact plant-covered must be excluded. Specifically, the second difference is compared with the second preset difference threshold, which is chosen as the plant height that does not affect the landing of the unmanned aerial vehicle when the candidate area is covered by plants; for example, short vegetation such as turf does not damage the unmanned aerial vehicle on landing. S2042 is executed when the second difference is larger than the second preset difference threshold, and S2043 when it is smaller.
S2042, determining that the candidate area is not an area whose radar-signal penetration thickness is smaller than the preset thickness; the candidate area is a non-landing area, and the method returns to S2023.
If the second difference is larger than the second preset difference threshold, the candidate area is covered by taller plants and landing there would damage the unmanned aerial vehicle; the candidate area is therefore determined to be unsuitable for landing and is marked as a non-landing area, and the method returns to S2023 to determine a new candidate area.
By obtaining the first distance and the second distance from the candidate area to the unmanned aerial vehicle with the radar, calculating their difference, and comparing the difference with the preset difference threshold, the candidate area is determined not to be an area whose penetration thickness is smaller than the preset thickness when the difference exceeds the threshold, and is therefore not a landing area. Flat-looking candidate areas that are in fact covered by taller plants are thus excluded, the unmanned aerial vehicle is prevented from landing in woods, farmland, and similar areas where it would be damaged, safe landing is ensured, and landing damage is reduced.
S2043, determining that the candidate area is an area whose radar-signal penetration thickness is smaller than the preset thickness, and judging whether the signal strength is smaller than the preset strength threshold.
If the second difference is smaller than the second preset difference threshold, the candidate area is an area whose radar-signal penetration thickness is smaller than the preset thickness: it may be covered by short plants, or be bare ground without plant cover, or be a water surface or marsh. For example, for bare ground, a water surface, or a marsh, the first distance and the second distance are equal and the difference may be 0. To exclude water surfaces and marshes, the fact that water and soft marsh reflect radar signals only weakly is exploited: whether the signal strength of the echo is smaller than the preset strength threshold is judged, and if so S2045 is executed, otherwise S2044.
S2044, determining that the candidate area is an area whose reflected radar signal strength is greater than the preset threshold; the candidate area is the landing area.
If the signal strength of the echo is greater than the preset strength threshold, the echo was formed by reflection from a hard surface with little signal loss, so the signal strength is high; the candidate area is relatively hard ground suitable for the unmanned aerial vehicle to land on, and is therefore the landing area.
S2045, determining that the candidate area is not an area whose reflected radar signal strength is greater than the preset threshold; the candidate area is a non-landing area.
If the signal strength of the echo is smaller than the preset strength threshold, the echo was formed by reflection from a soft surface with large signal loss, so the signal strength is low; the candidate area is a soft water surface or marsh unsuitable for the unmanned aerial vehicle to land on, and is therefore a non-landing area.
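The decision chain S2041/S211/S2042-S2045 for one flat candidate area can be sketched as follows; the canopy and strength thresholds are illustrative placeholders:

```python
def classify_candidate(h1, h2, echo_strength,
                       canopy_threshold=0.3, strength_threshold=0.2):
    """h1/h2: radar ranges to the first and second surface of the
    candidate area (equal when no layer is penetrated);
    echo_strength: strength of the first-surface echo."""
    if h2 - h1 > canopy_threshold:
        return "non-landing: tall vegetation"   # S2042
    if echo_strength < strength_threshold:
        return "non-landing: water or marsh"    # S2045
    return "landing area"                       # S2044
```

The string labels are only for illustration; in the method itself a non-landing result sends the unmanned aerial vehicle on to the next sub-area (back to S2023).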
After the candidate area is determined to be the landing area, the unmanned aerial vehicle can be controlled to land in it; otherwise the unmanned aerial vehicle is controlled to fly to the next sub-area of the coverage area of the depth-of-field sensor, that sub-area is taken as the current sub-area, and the method returns to step S2023.
S205, acquiring the position information of the forced landing area.
And S206, storing the position information into a memory.
And S207, when the forced landing event is detected, reading the position information of each forced landing area from the memory.
In the embodiment of the present invention, a forced landing event is an event that prevents the unmanned aerial vehicle from continuing normal flight, for example a mechanical fault, insufficient battery power, or a sensor fault; when such an event is detected, the position information of each forced landing area is read from the memory.
And S208, determining the position of each forced landing area based on the position information.
In one implementation, the position of the drone is determined; the position of each forced landing area is determined from the position information; the distance from each forced landing area to the drone is calculated from the position of each forced landing area and the position of the drone; and the forced landing area corresponding to the minimum distance is determined as the target forced landing area.
S209, calculating the distance from each forced landing area to the unmanned aerial vehicle by adopting the position of each forced landing area and the position of the unmanned aerial vehicle.
S210, determining the forced landing area corresponding to the minimum distance as the target forced landing area.
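Steps S208 through S210 can be sketched as follows, assuming the stored positions have been converted into a local metric frame; the area names and coordinates are illustrative:

```python
import math

def nearest_forced_landing_area(drone_xy, areas):
    """areas: mapping of area name -> (x, y) position read from the
    memory. Returns the name of the forced landing area with the
    smallest straight-line distance to the drone."""
    px, py = drone_xy
    return min(areas, key=lambda name: math.hypot(areas[name][0] - px,
                                                  areas[name][1] - py))

# With the drone at (3, 3), area "B" at (3, 4) is the closest target.
target = nearest_forced_landing_area((3.0, 3.0),
                                     {"A": (0.0, 0.0),
                                      "B": (3.0, 4.0),
                                      "C": (1.0, 1.0)})
```

For stored GPS coordinates a geodesic distance would replace the Euclidean one; the selection logic is otherwise unchanged.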
According to the forced landing method for an unmanned aerial vehicle provided by this embodiment, during flight the depth-of-field sensor and the radar are controlled to acquire ground data so as to determine a forced landing area; the position information of the forced landing area is acquired; and the position information is stored in a memory. Because the unmanned aerial vehicle determines and stores the positions of forced landing areas while flying, it does not need to spend a large amount of time searching for a landable area when a forced landing becomes necessary, which would otherwise prevent it from landing in time. The time the unmanned aerial vehicle stays in the air during an emergency is reduced, the unmanned aerial vehicle can perform a forced landing in a suitable area in time, a safe forced landing is ensured, and the damage caused by the forced landing is reduced.
Example three
Fig. 3 is a flowchart of the steps of a forced landing method for an unmanned aerial vehicle according to a third embodiment of the present invention. This embodiment is optimized on the basis of the first embodiment and provides an exemplary implementation of determining a forced landing area and controlling the unmanned aerial vehicle to make a forced landing in that area. Specifically, as shown in Fig. 3, the method of the third embodiment may include the following steps:
S301, determining a forced landing area during the flight of the unmanned aerial vehicle.
S302, acquiring the position information of the forced landing area.
And S303, storing the position information into a memory.
S304, when the forced landing event is detected, reading the position information of each forced landing area from the memory.
S305, determining the position of the unmanned aerial vehicle.
S306, determining the position of each forced landing area based on the position information.
The processes of steps S301 to S306 may refer to the implementations of the first or second embodiment and are not described again here.
S307, planning a transition path from the unmanned aerial vehicle to each forced landing area according to the historical flight path.
The historical flight path is the path the unmanned aerial vehicle has traversed during flight. It may be the historical flight path collected during the current flight task, or, when the unmanned aerial vehicle has long performed the same flight task, a previously collected historical flight path. Determining the historical flight path in the former way improves the accuracy of the forced landing area judgment, because determining forced landing areas is affected by the environment: an area may long be usable as a forced landing area but become unsuitable during the current flight task because of weather (rainfall, hail, and the like), and the former way excludes such areas, better protecting the safety of the forced landing. Determining the historical flight path in the latter way reduces the computational load of the computing assembly: collecting the position of the unmanned aerial vehicle in real time and updating its forced landing areas in real time from the images and radar information it sends back would generate a huge amount of data, placing high demands on the computing equipment carried by the unmanned aerial vehicle and raising its manufacturing cost.
The transition path refers to the path the unmanned aerial vehicle travels from its current position to a forced landing area. The transition path may be a straight line, or a broken line determined by other rules.
In a feasible implementation manner, the areas corresponding to a plurality of mutually parallel paths in the historical flight path are determined as safe areas, and the transition path is planned within the safe areas. The areas corresponding to the mutually parallel paths are the areas the unmanned aerial vehicle has already passed through before reaching its current position, and these traversed areas are determined as safe areas.
In another feasible implementation manner, the historical flight path is widened according to the flight width of the unmanned aerial vehicle to form a strip-shaped safe area, and the transition path is planned within that safe area. A transition path is generally expressed as a line segment; widening the historical path takes the size of the unmanned aerial vehicle into account, so that it can conveniently and smoothly fly to the forced landing area.
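The strip-widening check described above can be sketched as follows (an illustrative Python sketch only, not the patented implementation; planar coordinates and all identifiers are our assumptions). A point is inside the safe strip when its distance to the historical flight polyline is at most half the flight width:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to segment a-b (2D planar coordinates assumed)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:  # degenerate segment: a and b coincide
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def in_safe_strip(point, historical_path, flight_width):
    """True if `point` lies inside the strip formed by widening the
    historical flight path by half the flight width on each side."""
    half = flight_width / 2.0
    return any(
        point_segment_distance(point, historical_path[i], historical_path[i + 1]) <= half
        for i in range(len(historical_path) - 1)
    )
```

A transition path confined to points satisfying `in_safe_strip` stays over terrain the unmanned aerial vehicle has already overflown.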
S308, determining the forced landing area corresponding to the shortest transition path as a target forced landing area.
The length of each transition path is calculated and the shortest transition path is found; the forced landing area corresponding to the shortest transition path is determined as the target forced landing area.
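The selection of the shortest transition path can be sketched as follows (an illustrative Python sketch; representing each transition path as a list of 2D waypoints, and all names, are our assumptions, not part of the disclosure):

```python
import math

def polyline_length(path):
    """Total length of a transition path given as a list of 2D waypoints."""
    return sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))

def select_target_area(transition_paths):
    """transition_paths: {area_id: [waypoint, ...]} — one planned transition
    path per candidate forced landing area. Returns the id of the area whose
    transition path is shortest."""
    return min(transition_paths, key=lambda area: polyline_length(transition_paths[area]))
```

For example, with paths of length 5 and 2 to areas "A" and "B", `select_target_area` returns "B".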
S309, controlling the unmanned aerial vehicle to perform a forced landing in the target forced landing area.
The unmanned aerial vehicle is controlled to carry out the forced landing in the target forced landing area.
According to the forced landing method of the unmanned aerial vehicle provided by this embodiment, during the flight of the unmanned aerial vehicle, the depth of field sensor and the radar are controlled to acquire ground data so as to determine forced landing areas; the position information of the forced landing areas is acquired and stored in a memory. By determining and storing the position information of forced landing areas during flight, the unmanned aerial vehicle does not need to spend a large amount of time searching for an area where it can force-land when a forced landing is needed, avoiding the problem that it cannot land in time. The time the unmanned aerial vehicle stays in the air during an emergency is reduced, so that it can perform a forced landing in a suitable area in time, ensuring a safe forced landing and reducing damage during the forced landing.
Example four
Fig. 4 is a schematic structural diagram of an unmanned aerial vehicle forced landing apparatus provided in the fourth embodiment of the present invention. As shown in Fig. 4, the forced landing apparatus may specifically include:
the forced landing area determining module 41 is used for determining a forced landing area in the flight process of the unmanned aerial vehicle;
a position information obtaining module 42, configured to obtain position information of the forced landing area;
a location information storage module 43, configured to store the location information in a memory;
a position information reading module 44, configured to read position information of each forced landing area from the memory when a forced landing event is detected;
a target forced landing area determining module 45, configured to determine a target forced landing area according to the location information of each forced landing area;
and a forced landing execution module 46 for controlling the unmanned aerial vehicle to carry out forced landing in the target forced landing area.
Optionally, the target forced landing area determining module 45 includes:
a location determination submodule for determining a location of the drone;
the forced landing area position determining module is used for determining the position of each forced landing area based on the position information;
the distance calculation submodule is used for calculating the distance from each forced landing area to the unmanned aerial vehicle by adopting the position of each forced landing area and the position of the unmanned aerial vehicle;
and the first target forced landing area determining submodule is used for determining the forced landing area corresponding to the minimum distance as the target forced landing area.
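The distance-based selection performed by these submodules can be sketched as follows (an illustrative Python sketch; planar coordinates and all identifiers are our assumptions, not part of the disclosure):

```python
import math

def nearest_landing_area(drone_pos, area_positions):
    """area_positions: {area_id: (x, y)}. Returns the id of the forced
    landing area closest to the drone by straight-line distance, i.e. the
    area corresponding to the minimum distance."""
    return min(area_positions, key=lambda a: math.dist(drone_pos, area_positions[a]))
```

With the drone at the origin and areas at (3, 4) and (1, 1), the second area is selected.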
Optionally, the forced landing area determining module 41 includes:
the depth image acquisition sub-module is used for controlling the depth sensor to acquire a depth image of a coverage area of the depth sensor when the forced landing event is detected;
the radar data acquisition submodule is used for controlling a radar to acquire radar data of the unmanned aerial vehicle to the coverage area;
the candidate area determining submodule is used for determining a candidate area with the flatness meeting a preset landing condition from the coverage area according to the depth image and the radar data;
and the landing area determining submodule is used for determining a landing area from the candidate area according to the radar data, wherein the landing area is an area in which the penetration thickness of the radar signal is smaller than the preset thickness and the signal intensity of the reflected radar signal is larger than the preset threshold value.
Optionally, the target forced landing area determining module includes:
a location determination submodule for determining a location of the drone;
the forced landing area position determining module is used for determining the position of each forced landing area based on the position information;
the transition path planning module is used for planning a transition path from the unmanned aerial vehicle to each forced landing area according to the historical flight path;
and the second target forced landing area determining submodule is used for determining the forced landing area corresponding to the shortest transition path as the target forced landing area.
Optionally, the transition path planning module includes:
a safety region determining unit, configured to determine, as a safety region, a region corresponding to a plurality of paths parallel to each other in the historical flight path;
and the transition path planning unit is used for planning the transition path according to the safety area.
Optionally, the transition path planning module includes:
the safety region expanding unit is used for widening the historical flight path according to the flight width of the unmanned aerial vehicle to form a strip-shaped safety region;
and the transition path planning unit is used for planning the transition path according to the safety area.
Optionally, the radar data acquisition sub-module further includes:
the landing area acquisition unit is used for acquiring the landing area of the unmanned aerial vehicle;
a sub-region dividing unit configured to divide the coverage region into a plurality of sub-regions based on the landing area and the depth image;
the radar signal receiving and transmitting unit is used for controlling the radar to transmit radar signals to a current sub-area and receiving echo signals reflected by the radar signals by the current sub-area, and the current sub-area is a sub-area right below the unmanned aerial vehicle;
and the radar data acquisition unit is used for acquiring radar data based on the radar signal and the echo signal.
Optionally, the candidate region determining sub-module includes:
the depth value acquisition unit is used for acquiring the depth value of each object point in the current sub-area based on the depth image aiming at the current sub-area;
the landing condition judging unit is used for determining whether the current subarea meets the preset landing condition or not based on the depth value of each object point and the first distance;
and the candidate region determining unit is used for determining the current sub-region as a candidate region when the current sub-region meets the preset landing condition.
Optionally, the landing condition determining unit includes:
the mean value and variance calculating subunit is used for calculating the mean value and the variance of the depth values of all the object points in the sub-area;
the maximum value and minimum value determining subunit is used for determining the maximum value and the minimum value of the depth values of all object points in the subarea;
the first searching and calculating subunit is used for calculating a difference value between the mean value and the first distance to obtain a first difference value;
a landing condition conforming determining subunit, configured to determine that the sub-region conforms to a preset landing condition when the first difference, the variance, the maximum value, and the minimum value conform to the preset landing condition;
the non-landing-condition determining subunit is used for determining that the sub-region does not meet the preset landing condition when the first difference value, the variance, the maximum value and the minimum value do not meet the preset landing condition;
wherein the preset landing condition is at least one of the following conditions: the first difference is smaller than a first preset difference threshold, the variance is smaller than a preset variance threshold, the maximum value is smaller than a preset maximum value threshold, and the minimum value is larger than a preset minimum threshold.
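The flatness test carried out by the above subunits can be sketched as follows (an illustrative Python sketch; requiring all four thresholds to hold at once is one reading of the "at least one of" wording, chosen here for concreteness, and all names and threshold values are assumptions):

```python
import statistics

def meets_landing_condition(depths, first_distance,
                            diff_thresh, var_thresh, max_thresh, min_thresh):
    """Flatness test for one sub-area, given the depth values of its object
    points and the first distance. Computes the mean, variance, maximum and
    minimum of the depths and the first difference (mean minus first
    distance), then checks each against its preset threshold."""
    mean = statistics.fmean(depths)
    variance = statistics.pvariance(depths)
    d_max, d_min = max(depths), min(depths)
    first_diff = abs(mean - first_distance)
    return (first_diff < diff_thresh and variance < var_thresh
            and d_max < max_thresh and d_min > min_thresh)
```

A flat sub-area produces a small variance and a narrow max/min spread, so it passes; uneven terrain fails at least one check.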
Optionally, the landing zone determination sub-module includes:
the second difference calculation unit is used for calculating the difference between the first distance and the second distance to obtain a second difference;
a difference value judging unit, configured to judge whether the second difference value is greater than a second preset difference value threshold;
the first non-landing area determining unit is used for determining, when the second difference is greater than the second preset difference threshold, that the candidate area is not an area in which the penetration thickness of the radar signal is smaller than the preset thickness, and the candidate area is a non-landing area;
the signal intensity judging unit is used for judging, when the second difference is not greater than the second preset difference threshold, whether the signal intensity is smaller than a preset intensity threshold;
the landing area determining unit is used for determining, when the signal intensity is greater than the preset intensity threshold, that the candidate area is an area in which the signal intensity of the reflected radar signal is greater than the preset threshold, and the candidate area is a landing area;
and the second non-landing area determining unit is used for determining, when the signal intensity is smaller than the preset intensity threshold, that the candidate area is not an area in which the signal intensity of the reflected radar signal is greater than the preset threshold, and the candidate area is a non-landing area.
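The radar check performed by this sub-module can be sketched as follows (an illustrative Python sketch; interpreting the second difference as the second-surface distance minus the first-surface distance, and all names and thresholds, are our assumptions, not part of the disclosure):

```python
def classify_candidate(first_distance, second_distance, signal_strength,
                       diff_thresh, strength_thresh):
    """Radar-based classification of a candidate area. A large gap between
    the first- and second-surface echoes means the radar signal penetrated
    deeper than the preset thickness, so the surface is rejected; otherwise
    a sufficiently strong reflected echo marks a solid landing surface."""
    if (second_distance - first_distance) > diff_thresh:
        return "non-landing"  # penetration thickness exceeds the preset thickness
    if signal_strength > strength_thresh:
        return "landing"      # strong reflection: solid, load-bearing surface
    return "non-landing"      # weak reflection: unreliable surface
```

For instance, a small penetration gap with a strong echo is classified as a landing area, while a deep gap or a weak echo is rejected.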
The forced landing device of the unmanned aerial vehicle can execute the forced landing method of the unmanned aerial vehicle provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
An embodiment of the present invention further provides an unmanned aerial vehicle, including: one or more processors; a storage device, configured to store one or more programs, which when executed by the one or more processors, cause the one or more processors to implement the method for forced landing of a drone according to any embodiment of the present invention.
An embodiment of the present invention further provides a computer-readable storage medium, where instructions in the storage medium, when executed by a processor of a device, enable an unmanned aerial vehicle to perform the method for forcing the unmanned aerial vehicle to land according to the above method embodiment.
It should be noted that, for the device, the drone and the storage medium embodiment, since they are basically similar to the method embodiment, the description is simpler, and in relevant places, refer to the partial description of the method embodiment.
From the above description of the embodiments, it will be apparent to those skilled in the art that the present invention may be implemented by means of software plus necessary general-purpose hardware, and certainly may also be implemented by hardware, although the former is in many cases the preferred implementation. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a robot, a personal computer, a server, or a network device) to execute the forced landing method of the unmanned aerial vehicle according to any embodiment of the present invention.
It is worth noting that, in the above forced landing apparatus for unmanned aerial vehicle, each unit and each module included in the apparatus are only divided according to functional logic, but not limited to the above division, as long as the corresponding function can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by suitable instruction execution devices. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (13)

1. An unmanned aerial vehicle forced landing method is characterized by comprising the following steps:
determining a forced landing area in the flight process of the unmanned aerial vehicle;
acquiring the position information of the forced landing area;
storing the location information to a memory;
when a forced landing event is detected, reading the position information of each forced landing area from the memory;
determining target forced landing areas according to the position information of each forced landing area;
and controlling the unmanned aerial vehicle to force to land in the target forced landing area.
2. The unmanned aerial vehicle forced landing method according to claim 1, wherein the determining of the target forced landing area according to the position information of each forced landing area comprises:
determining a location of the drone;
determining the position of each forced landing area based on the position information;
calculating the distance from each forced landing area to the unmanned aerial vehicle by adopting the position of each forced landing area and the position of the unmanned aerial vehicle;
and determining the forced landing area corresponding to the minimum distance as a target forced landing area.
3. The unmanned aerial vehicle forced landing method according to claim 1, wherein the determining of the target forced landing area according to the position information of each forced landing area comprises:
determining a location of the drone;
determining the position of each forced landing area based on the position information;
planning a transition path from the unmanned aerial vehicle to each forced landing area according to the historical flight path;
and determining the forced landing area corresponding to the shortest transition path as a target forced landing area.
4. The method of claim 3, wherein planning the transition path of the unmanned aerial vehicle to each forced landing area according to the historical flight path comprises:
determining areas corresponding to a plurality of paths parallel to each other in the historical flight paths as safe areas;
and planning the transition path according to the safety area.
5. The method of claim 3, wherein planning the transition path of the unmanned aerial vehicle to each forced landing area according to the historical flight path comprises:
widening the historical flight path according to the flight width of the unmanned aerial vehicle to form a strip-shaped safety area;
and planning the transition path according to the safety area.
6. The forced landing method of a drone of any one of claims 1-5, wherein the drone is configured with a depth of field sensor and a radar;
in the flight process of the unmanned aerial vehicle, the forced landing area is determined, and the method comprises the following steps:
when a forced landing event is detected, controlling a depth sensor to acquire a depth image of a coverage area of the depth sensor;
controlling a radar to acquire radar data of the unmanned aerial vehicle to the coverage area;
determining a candidate area with flatness meeting preset landing conditions from the coverage area according to the depth image and the radar data;
and determining a landing area from the candidate area according to the radar data, wherein the landing area is an area in which the penetration thickness of the radar signal is smaller than a preset thickness and the signal intensity of the reflected radar signal is larger than a preset threshold value.
7. The method of forced landing of a drone of claim 6, wherein the controlling the radar to acquire radar data of the drone to the coverage area includes:
acquiring the landing area of the unmanned aerial vehicle;
dividing the coverage area into a plurality of sub-areas based on the landing area and the depth image;
controlling the radar to transmit radar signals to a current sub-area and receiving echo signals reflected by the radar signals by the current sub-area;
and acquiring the radar data of the current sub-area based on the radar signal and the echo signal.
8. The method of claim 6, wherein the determining a candidate area with a flatness meeting a preset landing condition from the coverage area according to the depth image and the radar data comprises:
aiming at the current sub-area, acquiring depth values of all object points in the current sub-area based on the depth image;
determining whether the current subarea meets a preset landing condition or not based on the depth value of each object point and the first distance;
and if so, determining the current sub-region as a candidate region.
9. The method of claim 8, wherein the determining whether the current sub-area meets the preset landing condition based on the depth values of the object points and the first distance comprises:
calculating the mean value and the variance of the depth values of all object points in the sub-area;
determining the maximum value and the minimum value of the depth values of all object points in the subarea;
calculating a difference value between the average value and the first distance to obtain a first difference value;
when the first difference value, the variance, the maximum value and the minimum value accord with a preset landing condition, determining that the sub-region accords with the preset landing condition;
when the first difference value, the variance, the maximum value and the minimum value do not accord with a preset landing condition, determining that the sub-region does not accord with the preset landing condition;
wherein the preset landing condition is at least one of the following conditions: the first difference is smaller than a first preset difference threshold, the variance is smaller than a preset variance threshold, the maximum value is smaller than a preset maximum value threshold, and the minimum value is larger than a preset minimum threshold.
10. The forced landing method of unmanned aerial vehicle of claim 6, wherein the radar data includes a first distance, a second distance, and a signal strength of an echo signal of the first surface of the unmanned aerial vehicle to a first surface and a second surface of the candidate area, respectively;
the determining a landing area from the candidate area according to the radar data includes:
calculating a difference value between the first distance and the second distance to obtain a second difference value;
judging whether the second difference is larger than a second preset difference threshold value or not;
if so, determining that the candidate area is not an area with the penetration thickness of the radar signal smaller than the preset thickness, and the candidate area is a non-landing area;
if not, determining that the candidate area is an area with the penetration thickness of the radar signal smaller than the preset thickness, and judging whether the signal intensity is smaller than a preset intensity threshold value or not;
when the signal intensity is greater than a preset intensity threshold value, determining that the candidate area is an area with the signal intensity of the reflected radar signal greater than the preset threshold value, wherein the candidate area is a landing area;
and when the signal intensity is smaller than a preset intensity threshold value, determining that the candidate area is not an area with the signal intensity of the reflected radar signal larger than the preset threshold value, wherein the candidate area is a non-landing area.
11. An unmanned aerial vehicle forced landing device, comprising:
the forced landing area determining module is used for determining a forced landing area in the flight process of the unmanned aerial vehicle;
the position information acquisition module is used for acquiring the position information of the forced landing area;
the position information storage module is used for storing the position information to a memory;
the position information reading module is used for reading the position information of each forced landing area from the memory when a forced landing event is detected;
the target forced landing area determining module is used for determining a target forced landing area according to the position information of each forced landing area;
and the forced landing execution module is used for controlling the unmanned aerial vehicle to carry out forced landing in the target forced landing area.
12. A drone, characterized in that it comprises:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-10.
13. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out the method of forcing the landing of a drone of any one of claims 1 to 10.
CN201910697458.XA 2019-07-30 2019-07-30 Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium Active CN112306084B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910697458.XA CN112306084B (en) 2019-07-30 2019-07-30 Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910697458.XA CN112306084B (en) 2019-07-30 2019-07-30 Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium

Publications (2)

Publication Number Publication Date
CN112306084A true CN112306084A (en) 2021-02-02
CN112306084B CN112306084B (en) 2022-12-27

Family

ID=74485127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910697458.XA Active CN112306084B (en) 2019-07-30 2019-07-30 Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN112306084B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105405165A (en) * 2015-10-21 2016-03-16 北京航空航天大学 In-flight landform analysis and forced landing area selection simulation system for universal unmanned plane
KR101507752B1 (en) * 2014-04-18 2016-12-29 주식회사 대한항공 Method for automatic landing of uav
CN107505625A (en) * 2017-08-29 2017-12-22 北醒(北京)光子科技有限公司 A kind of Air Vehicle Detection method
CN108153334A (en) * 2017-12-01 2018-06-12 南京航空航天大学 No cooperative target formula unmanned helicopter vision is independently maked a return voyage and drop method and system
KR101980978B1 (en) * 2017-11-27 2019-05-21 연세대학교 산학협력단 Method and apparatus for determining safety of landing-area for unmanned aerial vehicle using multiple uwb radars
CN109992001A (en) * 2019-04-22 2019-07-09 西安忠林世纪电子科技有限公司 A kind of unmanned plane safe falling method, apparatus and unmanned plane


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031633A (en) * 2021-03-18 2021-06-25 广州极飞科技股份有限公司 Landing method and device of unmanned aerial vehicle, unmanned aerial vehicle and medium
CN113031633B (en) * 2021-03-18 2024-06-14 广州极飞科技股份有限公司 Unmanned aerial vehicle landing method and device, unmanned aerial vehicle and medium
CN113428371A (en) * 2021-07-23 2021-09-24 中国科学院地理科学与资源研究所 Unmanned aerial vehicle capable of performing emergency treatment and unmanned aerial vehicle emergency treatment method
WO2023025200A1 (en) * 2021-08-25 2023-03-02 深圳市道通智能航空技术股份有限公司 Control method and apparatus for forced landing of unmanned aerial vehicle, remote control apparatus, and storage medium
CN113625759A (en) * 2021-08-25 2021-11-09 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle forced landing control method and device, remote control device and storage medium
CN116088559B (en) * 2021-11-05 2024-03-26 北京三快在线科技有限公司 Unmanned aerial vehicle control system and method and unmanned aerial vehicle
CN116088559A (en) * 2021-11-05 2023-05-09 北京三快在线科技有限公司 Unmanned aerial vehicle control system and method and unmanned aerial vehicle
TWI813085B (en) * 2021-12-02 2023-08-21 中光電智能機器人股份有限公司 Unmanned aerial vehicle and landing method for unmanned aerial vehicle
CN114578855A (en) * 2022-03-03 2022-06-03 北京新科汇智科技发展有限公司 Unmanned aerial vehicle standby landing method and system
CN114578855B (en) * 2022-03-03 2022-09-20 北京新科汇智科技发展有限公司 Unmanned aerial vehicle standby landing method and system
CN115167500A (en) * 2022-05-19 2022-10-11 北京远度互联科技有限公司 Unmanned aerial vehicle flight control method, device and system and readable storage medium
CN114636417A (en) * 2022-05-23 2022-06-17 珠海翔翼航空技术有限公司 Aircraft forced landing path planning method, system and equipment based on image recognition
CN115291630A (en) * 2022-10-09 2022-11-04 广东电网有限责任公司佛山供电局 Power inspection control method and device, processing unit and unmanned aerial vehicle system

Also Published As

Publication number Publication date
CN112306084B (en) 2022-12-27

Similar Documents

Publication Publication Date Title
CN112306084B (en) Forced landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium
CN109029422B (en) Method and device for building three-dimensional survey map through cooperation of multiple unmanned aerial vehicles
CN101034155B (en) Flight time measurement apparatus and method for increasing measurement rate
WO2018024177A1 (en) Method and device for terrain simulation flying of unmanned aerial vehicle and unmanned aerial vehicle
CN107076826B (en) Ultra-broadband ranging method and apparatus, barrier-avoiding method and avoidance equipment
US8788128B1 (en) Precision navigation for landing
CN105158762A (en) Identifying and tracking convective weather cells
CN109856625A (en) A kind of vessel position recognition methods based on multisource data fusion
KR101628154B1 (en) Multiple target tracking method using received signal strengths
US20170345317A1 (en) Dynamic routing based on captured data quality
KR102161246B1 (en) Bistatic sonar system having bottom mounted hydrophones and method for source deployment based on ocean environment using the same
CN112306082A (en) Method and device for determining landing area of unmanned aerial vehicle, unmanned aerial vehicle and storage medium
KR101027758B1 (en) The apparatus and method of creation a dem and dsm
CN108444336B (en) Unmanned aerial vehicle trapping method based on GNSS analog signals
KR102068760B1 (en) Unmanned aerial vehicle control device and mine detection system and mine detection method using the same
US11054503B2 (en) Radar target spherical projection method for maritime formation
CN108227739B (en) Close-range obstacle avoidance method of underwater automatic driving equipment and underwater automatic driving equipment
CN109612454A (en) Unmanned boat surveys and draws path calibration method, system and unmanned boat
CN112306083B (en) Unmanned aerial vehicle landing area determining method and device, unmanned aerial vehicle and storage medium
CN105954746A (en) Landform correction meter wave radar height measurement method based on broadcast automatic mutual supervisory signals
CN112306086B (en) Unmanned aerial vehicle landing area determining method and device, unmanned aerial vehicle and storage medium
KR101303774B1 (en) Flight obstacle extraction device, flight obstacle extraction method, and recording medium
CN105066986B (en) A kind of multi-mode uploads the Ground landing system and its control method of system
CN109799477A (en) A kind of sequential vehicle fingerprint localization method and device towards millimeter wave car networking
EP1259777B1 (en) A system and a method for navigating a vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant after: Guangzhou Jifei Technology Co.,Ltd.

Address before: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant before: Guangzhou Xaircraft Technology Co.,Ltd.

GR01 Patent grant