CN112650304B - Unmanned aerial vehicle autonomous landing system and method and unmanned aerial vehicle - Google Patents

Unmanned aerial vehicle autonomous landing system and method and unmanned aerial vehicle

Info

Publication number
CN112650304B
CN112650304B
Authority
CN
China
Prior art keywords
landing
unmanned aerial
aerial vehicle
height
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110074354.0A
Other languages
Chinese (zh)
Other versions
CN112650304A (en)
Inventor
王浩
牛欢
刘培宇
曾锐
张炯
杨志刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Commercial Aircraft Corp of China Ltd
Beijing Aeronautic Science and Technology Research Institute of COMAC
Original Assignee
Commercial Aircraft Corp of China Ltd
Beijing Aeronautic Science and Technology Research Institute of COMAC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Commercial Aircraft Corp of China Ltd, Beijing Aeronautic Science and Technology Research Institute of COMAC
Priority to CN202110074354.0A
Publication of CN112650304A
Application granted
Publication of CN112650304B
Legal status: Active

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 — Simultaneous control of position or course in three dimensions
    • G05D1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

The patent discloses an unmanned aerial vehicle autonomous landing system and method, and an unmanned aerial vehicle, belonging to the field of computer software. The invention raises the initial altitude from which the unmanned aerial vehicle can land autonomously and improves landing accuracy, avoiding the low accuracy and susceptibility to interference of prior-art approaches that rely entirely on GPS or a single camera for guidance. Landing target point information is acquired mainly by mounting at least two landing sensors with different recognition ranges, such as cameras with different focal lengths, and that information is used to control the unmanned aerial vehicle's landing. When the unmanned aerial vehicle is at a height suited to only one landing sensor, the landing target point is identified from the landing-related information collected by that sensor; when the environment suits at least two landing sensors, the landing-related information they collect is fused to identify the landing target point, and the unmanned aerial vehicle is finally controlled to land.

Description

Unmanned aerial vehicle autonomous landing system and method and unmanned aerial vehicle
Technical Field
The invention relates to the field of computers, in particular to an unmanned aerial vehicle autonomous landing system and method and an unmanned aerial vehicle.
Background
With the rapid development of the unmanned aerial vehicle industry, demand is no longer limited to small consumer-grade rotary-wing drones; demand from various industries for large composite-wing passenger or cargo unmanned aerial vehicles is increasingly prominent, and the requirements on the autonomy and accuracy of landing grow ever higher. Because GPS signals are easily blocked by obstructions and have limited accuracy, an unmanned aerial vehicle relying on GPS position information alone cannot reach the precision now required for autonomous, accurate landing at a fixed position. Beyond GPS information, therefore, the unmanned aerial vehicle needs additional information to assist the landing function.
As visual target recognition and positioning technology has developed and matured, combining visual target positioning with the prior art to help an unmanned aerial vehicle land autonomously and accurately at a fixed position has become increasingly effective. A small unmanned aerial vehicle flying at low altitude is easy to control, so a simple camera device is enough to assist it in landing autonomously at a fixed position. A composite-wing unmanned aerial vehicle, by contrast, is large, heavy, flies high and is not easy to control, so autonomous guidance must begin at a higher altitude. A camera with an ordinary focal length cannot meet the recognition distance the visual target positioning method requires at that altitude; a long-focal-length camera can, but when the unmanned aerial vehicle descends to low altitude, its narrow image range combined with the vehicle's swaying makes the target positioning algorithm prone to losing the target, so the auxiliary guidance fails. A target recognition and positioning method that works at both high and low altitude has therefore become a key technology for the autonomous landing function.
Disclosure of Invention
The invention provides an unmanned aerial vehicle autonomous landing system, an unmanned aerial vehicle autonomous landing method, and an unmanned aerial vehicle, which simultaneously solve the problems that the landing point cannot be recognized when the unmanned aerial vehicle is high and that the detection target is lost when it approaches the ground.
The embodiment of the invention provides an unmanned aerial vehicle autonomous landing system, which comprises:
the information acquisition module, which comprises at least two landing sensors with different identification ranges mounted on the unmanned aerial vehicle and collects landing-related information, wherein the identification ranges of the at least two landing sensors cover, sequentially and with overlap, each stage of the unmanned aerial vehicle's landing;
the landing point identification module, which identifies the landing target point using the landing-related information collected by a single landing sensor when the unmanned aerial vehicle's environment suits only that sensor, and which, when the environment suits at least two landing sensors, fuses the landing-related information they collect to identify the landing target point;
and the control module, which controls the unmanned aerial vehicle to land at the landing target point.
Optionally, the landing point identification module includes:
an information preprocessing unit, which preprocesses the landing-related information collected by the landing sensors;
a target identification unit, which detects the landing target point from the preprocessed landing-related information;
and a data processing unit, which calculates the relative position of the unmanned aerial vehicle and the landing target point detected by the target identification unit.
Optionally, in the data processing unit, when the unmanned aerial vehicle's altitude allows the landing-related information collected by at least two landing sensors to identify the landing target point, the relative position of the unmanned aerial vehicle and the landing target point is calculated separately for each sensor, and the relative positions obtained by the at least two landing sensors are weighted-averaged to obtain the final relative position.
Optionally, when the number of landing sensors is two, they are divided into a first landing sensor and a second landing sensor, the relative positions obtained by the two landing sensors are weighted-averaged, and the weights are determined as follows:
firstly, a first height threshold and a second height threshold are set, the first height threshold being larger than the second;
when the unmanned aerial vehicle's height is greater than the first height threshold, the relative position obtained by the first landing sensor is given the higher weight;
when the unmanned aerial vehicle's height is less than the first height threshold and greater than the second height threshold, the relative positions obtained by the first and second landing sensors carry equal weight;
and when the unmanned aerial vehicle's current height is less than the second height threshold, the relative position obtained by the second landing sensor is given the higher weight.
Optionally, the landing sensors are cameras with different focal lengths.
Optionally, the number of cameras is two: a short-focal-length wide-angle camera and a long-focal-length camera.
The embodiment of the invention provides an unmanned aerial vehicle autonomous landing method, which comprises the following steps:
acquiring and preprocessing landing-related information collected by at least two landing sensors, wherein the landing sensors have different identification ranges that cover, sequentially and with overlap, each stage of the unmanned aerial vehicle's landing;
performing landing target detection separately on the preprocessed landing-related information to obtain landing target position information;
when the unmanned aerial vehicle's height suits only a single landing sensor, controlling the unmanned aerial vehicle to land at the target position using the landing target position information obtained by that sensor;
when the unmanned aerial vehicle's height allows two or more landing sensors to be used, weighted-averaging the landing target position information obtained by those sensors to obtain optimized landing target position information, and controlling the unmanned aerial vehicle to land at the target position using the optimized information.
Optionally, when the number of landing sensors is two, they are divided into a first landing sensor and a second landing sensor, the landing target position information obtained by the two landing sensors is weighted-averaged, and the weights are determined as follows:
firstly, a first height threshold and a second height threshold are set, the first height threshold being larger than the second;
when the unmanned aerial vehicle's height is greater than the first height threshold, the relative position obtained by the first landing sensor is given the higher weight;
when the unmanned aerial vehicle's height is less than the first height threshold and greater than the second height threshold, the relative positions obtained by the first and second landing sensors carry equal weight;
and when the unmanned aerial vehicle's current height is less than the second height threshold, the relative position obtained by the second landing sensor is given the higher weight.
Optionally, the landing sensors are cameras with different focal lengths.
The embodiment of the invention provides an unmanned aerial vehicle that uses the above unmanned aerial vehicle autonomous landing system or unmanned aerial vehicle autonomous landing method when landing autonomously.
In the unmanned aerial vehicle autonomous landing system and method and the unmanned aerial vehicle provided here, at least two landing sensors with different recognition ranges are mounted on the unmanned aerial vehicle to collect landing-related information, and their recognition ranges cover, sequentially and with overlap, each stage of the unmanned aerial vehicle's landing. When the unmanned aerial vehicle's environment suits only a single landing sensor, the landing target point is identified from the landing-related information collected by that sensor; when the environment suits at least two landing sensors, the landing-related information they collect is fused to identify the landing target point; and finally the unmanned aerial vehicle is controlled to land at that point.
The invention uses at least two landing sensors with different recognition ranges to achieve autonomous, accurate landing from high altitude, solving the low accuracy and susceptibility to interference of landing guided entirely by GPS data. It also removes the limit that a single landing sensor (such as one camera) places on the initial landing height through its identification range. Through the cooperation of at least two landing sensors, the invention preserves landing accuracy, raises the initial height from which the unmanned aerial vehicle can land autonomously, and strengthens both the autonomy and the accuracy of the landing.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments of the present invention will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a block diagram of an unmanned aerial vehicle autonomous landing system in one embodiment of the invention.
FIG. 2 is a flow chart of an unmanned aerial vehicle autonomous landing method according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention provides an unmanned aerial vehicle autonomous landing system, as shown in FIG. 1, comprising an information acquisition module, a landing point identification module and a control module. The landing point identification module comprises an information preprocessing unit, a target identification unit and a data processing unit.
The information acquisition module comprises at least two landing sensors with different identification ranges mounted on the unmanned aerial vehicle. The landing sensors are arranged under the belly and collect landing-related information, including the position of the landing target point.
The identification ranges of the at least two landing sensors in the information acquisition module cover, sequentially and with overlap, each stage of the unmanned aerial vehicle's landing. "Sequentially" means that as the unmanned aerial vehicle descends from a higher height, each landing sensor has its own applicable height range within which it collects landing-related information for subsequent landing point identification. "With overlap" means the sensors are chosen so that, at certain heights, two or more of them can recognize the landing point simultaneously. The stages of landing can be divided by height, in combination with the specific identification range of each landing sensor.
In one embodiment of the present invention, the landing sensors are cameras installed under the belly, and the identification range refers to the camera's field of view, set chiefly by its focal length, so that clear images of the landing target can be acquired at different heights. The landing target point is a landing mark painted at the landing point.
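As a rough illustration of the trade-off between focal length and field of view, the sketch below applies the pinhole-camera relation for a nadir-pointing camera: footprint = height * sensor_width / focal_length. The sensor width (7.6 mm) and focal lengths (5 mm and 50 mm) are illustrative assumptions, not values from the patent.

    # Minimal sketch under assumed optics; none of these numbers are from the patent.
    def ground_footprint_m(height_m, sensor_width_mm, focal_length_mm):
        """Width of ground imaged by a nadir-pointing pinhole camera.
        Similar triangles: footprint / height = sensor_width / focal_length."""
        return height_m * sensor_width_mm / focal_length_mm

    for h in (100.0, 50.0, 20.0):
        wide = ground_footprint_m(h, 7.6, 5.0)    # short-focus wide-angle camera
        tele = ground_footprint_m(h, 7.6, 50.0)   # long-focus camera
        print(f"h={h:5.1f} m: wide-angle {wide:6.1f} m, telephoto {tele:5.1f} m")

At 100 m the long-focus camera still places enough pixels on the mark to recognize it, while at 20 m the wide-angle camera keeps the mark in frame despite the vehicle's swaying; this complementarity is what the multi-sensor arrangement exploits.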
The unmanned aerial vehicle autonomous landing system further comprises a landing point identification module. When the unmanned aerial vehicle's environment suits only a single landing sensor, the module identifies the landing target point using the landing-related information collected by that sensor; when the environment suits at least two landing sensors, it fuses the landing-related information they collect to identify the landing target point.
The landing point identification module specifically comprises:
the information preprocessing unit, which preprocesses the landing-related information collected by the landing sensors, improving the accuracy of the subsequent target identification unit;
the target identification unit, which detects the landing target point from the preprocessed landing-related information, the landing target point being an image mark with characteristic information painted at the landing point;
and the data processing unit, which, when at least two landing sensors are applicable, fuses the landing target points detected from those sensors, as obtained by the target identification unit, into the final landing target point.
In one embodiment of the present invention, in the data processing unit, when the unmanned aerial vehicle's altitude allows the landing-related information collected by at least two landing sensors to identify the landing target point, the relative position of the unmanned aerial vehicle and the landing target point is calculated separately for each sensor, and the relative positions obtained by the at least two landing sensors are weighted-averaged to obtain the final relative position.
In one embodiment provided by the invention, the number of landing sensors is two: a first landing sensor and a second landing sensor, where the landing heights covered by the first sensor's identification range are higher than those covered by the second's. In this embodiment, the relative positions are weighted-averaged as follows. First, a first height threshold and a second height threshold are set, the first being larger than the second. When the current aircraft altitude is greater than the first height threshold, the relative position obtained by the first landing sensor is given the higher weight; when the altitude is less than the first threshold and greater than the second, the relative positions obtained by the first and second landing sensors carry equal weight; and when the current altitude is less than the second threshold, the relative position obtained by the second landing sensor is given the higher weight.
Giving a higher weight to the relative position obtained by a certain landing sensor means that, in the averaging calculation, that sensor's relative position carries more weight than the relative positions obtained by the other landing sensors.
The weighted average yields the pixel position of the target in the image; a coordinate system conversion then turns the position of the target relative to the camera into the position of the target relative to the unmanned aerial vehicle.
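A minimal sketch of this conversion, assuming a downward-looking pinhole camera with known intrinsics (fx, fy, cx, cy), a height above the landing plane measured separately (e.g. by a rangefinder or barometer), and a hypothetical camera-to-body mounting offset; attitude compensation from the autopilot's IMU is omitted for brevity.

    import numpy as np

    def pixel_to_body_offset(u, v, height_m, fx, fy, cx, cy,
                             cam_to_body=np.zeros(3)):
        """Back-project pixel (u, v) at depth Z = height_m into the camera
        frame, then shift by the camera's mounting offset so the result is
        the target's position relative to the drone's body origin."""
        x_cam = (u - cx) * height_m / fx   # metres, camera frame
        y_cam = (v - cy) * height_m / fy
        target_cam = np.array([x_cam, y_cam, height_m])
        return target_cam + cam_to_body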
The unmanned aerial vehicle autonomous landing system further comprises a control module, which controls the unmanned aerial vehicle to land at the landing target point. From the landing target point identified by the target identification unit or the data processing unit, the module calculates the target's position relative to the unmanned aerial vehicle, feeds that relative position into the PID algorithm module, and controls the movement until the unmanned aerial vehicle descends directly above the landing beacon, achieving autonomous, accurate landing.
Example 1:
In one embodiment provided by the invention, the landing sensors in the information acquisition module are two cameras, specifically a short-focal-length wide-angle camera and a long-focal-length camera, so that the unmanned aerial vehicle can acquire images of the landing beacon without interruption throughout its descent from a higher flight altitude, for subsequent landing target identification. The landing target point is an image mark with characteristic information painted at the landing point.
The landing point identification module processes the images acquired by the two cameras and identifies the landing target point in them. When the unmanned aerial vehicle flies high, the image output by the short-focus wide-angle camera cannot identify the landing beacon, and only the image from the long-focus camera can be used to identify and locate the landing beacon on the ground. As the flight height decreases, the images from both cameras can capture the landing beacon. When the flight height drops below a certain point, the landing beacon in the long-focus camera becomes more and more blurred, and only the short-focus wide-angle camera can still image it.
The information preprocessing unit sequentially applies grayscale conversion, Gaussian filtering, median filtering, and edge-detection binarization to the images acquired by the two cameras; performing target detection on the processed images improves the stability and accuracy of the target detection and positioning algorithm.
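A sketch of that chain with OpenCV, assuming the common BGR frame layout; the kernel sizes and Canny thresholds are illustrative assumptions, not values from the patent.

    import cv2

    def preprocess(frame):
        """Grayscale -> Gaussian filter -> median filter -> edge-detection
        binarization, in the order given in the description above."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)  # suppress sensor noise
        denoised = cv2.medianBlur(blurred, 5)        # remove salt-and-pepper noise
        return cv2.Canny(denoised, 50, 150)          # binary edge map for detection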
The target identification unit performs target detection on the image processed by the information preprocessing unit, detecting the landing target point in the image; the landing target point is a landing mark painted at the landing point.
When only the image acquired by a single camera can identify the landing target point, the data processing unit calculates the pixel position in the image of the landing target point obtained by the target identification unit, then performs the coordinate system conversion that turns the position of the target relative to the camera into the position of the target relative to the unmanned aerial vehicle.
When the unmanned aerial vehicle is at a height at which the images acquired by both the short-focus wide-angle camera and the long-focus camera can identify the landing target point, the data processing unit calculates the pixel position of the landing target point in each image, performs the coordinate system conversion turning each target-to-camera position into a target-to-vehicle position, and weight-averages the two relative positions, as follows:
First, height threshold 1 is set to 50 meters and height threshold 2 to 20 meters. When the current aircraft height is greater than threshold 1, the short-focus camera's position coordinate carries weight 0.3 and the long-focus camera's weight 0.7; when the height is less than threshold 1 and greater than threshold 2, both carry weight 0.5; when the current height is less than threshold 2, the short-focus camera's weight is 0.7 and the long-focus camera's 0.3.
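Written out, this weight schedule reduces to a three-way lookup. The sketch below encodes exactly the numbers above and adds a fusion helper; the description leaves the behaviour at exactly 50 m or 20 m unspecified, so boundary heights here fall into the lower branch.

    def camera_weights(height_m, t1=50.0, t2=20.0):
        """(short-focus weight, long-focus weight) per this embodiment:
        favour the long-focus camera above threshold 1, the short-focus
        camera below threshold 2, and weight them equally in between.
        Ties at exactly t1 or t2 fall to the lower branch (the patent
        does not specify them)."""
        if height_m > t1:
            return 0.3, 0.7
        if height_m > t2:
            return 0.5, 0.5
        return 0.7, 0.3

    def fuse(pos_short, pos_long, height_m):
        """Weighted average of the two cameras' target position estimates."""
        w_s, w_l = camera_weights(height_m)
        return tuple(w_s * a + w_l * b for a, b in zip(pos_short, pos_long))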
The weighted average yields the pixel position of the target in the image; the coordinate system conversion then turns the position of the target relative to the camera into the position of the target relative to the unmanned aerial vehicle.
The control module feeds the relative position into the PID algorithm module and controls the movement until the unmanned aerial vehicle descends directly above the landing beacon, finally realizing visually guided autonomous, accurate landing at the fixed position.
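The description names a PID algorithm module but gives neither gains nor loop structure, so the following is only a schematic of how the relative position could drive horizontal velocity commands; the gains and the flight-controller interface are hypothetical.

    class PID:
        """Textbook single-axis PID; the gains are illustrative assumptions."""
        def __init__(self, kp=0.4, ki=0.02, kd=0.1):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_err = 0.0

        def step(self, err, dt):
            self.integral += err * dt
            deriv = (err - self.prev_err) / dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    # One controller per horizontal axis; the error is the target's position
    # relative to the drone, so driving it to zero centres the drone over the
    # beacon, after which the descent continues.
    pid_x, pid_y = PID(), PID()
    # vx_cmd = pid_x.step(rel_x, dt); vy_cmd = pid_y.step(rel_y, dt)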
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions.
In an embodiment of the present invention, an unmanned aerial vehicle autonomous landing method is provided, corresponding one-to-one with the unmanned aerial vehicle autonomous landing system of the above embodiment. As shown in FIG. 2, the method specifically includes:
and acquiring landing related information acquired by at least two landing sensors and preprocessing, wherein the landing sensors have different identification ranges, and the identification ranges sequentially and overlappingly cover each stage of unmanned aerial vehicle landing.
In one embodiment of the present invention, the landing sensors are cameras installed under the belly; the identification range refers to the camera's field of view, set chiefly by its focal length, so that clear images of the landing target can be acquired at different heights. In this embodiment the number of cameras is two, a short-focus wide-angle camera and a long-focus camera, and the landing target point is an image mark with characteristic information painted at the landing point.
The identification ranges cover, sequentially and with overlap, each stage of the unmanned aerial vehicle's landing. "Sequentially" means that as the unmanned aerial vehicle descends from a higher height, each landing sensor has its own applicable height range within which it collects landing-related information for subsequent landing point identification. "With overlap" means the sensors are chosen so that, at certain heights, two or more of them can recognize the landing point simultaneously. The stages of landing can be divided by height, in combination with each sensor's specific identification range. The short-focus wide-angle and long-focus cameras of this embodiment cover the stages in exactly this way: at a higher height, the image shot by the short-focus wide-angle camera cannot identify the landing target point on the ground, while the long-focus camera acquires the landing point clearly; at a low height, the long-focus camera can no longer acquire clear landing point information, while the image shot by the short-focus wide-angle camera can identify the landing target point; at heights in between, both cameras' images can identify the landing target point, i.e. their identification ranges overlap.
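If one assumes that the embodiment's 50 m and 20 m thresholds also mark roughly where each camera stops being usable (the description states them only as weighting thresholds), the overlapping coverage can be sketched as follows.

    def usable_cameras(height_m, tele_min=20.0, wide_max=50.0):
        """Return which cameras can resolve the landing mark at this height.
        The bounds reuse the embodiment's thresholds as assumed
        applicability limits; the patent does not state them as such."""
        cams = []
        if height_m >= tele_min:
            cams.append("long_focus")        # mark blurs below ~20 m
        if height_m <= wide_max:
            cams.append("short_focus_wide")  # mark too small above ~50 m
        return cams

Between 20 m and 50 m both cameras report the mark, which is exactly the overlap that the weighted average described below exploits.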
Preprocessing optimizes the image shot by the camera, specifically by applying, in sequence, grayscale conversion, Gaussian filtering, median filtering, and edge-detection binarization; performing target detection on the processed image improves the stability and accuracy of the target detection and positioning algorithm.
Landing target detection is then performed separately on the preprocessed landing-related information, i.e. in this embodiment the processed camera images, to obtain landing target position information; the landing target point is an image mark with characteristic information painted at the landing point.
When the unmanned aerial vehicle is at a height that suits only a single landing sensor, the landing target position information obtained by that sensor is used to control the unmanned aerial vehicle's landing at the target position. In this embodiment, when the unmanned aerial vehicle is high or low, only one camera's image can identify the landing target point on the ground, and the target's pixel position in that image is computed. The coordinate system conversion then turns the position of the target relative to the camera into the position of the target relative to the unmanned aerial vehicle. Finally, the relative position is fed into the PID algorithm, which controls the movement until the unmanned aerial vehicle descends directly above the landing beacon.
When the unmanned aerial vehicle is at a height at which the images shot by both cameras can identify the landing target point on the ground, the target's pixel positions in the two images are combined by weighted average, as follows:
First, height threshold 1 is set to 50 meters and height threshold 2 to 20 meters. When the current aircraft height is greater than threshold 1, the short-focus camera's position coordinate carries weight 0.3 and the long-focus camera's weight 0.7; when the height is less than threshold 1 and greater than threshold 2, both carry weight 0.5; when the current height is less than threshold 2, the short-focus camera's weight is 0.7 and the long-focus camera's 0.3.
The weighted average yields the pixel position of the target in the image; the coordinate system conversion then turns the position of the target relative to the camera into the position of the target relative to the unmanned aerial vehicle. Finally, the relative position is fed into the PID algorithm, which controls the movement until the unmanned aerial vehicle descends directly above the landing beacon, finally realizing visually guided autonomous, accurate landing at the fixed position.
In an embodiment of the present invention, an unmanned aerial vehicle is provided on which the unmanned aerial vehicle autonomous landing system of the above embodiment is installed, or which uses the unmanned aerial vehicle autonomous landing method of the above embodiment.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included in the scope of the present invention.

Claims (10)

1. An unmanned aerial vehicle autonomous landing system, comprising:
the information acquisition module, which comprises at least two landing sensors with different identification ranges mounted on the unmanned aerial vehicle and collects landing-related information, wherein the identification ranges of the at least two landing sensors cover, sequentially and with overlap, each stage of the unmanned aerial vehicle's landing, the stages being divided by height;
the landing point identification module, which identifies the landing target point using the landing-related information collected by a single landing sensor when the environment of the unmanned aerial vehicle suits only that sensor, and which, when the environment suits at least two landing sensors, fuses the landing-related information they collect to identify the landing target point;
the control module, which controls the unmanned aerial vehicle to land at the landing target point;
wherein, in the process of the unmanned aerial vehicle landing from a higher height, each landing sensor has its own applicable height range within which it collects landing-related information, the higher height being greater than a first height threshold; and the identification ranges can overlap, allowing two or more landing sensors to recognize the landing point when the unmanned aerial vehicle is at certain heights.
2. The unmanned aerial vehicle autonomous landing system of claim 1, wherein the landing point identification module comprises:
an information preprocessing unit, which preprocesses the landing-related information collected by the landing sensors;
a target identification unit, which detects the landing target point from the preprocessed landing-related information;
and a data processing unit, which calculates the relative position of the unmanned aerial vehicle and the landing target point detected by the target identification unit.
3. The unmanned aerial vehicle autonomous landing system of claim 2, wherein, in the data processing unit, when the altitude of the unmanned aerial vehicle allows the landing-related information collected by at least two landing sensors to identify the landing target point, the relative position of the unmanned aerial vehicle and the landing target point is calculated separately for each sensor, and the relative positions obtained by the at least two landing sensors are weighted-averaged to obtain the final relative position.
4. The unmanned aerial vehicle autonomous landing system according to claim 3, wherein, when the number of landing sensors is two, they are divided into a first landing sensor and a second landing sensor, the relative positions obtained by the two landing sensors are weighted-averaged, and the weights are determined as follows:
firstly, a first height threshold and a second height threshold are set, the first height threshold being larger than the second;
when the unmanned aerial vehicle's height is greater than the first height threshold, the relative position obtained by the first landing sensor is given the higher weight;
when the unmanned aerial vehicle's height is less than the first height threshold and greater than the second height threshold, the relative positions obtained by the first and second landing sensors carry equal weight;
and when the unmanned aerial vehicle's current height is less than the second height threshold, the relative position obtained by the second landing sensor is given the higher weight.
5. The unmanned aerial vehicle autonomous landing system of any one of claims 1 to 4, wherein the landing sensors are cameras with different focal lengths.
6. The unmanned aerial vehicle autonomous landing system of claim 5, wherein the number of cameras is two: a short-focal-length wide-angle camera and a long-focal-length camera.
7. An unmanned aerial vehicle autonomous landing method, comprising:
acquiring and preprocessing landing-related information collected by at least two landing sensors, wherein the landing sensors have different identification ranges that cover, sequentially and with overlap, each stage of the unmanned aerial vehicle's landing, the stages being divided by height;
performing landing target detection separately on the preprocessed landing-related information to obtain landing target position information;
when the unmanned aerial vehicle's height suits only a single landing sensor, controlling the unmanned aerial vehicle to land at the target position using the landing target position information obtained by that sensor;
when the unmanned aerial vehicle's height allows two or more landing sensors to be used, weighted-averaging the landing target position information obtained by those sensors to obtain optimized landing target position information, and controlling the unmanned aerial vehicle to land at the target position using the optimized information;
wherein, in the process of the unmanned aerial vehicle landing from a higher height, each landing sensor has its own applicable height range within which it collects landing-related information, the higher height being greater than a first height threshold; and the identification ranges can overlap, allowing two or more landing sensors to recognize the landing point when the unmanned aerial vehicle is at certain heights.
8. The unmanned aerial vehicle autonomous landing method of claim 7, wherein, when the number of landing sensors is two, they are divided into a first landing sensor and a second landing sensor, the landing target position information obtained by the two landing sensors is weighted-averaged, and the weights are determined as follows:
firstly, a first height threshold and a second height threshold are set, the first height threshold being larger than the second;
when the unmanned aerial vehicle's height is greater than the first height threshold, the relative position obtained by the first landing sensor is given the higher weight;
when the unmanned aerial vehicle's height is less than the first height threshold and greater than the second height threshold, the relative positions obtained by the first and second landing sensors carry equal weight;
and when the unmanned aerial vehicle's current height is less than the second height threshold, the relative position obtained by the second landing sensor is given the higher weight.
9. The unmanned aerial vehicle autonomous landing method of claim 7 or 8, wherein the landing sensors are cameras with different focal lengths.
10. An unmanned aerial vehicle, characterized in that the unmanned aerial vehicle autonomous landing system of any one of claims 1-6 or the unmanned aerial vehicle autonomous landing method of any one of claims 7-9 is used.
CN202110074354.0A 2021-01-20 2021-01-20 Unmanned aerial vehicle autonomous landing system and method and unmanned aerial vehicle Active CN112650304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110074354.0A CN112650304B (en) 2021-01-20 2021-01-20 Unmanned aerial vehicle autonomous landing system and method and unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN112650304A CN112650304A (en) 2021-04-13
CN112650304B (en) 2024-03-05

Family

ID=75370728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110074354.0A Active CN112650304B (en) 2021-01-20 2021-01-20 Unmanned aerial vehicle autonomous landing system and method and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN112650304B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113485440A (en) * 2021-07-31 2021-10-08 武夷科技信息(北京)有限公司 Direction control method for landing flight of unmanned aerial vehicle
CN113917934A (en) * 2021-11-22 2022-01-11 江苏科技大学 Unmanned aerial vehicle accurate landing method based on laser radar

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102749847A (en) * 2012-06-26 2012-10-24 清华大学 Cooperative landing method for multiple unmanned aerial vehicles
CN104215239A (en) * 2014-08-29 2014-12-17 西北工业大学 Vision-based autonomous unmanned plane landing guidance device and method
CN105335733A (en) * 2015-11-23 2016-02-17 西安韦德沃德航空科技有限公司 Autonomous landing visual positioning method and system for unmanned aerial vehicle
CN105468006A (en) * 2014-09-26 2016-04-06 空中客车防卫和太空有限责任公司 Redundant Determination of Positional Data for an Automatic Landing System
CN107399440A (en) * 2017-07-27 2017-11-28 北京航空航天大学 Aircraft lands method and servicing unit
CN108255190A (en) * 2016-12-28 2018-07-06 北京卓翼智能科技有限公司 Precision landing method based on multisensor and it is tethered at unmanned plane using this method
CN109562844A (en) * 2016-08-06 2019-04-02 深圳市大疆创新科技有限公司 The assessment of automatic Landing topographical surface and relevant system and method
CN109643129A (en) * 2016-08-26 2019-04-16 深圳市大疆创新科技有限公司 The method and system of independent landing
CN112229406A (en) * 2020-09-29 2021-01-15 中国航空工业集团公司沈阳飞机设计研究所 Redundancy guide full-automatic landing information fusion method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2527792B1 (en) * 2011-05-27 2014-03-12 EADS Deutschland GmbH Method for supporting a pilot when landing an aircraft in case of restricted visibility


Also Published As

Publication number Publication date
CN112650304A (en) 2021-04-13

Similar Documents

Publication Publication Date Title
CN106054929B (en) A kind of unmanned plane based on light stream lands bootstrap technique automatically
CN104808685A (en) Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
EP2413096B1 (en) Ground-based videometrics guiding method for aircraft landing or unmanned aerial vehicles recovery
KR100985195B1 (en) System for automatic taking off and landing of image based
CN111326023A (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
CN112650304B (en) Unmanned aerial vehicle autonomous landing system and method and unmanned aerial vehicle
CN104015931B (en) Vision localization, measurement and control method, system and experimental platform for automatic refueling dead zone of unmanned aerial vehicle
CN108153334B (en) Visual autonomous return and landing method and system for unmanned helicopter without cooperative target
CN110222612B (en) Dynamic target identification and tracking method for autonomous landing of unmanned aerial vehicle
CN105302151A (en) Aircraft docking guidance and type recognition system and method
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
CN110044212B (en) Rotor unmanned aerial vehicle capture recovery method based on vision measurement information
CN105549614A (en) Target tracking method of unmanned plane
CN103149939A (en) Dynamic target tracking and positioning method of unmanned plane based on vision
CN106127201A (en) A kind of unmanned plane landing method of view-based access control model positioning landing end
CN106697322A (en) Automatic abutting system and method for boarding bridge
CN109460046B (en) Unmanned aerial vehicle natural landmark identification and autonomous landing method
Lebedev et al. Accurate autonomous uav landing using vision-based detection of aruco-marker
RU2703412C2 (en) Automatic aircraft landing method
Zarandy et al. A novel algorithm for distant aircraft detection
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
US11816863B2 (en) Method and device for assisting the driving of an aircraft moving on the ground
CN115665553B (en) Automatic tracking method and device of unmanned aerial vehicle, electronic equipment and storage medium
US20230073120A1 (en) Method for Controlling an Unmanned Aerial Vehicle for an Inspection Flight to Inspect an Object and Inspection Unmanned Aerial Vehicle
CN116202489A (en) Method and system for co-locating power transmission line inspection machine and pole tower and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant