CN111766900A - System and method for high-precision autonomous landing of unmanned aerial vehicle and storage medium - Google Patents


Info

Publication number
CN111766900A
CN111766900A (application CN202010611271.6A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
target image
landing area
landing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010611271.6A
Other languages
Chinese (zh)
Inventor
李坤煌
和瑞江
张玉礼
冼永胜
陈淋燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen High Innovation Technology Co ltd
Original Assignee
Shenzhen High Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen High Innovation Technology Co ltd filed Critical Shenzhen High Innovation Technology Co ltd
Priority to CN202010611271.6A
Publication of CN111766900A
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/12 Target-seeking control

Abstract

The application is applicable to the field of computer technology and provides a system, a method, and a storage medium for high-precision autonomous landing of an unmanned aerial vehicle. The method comprises the following steps: acquiring a target image corresponding to the unmanned aerial vehicle to be landed, the target image being obtained by photographing an infrared beacon on the unmanned aerial vehicle with a camera located in the landing area of the unmanned aerial vehicle; calculating the position of the unmanned aerial vehicle based on the position of the infrared beacon in the target image and the center position of the landing area; and controlling the unmanned aerial vehicle to land from that position into the landing area. In this approach, the target image is captured by a camera installed in the landing area photographing the infrared beacon on the unmanned aerial vehicle; because the camera is very stable, the captured target image is clear and accurate. The position of the unmanned aerial vehicle can therefore be derived accurately from the target image, landing deviation is unlikely to occur, and the precision of autonomous landing is improved.

Description

System and method for high-precision autonomous landing of unmanned aerial vehicle and storage medium
Technical Field
The application belongs to the field of computer technology, and in particular relates to a system, a method, and a storage medium for high-precision autonomous landing of an unmanned aerial vehicle.
Background
With the development of science and technology, unmanned aerial vehicles are being applied ever more widely, and their control technology is maturing day by day. To improve the degree of automation of an unmanned aerial vehicle, autonomous landing is of great importance.
In a conventional landing method, an onboard camera photographs a landing pattern or an infrared beacon on the ground, and the landing point is determined from the position of that pattern or beacon so that the unmanned aerial vehicle can land autonomously. However, an unmanned aerial vehicle often shakes during flight, so the photographed position of the landing pattern or infrared beacon deviates, the landing point is located inaccurately, and the unmanned aerial vehicle consequently fails to land at the correct landing point.
Disclosure of Invention
In view of this, embodiments of the present application provide a system, a method, and a storage medium for high-precision autonomous landing of an unmanned aerial vehicle, so as to solve the problem that the unmanned aerial vehicle cannot land at an accurate landing point because the landing point is located inaccurately.
The first aspect of the embodiment of the application provides a method for high-precision autonomous landing of an unmanned aerial vehicle, which comprises the following steps:
acquiring a target image corresponding to the unmanned aerial vehicle to be landed; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle;
calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area;
and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle.
Optionally, calculating the position of the drone based on the corresponding position of the infrared beacon in the target image and the central position of the drone landing area includes:
according to the corresponding position of the infrared beacon in the target image and the central position, determining the height and the horizontal distance of the unmanned aerial vehicle relative to the central position;
determining a location of the drone based on the altitude and the horizontal distance.
Optionally, the target image comprises two frames captured at adjacent shooting times; the method for determining the position of the infrared beacon in the target image includes:
carrying out difference processing on the two frames of images to obtain a difference image;
and determining the corresponding position of the infrared beacon in the differential image according to the differential image.
Optionally, controlling the drone to land from the drone's location to the drone landing area includes:
detecting the heading of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to fly above the landing area of the unmanned aerial vehicle according to the heading and the horizontal distance of the unmanned aerial vehicle;
and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area according to the height.
Optionally, controlling the unmanned aerial vehicle to fly above the landing area according to its heading and the horizontal distance includes:
during flight, if the horizontal distance is detected to be decreasing, maintaining the current flight state and controlling the unmanned aerial vehicle to fly above the landing area;
and if the horizontal distance is detected to be increasing, re-acquiring a target image of the unmanned aerial vehicle, determining a new horizontal distance based on the new target image, and controlling the unmanned aerial vehicle to fly above the landing area according to the new horizontal distance.
Optionally, the controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area according to the height includes:
during landing, if it is detected that the unmanned aerial vehicle has moved beyond a threshold range around the central position in the vertical direction, controlling the unmanned aerial vehicle to fly back within the threshold range and then to land into the landing area;
and if it is detected that the unmanned aerial vehicle has not exceeded the threshold range, maintaining the current landing state and controlling the unmanned aerial vehicle to land into the landing area.
A second aspect of the embodiments of the present invention provides a device for high-precision autonomous landing of an unmanned aerial vehicle, the device including:
the acquisition unit is used for acquiring a target image corresponding to the unmanned aerial vehicle to be landed; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle;
the calculating unit is used for calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area;
and the control unit is used for controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle.
Optionally, the computing unit is specifically configured to:
according to the corresponding position of the infrared beacon in the target image and the central position, determining the height and the horizontal distance of the unmanned aerial vehicle relative to the central position;
determining a location of the drone based on the altitude and the horizontal distance.
Optionally, the target image comprises two frames captured at adjacent shooting times; determining the position of the infrared beacon in the target image includes:
carrying out difference processing on the two frames of images to obtain a difference image;
and determining the corresponding position of the infrared beacon in the differential image according to the differential image.
Optionally, the control unit includes:
the detection unit is used for detecting the heading of the unmanned aerial vehicle;
the flight control unit is used for controlling the unmanned aerial vehicle to fly above the landing area of the unmanned aerial vehicle according to the heading and the horizontal distance of the unmanned aerial vehicle;
and the landing control unit is used for controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area according to the height.
Optionally, the flight control unit is specifically configured to:
during flight, if the horizontal distance is detected to be decreasing, maintain the current flight state and control the unmanned aerial vehicle to fly above the landing area;
and if the horizontal distance is detected to be increasing, re-acquire a target image of the unmanned aerial vehicle, determine a new horizontal distance based on the new target image, and control the unmanned aerial vehicle to fly above the landing area according to the new horizontal distance.
Optionally, the landing control unit is specifically configured to:
during landing, if it is detected that the unmanned aerial vehicle has moved beyond a threshold range around the central position in the vertical direction, control the unmanned aerial vehicle to fly back within the threshold range and then land into the landing area;
and if it is detected that the unmanned aerial vehicle has not exceeded the threshold range, maintain the current landing state and control the unmanned aerial vehicle to land into the landing area.
A third aspect of the embodiments of the present invention provides a terminal comprising a processor, an input device, an output device, and a memory, which are connected to one another. The memory is used to store a computer program that supports the terminal in executing the above method; the computer program comprises program instructions, and the processor is configured to invoke the program instructions to execute the following steps:
acquiring a target image corresponding to the unmanned aerial vehicle to be landed; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle;
calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area;
and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of:
acquiring a target image corresponding to the unmanned aerial vehicle to be landed; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle;
calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area;
and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle.
The embodiment of the invention further provides a system for high-precision autonomous landing of an unmanned aerial vehicle, which comprises the above device for high-precision autonomous landing of an unmanned aerial vehicle and the above terminal.
The system, the method, and the storage medium for high-precision autonomous landing of an unmanned aerial vehicle described above have the following beneficial effects:
according to the embodiment of the application, a target image corresponding to the unmanned aerial vehicle to be landed is obtained; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle; calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area; and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle. In this application, shoot the infrared beacon on the unmanned aerial vehicle through the camera that sets up in the unmanned aerial vehicle landing zone and obtain the target image that unmanned aerial vehicle corresponds, because the camera is very stable, the target image that obtains of shooing is clear, accurate. According to the method for setting the camera and the infrared beacon, the position of the central point of the landing area of the unmanned aerial vehicle is determined without shooting a landing pattern or the infrared beacon on the ground through the camera, and the accurate position of the central point of the landing area of the unmanned aerial vehicle is directly obtained. Therefore, the position of the unmanned aerial vehicle obtained by calculation according to the position corresponding to the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area is very accurate, so that the unmanned aerial vehicle can land to the unmanned aerial vehicle landing area accurately according to the position of the unmanned aerial vehicle, landing deviation is not easy to occur, and the precision of autonomous landing of the unmanned aerial vehicle is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic view of a scene involved in a method for high-precision autonomous landing of an unmanned aerial vehicle provided by the present application;
fig. 2 is a flowchart of an implementation of a method for high-precision autonomous landing of an unmanned aerial vehicle according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a refinement step of S103 provided herein;
fig. 4 is a schematic view of a device for high-precision autonomous landing of a drone provided by an embodiment of the present application;
fig. 5 is a schematic diagram of a terminal according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Referring to fig. 1, fig. 1 is a schematic view of a scene involved in the method for high-precision autonomous landing of an unmanned aerial vehicle provided by the present application. As shown in fig. 1, the scene includes: an unmanned aerial vehicle, an infrared beacon installed on the unmanned aerial vehicle, and a camera and a terminal installed in the landing area of the unmanned aerial vehicle.
One or more infrared beacons may be installed at the bottom of the fuselage of the unmanned aerial vehicle. Illustratively, a single infrared beacon may be installed at the center of the bottom of the fuselage; two infrared beacons may be installed at the front and rear of the bottom of the fuselage; the arrangement is not limited and may be chosen according to the actual situation. The camera is installed in the landing area and photographs the infrared beacon on the unmanned aerial vehicle from below. Illustratively, the camera may be a wide-angle camera installed at the center of the landing area, shooting vertically upward at the infrared beacon. The terminal can exchange data with the unmanned aerial vehicle to control its flight, landing, and so on; the terminal can also exchange data with the camera, for example for image transmission and parameter setting.
It is worth noting that when the unmanned aerial vehicle rests on the ground, the bottom of its fuselage sits at a certain height above the ground; therefore, even when the camera is installed at the center of the landing area, the unmanned aerial vehicle can still park properly above the camera.
In a possible implementation, the infrared beacon on the unmanned aerial vehicle to be landed emits a light signal downward at a preset frequency, and the camera installed at the center of the landing area searches upward for the infrared beacon. When the unmanned aerial vehicle flies within the range in which the camera can detect the beacon, the camera photographs the beacon above it and obtains the target image corresponding to the unmanned aerial vehicle to be landed. The camera transmits the captured target image to the terminal, which analyzes the position of the infrared beacon in the target image. The terminal then calculates the position of the unmanned aerial vehicle based on that position and the previously acquired center position of the landing area, and controls the unmanned aerial vehicle to land from the calculated position into the landing area.
In the above implementation, the camera installed in the landing area photographs the infrared beacon on the unmanned aerial vehicle to obtain the corresponding target image; because the camera is very stable, the captured target image is clear and accurate. With this arrangement of camera and infrared beacon, there is no need to determine the center point of the landing area by photographing a landing pattern or a ground beacon with an onboard camera; the exact center position of the landing area is obtained directly. The position of the unmanned aerial vehicle computed from the beacon's position in the target image and the center position of the landing area is therefore very accurate, so the unmanned aerial vehicle can land precisely in the landing area, landing deviation is unlikely to occur, and the precision of autonomous landing is improved.
Referring to fig. 2, fig. 2 is a schematic flowchart of a method for high-precision autonomous landing of an unmanned aerial vehicle according to an embodiment of the present application. The main execution body of the method in this embodiment is a terminal, and the terminal includes, but is not limited to, a mobile terminal such as a smart phone, a tablet computer, a Personal Digital Assistant (PDA), and the like, and may also include a terminal such as a desktop computer. The method as shown in fig. 2 may include:
s101: acquiring a target image corresponding to the unmanned aerial vehicle to be landed; the target image is obtained by a camera photographing an infrared beacon on the unmanned aerial vehicle, and the camera is located in the landing area of the unmanned aerial vehicle.
The target image is an image obtained by upwards shooting an infrared beacon on the unmanned aerial vehicle by a camera positioned in an unmanned aerial vehicle landing area.
Illustratively, the camera installed at the center of the landing area continuously searches for the infrared beacon within its search range, so that the infrared light emitted by the beacon can be captured in real time. When the unmanned aerial vehicle awaiting landing receives a return instruction, it begins its return flight, and the infrared beacon installed on it continuously emits infrared light downward at a preset frequency. When the unmanned aerial vehicle flies into the search range of the camera, the camera photographs the beacon from below, obtaining the target image corresponding to the unmanned aerial vehicle to be landed; this can be understood as the infrared light emitted by the beacon entering the camera and forming an image on its photosensitive sensor. The camera sends the captured target image to the terminal, and the terminal receives it.
S102: and calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area.
The landing area of the unmanned aerial vehicle is preset, so the center position of the landing area can be obtained accurately; and when the camera is installed at the center of the landing area, the position of the camera can likewise be obtained accurately.
The terminal analyzes the target image to obtain the position of the infrared beacon in it. Illustratively, if the currently captured target image is a single frame, the terminal analyzes the beacon's position in that frame. If the target image comprises two frames captured at adjacent times, the terminal performs difference processing on the two frames to obtain a difference image, and the beacon position found by analyzing the difference image is taken as the position of the infrared beacon in the target image.
Illustratively, when the camera is installed vertically upward at the center of the landing area, the terminal uses the position of the camera to represent the center position of the landing area and calculates the position of the unmanned aerial vehicle based on the position of the infrared beacon in the target image and the position of the camera.
For example, when the camera is vertically installed at the center of the landing area of the drone, the above S102 may include S1021-S1022, specifically as follows:
s1021: and determining the height and the horizontal distance of the unmanned aerial vehicle relative to the central position according to the corresponding position of the infrared beacon in the target image and the central position.
When there is one infrared beacon installed on the drone, the infrared beacon appears as one point in the target image. Illustratively, when the target image is a single-frame image, the terminal acquires the target image, establishes a coordinate system on the target image, and determines the corresponding coordinates of the infrared beacon in the coordinate system by taking the central point of the coordinate system as an origin. And the terminal acquires a camera coordinate system corresponding to the camera, converts the coordinate corresponding to the infrared beacon according to the camera coordinate system as a reference, and obtains the pixel coordinate corresponding to the infrared beacon in the camera coordinate system, namely the corresponding position of the infrared beacon in the target image.
Illustratively, the target image may consist of two frames captured at adjacent times, that is, two consecutive photographs of the infrared beacon on the drone. In this case, the terminal performs difference processing on the two frames to obtain a difference image and determines the position of the infrared beacon from it. For example, let the two frames be a first image and a second image, the first captured earlier than the second. For each pixel, the terminal subtracts the value of the corresponding first-image pixel from the value of the second-image pixel (or vice versa), takes the absolute value of the difference, and generates the difference image from these absolute values; each pixel in the second image has a corresponding pixel in the first image. Alternatively, the first and second images may be passed to a preset function of the terminal that performs the difference processing and returns the difference image. The choice is not limited and may be made according to the actual situation.
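As an illustrative sketch of the differencing step just described (using NumPy; the function names and the brightness threshold are assumptions, not taken from the patent), the beacon can be located as the centroid of the bright pixels in the absolute difference of two consecutive frames:

```python
import numpy as np

def difference_image(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Absolute per-pixel difference of two consecutive grayscale frames."""
    # Cast to a signed type first so the subtraction cannot wrap around.
    return np.abs(second.astype(np.int16) - first.astype(np.int16)).astype(np.uint8)

def beacon_position(diff: np.ndarray, threshold: int = 200):
    """Centroid (u, v) of the pixels that changed brightly between frames."""
    ys, xs = np.nonzero(diff >= threshold)
    if xs.size == 0:
        return None  # beacon not visible in this frame pair
    return (float(xs.mean()), float(ys.mean()))
```

Because the beacon blinks at a preset frequency, it dominates the change between adjacent frames, while static background detail cancels out in the difference.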
Differencing the two frames effectively removes impurities, noise, and the like from the captured images, so the derived beacon position is more accurate, and the position of the unmanned aerial vehicle calculated from the beacon's position in the target image and the center position of the landing area is in turn more accurate.
The process of determining the corresponding position of the infrared beacon in the differential image by the terminal is similar to the process of determining the corresponding position of the infrared beacon in the target image, namely, a coordinate system is established on the differential image, and the coordinate corresponding to the infrared beacon in the coordinate system is determined by taking the central point of the coordinate system as the origin. And the terminal acquires a camera coordinate system corresponding to the camera, converts the coordinate corresponding to the infrared beacon according to the camera coordinate system as a reference, and obtains the pixel coordinate corresponding to the infrared beacon in the camera coordinate system, namely the corresponding position of the infrared beacon in the difference image. The corresponding position of the infrared beacon in the difference image can be used to represent the corresponding position of the infrared beacon in the target image.
A virtual line connects the infrared beacon with the optical center of the camera; this line forms an included angle with the camera's vertical optical axis, which characterizes how far the infrared beacon deviates from the optical center. Since the position of the camera can represent the center position of the landing area, the included angle can also be understood as the degree to which the beacon deviates from the center of the landing area. Specifically, the terminal may calculate the included angle from the pixel coordinates of the infrared beacon in the target image as follows:

x_k = a · (u - u_0) / f_x,    y_k = a · (v - v_0) / f_y

where f_x and f_y denote the pixel focal lengths of the camera, (u_0, v_0) denotes the pixel center of the target image, a denotes a preset calculation coefficient, and (u, v) denotes the pixel coordinates of the infrared beacon in the target image. Substituting the values of x_k and y_k into

θ = arctan( sqrt( x_k^2 + y_k^2 ) )

yields the value of θ, i.e., the value of the included angle.
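The angle computation above can be sketched in Python as follows. This is an illustrative sketch only: the preset coefficient a is taken as 1, and the function and variable names are assumptions, not from the patent:

```python
import math

def off_axis_angle(u, v, fx, fy, u0, v0):
    """Included angle (radians) between the beacon ray and the camera's
    vertical optical axis, from the beacon's pixel coordinates (u, v)."""
    x_k = (u - u0) / fx  # normalized horizontal offset from the pixel center
    y_k = (v - v0) / fy  # normalized vertical offset from the pixel center
    return math.atan(math.hypot(x_k, y_k))  # theta = arctan(sqrt(x_k^2 + y_k^2))
```

A beacon imaged exactly at the pixel center gives θ = 0, i.e., the drone is directly above the camera.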
The vertical height of the unmanned aerial vehicle relative to the ground can be measured by radar ranging, ultrasonic ranging, laser ranging, satellite positioning, inertial measurement, and other means. Taking ultrasonic ranging as an example, the unmanned aerial vehicle is equipped with an ultrasonic distance-measuring device comprising an ultrasonic transmitter and an ultrasonic receiver. The transmitter emits ultrasonic waves toward the ground and starts a timer at the moment of emission; the waves propagate through the air and return immediately upon meeting an obstacle, and the receiver stops the timer the moment it receives the reflected wave. Given that the propagation speed of ultrasonic waves in air is about 340 meters per second, the height H of the unmanned aerial vehicle above the ground can be calculated from the recorded time T as

H = 340 · T / 2
Illustratively, from the included angle θ and the height H calculated above, the horizontal distance S of the drone relative to the optical center of the camera, that is, relative to the central position of the drone landing area, can be calculated with a trigonometric function. Substituting θ and H into

S = H × tan θ

yields the value of S.
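A minimal sketch of the two formulas above, assuming the echo time is measured in seconds and θ is in radians (names and sample values are illustrative):

```python
import math

SPEED_OF_SOUND = 340.0  # m/s, the propagation speed in air used above

def height_from_echo(t):
    # The pulse travels down and back, so halve the round-trip distance.
    return SPEED_OF_SOUND * t / 2.0

def horizontal_distance(theta, height):
    # S = H * tan(theta): offset of the drone from the camera's optical axis.
    return height * math.tan(theta)

H = height_from_echo(0.05)                    # 50 ms round-trip echo
S = horizontal_distance(math.radians(10), H)  # 10 degrees off-axis
```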
S1022: determining a location of the drone based on the altitude and the horizontal distance.
Take the vertical plane in which the drone lies and the plane containing the vertical optical axis of the camera to be the same plane, and take the optical center of the camera as the origin; the position of the drone can then be determined from this origin, the height and the horizontal distance. For example, the coordinates of the drone may be (S, 0, H). This is only an exemplary illustration and is not limiting.
S103: and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle.
The terminal may first control the drone to fly above the drone landing area, and then control it to descend from there into the landing area. Alternatively, the drone may be controlled to land vertically on the ground first and then be moved into the drone landing area.

Exemplarily, in order to improve the landing precision and make the drone land accurately in the drone landing area, the terminal may first control the drone to fly directly above the central position of the landing area, that is, vertically above the optical center of the camera, and then control the drone to descend vertically into the landing area.
Optionally, when two infrared beacons are installed at the bottom of the drone body, they may be mounted at the front and rear of the fuselage bottom, each emitting an infrared signal downward at a different frequency; the camera photographs both beacons to obtain the target image. Because the target image contains the positions corresponding to both infrared beacons, an average value can be calculated from the two positions, and the position of the drone computed from this average and the central position of the landing area is more accurate. The drone is therefore less prone to landing deviation, further improving the precision of autonomous landing.
Specifically, when there are two infrared beacons, the pixel coordinate of each beacon in the target image is determined by the method in S1021, and the two pixel coordinates are averaged to obtain the target position. The height and horizontal distance of the drone relative to the central position are determined based on the target position and the central position; the position of the drone is then determined based on the height and the horizontal distance. Alternatively, one of the two infrared beacons may be selected as the target and its pixel coordinate in the target image calculated, this coordinate representing the beacon's position in the target image; the subsequent calculation is the same as described in S1021 and S1022 and is not repeated here.
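The averaging step can be sketched as follows; the function name is illustrative, and the same code generalizes to more than two beacons:

```python
def target_position(beacon_pixels):
    # Average the pixel coordinates of all detected beacons; with two beacons
    # this midpoint is a steadier estimate of the airframe center.
    n = len(beacon_pixels)
    u_mean = sum(u for u, _ in beacon_pixels) / n
    v_mean = sum(v for _, v in beacon_pixels) / n
    return (u_mean, v_mean)

center = target_position([(700, 500), (720, 520)])  # -> (710.0, 510.0)
```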
It can be understood that when a plurality of infrared beacons are installed at the bottom of the drone body, the positions of the infrared beacons in the target image are likewise calculated by the method described above.
Referring to fig. 3, fig. 3 details the steps of S103 in the embodiment of the present application. Exemplarily, the above S103 may include S1031 to S1033, as follows:

S1031: detecting the heading of the unmanned aerial vehicle.
The terminal detects the heading of the drone. Specifically, the orientation of the camera is preset as yawcv; from yawcv and the calculated position of the drone, the drone's orientation yawuav in camera coordinates is determined; the heading of the drone is then calculated as yaw = yawcv + yawuav. The heading of the drone may also be detected by a heading gyroscope on the drone. The actual situation prevails, and this is not limited here.
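Assuming the yaw angles are expressed in radians, the heading composition can be sketched as below; the wrap into (−π, π] is an added convenience, not stated in the patent:

```python
import math

def drone_heading(yaw_cv, yaw_uav):
    # Heading = preset camera orientation plus the drone's bearing in the
    # camera frame, wrapped back into (-pi, pi].
    yaw = yaw_cv + yaw_uav
    return math.atan2(math.sin(yaw), math.cos(yaw))

heading = drone_heading(math.pi, math.pi / 2)  # wraps to -pi/2
```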
S1032: and controlling the unmanned aerial vehicle to fly above the unmanned aerial vehicle landing area according to the heading of the unmanned aerial vehicle and the horizontal distance.
The terminal detects whether the heading of the drone faces the central position of the drone landing area, that is, whether it faces the optical center of the camera. When the heading is detected to face the central position, the drone is controlled to fly toward a point above that position, continuously shortening the horizontal distance until the drone is above the center of the landing area. It can be understood that the drone may keep its current height and fly horizontally to above the central position; it may also move toward that point while descending, or while ascending, and is not limited to the above.
When it is detected that the heading of the drone does not face the central position of the drone landing area, the heading is first adjusted so that it does. The drone is then controlled to fly toward a point above the central position, continuously shortening the horizontal distance until it is above the center of the landing area.
Illustratively, when the terminal detects that the heading of the drone faces the central position of the drone landing area, the above S1032 may include S10321 to S10322, specifically as follows:

S10321: during the flight of the unmanned aerial vehicle, if it is detected that the horizontal distance is shortening, keeping the current flight state and controlling the unmanned aerial vehicle to fly above the unmanned aerial vehicle landing area.
While controlling the drone to fly above the landing area, the terminal detects in real time whether the horizontal distance is shortening. If it is, the flight is not deviating, and the drone is steadily approaching the point above the landing area. The terminal then keeps the drone in its current flight state until it reaches that point.

Exemplarily, in order to improve the landing precision, when the horizontal distance is detected to be shortening, the drone is kept in its current flight state and controlled to fly to directly above the landing area, that is, onto the vertical optical axis of the camera.
S10322: and if the fact that the horizontal distance is longer is detected, re-acquiring the target image of the unmanned aerial vehicle, determining a new horizontal distance based on the new target image, and controlling the unmanned aerial vehicle to fly above the unmanned aerial vehicle landing area according to the new horizontal distance.
If the horizontal distance is detected to be growing, the flight has deviated; the drone is moving away from the point above the landing area. The terminal then sends an image acquisition instruction to the camera, instructing it to shoot the target image currently corresponding to the drone and send it back. From the newly acquired target image, the terminal determines the position of the infrared beacon within it, and calculates a new horizontal distance, that is, the current horizontal distance of the drone relative to the central position, based on that position and the central position of the landing area. The calculation of the new horizontal distance is the same as described in S1021 above and is not repeated here. The terminal then controls the drone to fly above the landing area according to its current heading and the new horizontal distance.
It should be noted that the execution sequence of S10321 and S10322 is not limited, and the two steps can be adjusted and executed at any time according to different situations of the unmanned aerial vehicle during the flight. For example, in the whole flight process in which the drone flies above the drone landing area, if it is detected that the horizontal distance is being reduced, S10321 is executed. If it is detected that the horizontal distance is reduced first and then the horizontal distance is increased, S10321 is performed first, and then S10322 is performed. If it is detected that the horizontal distance is becoming large from the beginning, S10322 is performed. The actual flight condition is taken as the standard, and the method is not limited.
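A toy sketch of the S10321/S10322 monitoring loop, with the flight dynamics and image re-acquisition stubbed out as callbacks (all names and the tolerance are illustrative):

```python
def fly_to_center(measure, advance, reacquire, tolerance=0.05, max_steps=100):
    # Keep flying while the horizontal distance shrinks (S10321); if it
    # grows, re-acquire the target image and recompute it (S10322).
    s = measure()
    for _ in range(max_steps):
        if s <= tolerance:
            break
        advance()            # keep the current flight state
        s_new = measure()
        if s_new > s:        # deviation detected
            s_new = reacquire()
        s = s_new
    return s

# Toy plant: each control step halves the drone's horizontal offset.
state = {"s": 2.0}
final = fly_to_center(
    measure=lambda: state["s"],
    advance=lambda: state.update(s=state["s"] / 2),
    reacquire=lambda: state["s"],
)
```

In a real system `measure` and `reacquire` would come from the camera pipeline described above and `advance` from the flight controller.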
S1033: and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area according to the height.
Exemplarily, if the drone flew horizontally to the point above the landing area, its height is unchanged, and the terminal controls it to descend by that height so that it lands in the landing area. If the drone descended while moving to that point, its current height is smaller, and the terminal controls it to descend by that smaller height. If the drone ascended while moving to that point, its current height is larger, and the terminal controls it to descend by that larger height. In each case the drone lands in the drone landing area.
Illustratively, the above S1033 may include S10331 to S10332, specifically as follows:

S10331: during the landing of the unmanned aerial vehicle, if it is detected that the unmanned aerial vehicle exceeds the threshold range corresponding to the central position in the vertical direction, controlling the unmanned aerial vehicle to fly back within the threshold range, and then controlling it to land in the unmanned aerial vehicle landing area.
The threshold range corresponding to the central position in the vertical direction can be understood as a threshold range around the vertical optical axis of the camera. It may be preset, for example as a radius of 5 centimeters around that axis; the smaller the threshold range, the more accurately the drone lands, and it is not limited to the above.
While controlling the drone to descend into the landing area, the terminal detects in real time whether the drone remains within the threshold range. If the drone is detected to exceed it, the drone has drifted away from directly above the central position; its flight position is adjusted to bring it back within the threshold range, and it is then controlled to land in the landing area.
S10332: if the unmanned aerial vehicle is detected not to exceed the threshold range, the current landing state is kept, and the unmanned aerial vehicle is controlled to land to the unmanned aerial vehicle landing area.
If it does not exceed the threshold range to detect unmanned aerial vehicle, then explain that unmanned aerial vehicle still is in the central point directly over this moment and put, keeps the present state of descending of unmanned aerial vehicle, controls unmanned aerial vehicle and descends to unmanned aerial vehicle landing area.
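One control tick of S10331/S10332 can be sketched as follows; the threshold, descent rate and return values are illustrative choices, not from the patent:

```python
def descend_step(offset_from_axis, altitude, threshold=0.05, descent_rate=0.2):
    # If the drone has left the vertical corridor around the optical axis,
    # correct its position first (S10331); otherwise keep sinking (S10332).
    if abs(offset_from_axis) > threshold:
        return ("correct_position", altitude)
    return ("descend", max(0.0, altitude - descent_rate))

action, alt = descend_step(offset_from_axis=0.10, altitude=3.0)
```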
In the above embodiments, the camera arranged in the drone landing area photographs the infrared beacon on the drone to obtain the corresponding target image; because the camera is stationary and stable, the target image it captures is clear and accurate. With this arrangement of camera and beacon, there is no need to determine the center of the landing area by photographing a landing pattern or a ground-mounted infrared beacon from the air: the accurate central position of the landing area is obtained directly. The position of the drone, calculated from the beacon's position in the target image and the central position of the landing area, is therefore highly accurate, so the drone can land precisely in the landing area according to its position, landing deviation is unlikely, and the precision of autonomous landing is improved. While the drone flies toward the point above the landing area, the horizontal distance is monitored in real time and corrected promptly; while the drone descends into the landing area, whether it remains within the threshold range is monitored in real time and corrected promptly. The drone thus avoids flying unnecessary paths and lands accurately at the center of the landing area, achieving precise landing.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Referring to fig. 4, fig. 4 is a schematic view of a device for high-precision autonomous landing of an unmanned aerial vehicle according to an embodiment of the present application. The device comprises units for executing the steps in the embodiments corresponding to fig. 2 and fig. 3; please refer to the relevant descriptions of those embodiments. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 4, the device includes:
an obtaining unit 210, configured to obtain a target image corresponding to an unmanned aerial vehicle to be landed; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle;
a calculating unit 220, configured to calculate a position of the drone based on a corresponding position of the infrared beacon in the target image and a central position of the drone landing area;
a control unit 230 for controlling the drone to land from the drone's location to the drone landing area.
Optionally, the calculating unit 220 is specifically configured to:
according to the corresponding position of the infrared beacon in the target image and the central position, determining the height and the horizontal distance of the unmanned aerial vehicle relative to the central position;
determining a location of the drone based on the altitude and the horizontal distance.
Optionally, the target image includes two adjacent frames of images at adjacent shooting times; the method for determining the corresponding position of the infrared beacon in the target image comprises the following steps:
carrying out difference processing on the two frames of images to obtain a difference image;
and determining the corresponding position of the infrared beacon in the differential image according to the differential image.
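Assuming the beacon is modulated so that it changes between the two frames while the static background cancels in the subtraction, the difference-image step can be sketched on small grayscale frames (all names and the threshold are illustrative):

```python
def diff_image(frame_a, frame_b):
    # Absolute per-pixel difference of two frames shot at adjacent times;
    # a modulated infrared beacon survives while static background cancels.
    return [[abs(a - b) for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_a, frame_b)]

def beacon_pixel(diff, threshold=50):
    # Centroid of all strongly changed pixels, as a (u, v) pixel coordinate.
    pts = [(u, v) for v, row in enumerate(diff)
                  for u, p in enumerate(row) if p >= threshold]
    if not pts:
        return None
    return (sum(u for u, _ in pts) / len(pts),
            sum(v for _, v in pts) / len(pts))

# Beacon lit at (u=2, v=1) in the first frame, off in the second.
frame_a = [[0, 0, 0, 0], [0, 0, 255, 0], [0, 0, 0, 0]]
frame_b = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
pos = beacon_pixel(diff_image(frame_a, frame_b))  # -> (2.0, 1.0)
```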
Optionally, the control unit 230 includes:
the detection unit is used for detecting the course of the unmanned aerial vehicle;
the flight control unit is used for controlling the unmanned aerial vehicle to fly above the landing area of the unmanned aerial vehicle according to the heading and the horizontal distance of the unmanned aerial vehicle;
and the landing control unit is used for controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area according to the height.
Optionally, the flight control unit is specifically configured to:
in the flight process of the unmanned aerial vehicle, if the fact that the horizontal distance is shortened is detected, the current flight state is kept, and the unmanned aerial vehicle is controlled to fly above a landing area of the unmanned aerial vehicle;
and if the fact that the horizontal distance is longer is detected, re-acquiring the target image of the unmanned aerial vehicle, determining a new horizontal distance based on the new target image, and controlling the unmanned aerial vehicle to fly above the unmanned aerial vehicle landing area according to the new horizontal distance.
Optionally, the landing control unit is specifically configured to:
in the unmanned aerial vehicle landing process, if the unmanned aerial vehicle is detected to exceed a threshold range corresponding to the central position in the vertical direction, controlling the unmanned aerial vehicle to fly to the threshold range, and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area;
if the unmanned aerial vehicle is detected not to exceed the threshold range, the current landing state is kept, and the unmanned aerial vehicle is controlled to land to the unmanned aerial vehicle landing area.
Referring to fig. 5, fig. 5 is a schematic diagram of a terminal according to another embodiment of the present application. As shown in fig. 5, the terminal 3 of this embodiment includes: a processor 30, a memory 31, and computer readable instructions 32 stored in the memory 31 and executable on the processor 30. The processor 30, when executing the computer readable instructions 32, implements the steps in the various method embodiments described above, such as S101-S103 shown in fig. 2. Alternatively, the processor 30, when executing the computer readable instructions 32, implements the functions of the units in the above embodiments, such as the units 210 to 230 shown in fig. 4.
Illustratively, the computer readable instructions 32 may be divided into one or more units, which are stored in the memory 31 and executed by the processor 30 to accomplish the present application. The one or more units may be a series of computer readable instruction segments capable of performing specific functions, which are used to describe the execution of the computer readable instructions 32 in the terminal 3. For example, the computer readable instructions 32 may be divided into an acquisition unit, a calculation unit, and a control unit, each unit having the specific functions as described above.
The terminal may include, but is not limited to, a processor 30 and a memory 31. It will be appreciated by those skilled in the art that fig. 5 is only an example of the terminal 3 and does not constitute a limitation of the terminal 3, which may comprise more or fewer components than those shown, or combine certain components, or have different components; for example, the terminal may also include input and output devices, network access devices, buses, etc.
The Processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 31 may be an internal storage unit of the terminal 3, such as a hard disk or a memory of the terminal 3. The memory 31 may also be an external storage terminal of the terminal 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) and the like provided on the terminal 3. Further, the memory 31 may also include both an internal storage unit of the terminal 3 and an external storage terminal. The memory 31 is used for storing the computer readable instructions and other programs and data required by the terminal. The memory 31 may also be used to temporarily store data that has been output or is to be output.
The embodiment of the application also provides a system for the high-precision autonomous landing of the unmanned aerial vehicle, which comprises the device for the high-precision autonomous landing of the unmanned aerial vehicle provided by the embodiment and the terminal provided by the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting them; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included within the scope of the present application.

Claims (10)

1. A method for high-precision autonomous landing of an unmanned aerial vehicle, characterized by comprising:
acquiring a target image corresponding to the unmanned aerial vehicle to be landed; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle;
calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area;
and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle.
2. The method of claim 1, wherein calculating the position of the drone based on the corresponding position of the infrared beacon in the target image and the center position of the drone landing area comprises:
according to the corresponding position of the infrared beacon in the target image and the central position, determining the height and the horizontal distance of the unmanned aerial vehicle relative to the central position;
determining a location of the drone based on the altitude and the horizontal distance.
3. The method of claim 1, wherein the target image comprises two frames of images that are adjacent in capture time; the method for determining the corresponding position of the infrared beacon in the target image comprises the following steps:
carrying out difference processing on the two frames of images to obtain a difference image;
and determining the corresponding position of the infrared beacon in the differential image according to the differential image.
4. The method of claim 2, wherein said controlling said drone to land from a location of said drone to said drone landing area comprises:
detecting the course of the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to fly above the landing area of the unmanned aerial vehicle according to the heading and the horizontal distance of the unmanned aerial vehicle;
and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area according to the height.
5. The method of claim 4, wherein said controlling said drone to fly above said drone landing area based on said horizontal distance and said heading of said drone comprises:
in the flight process of the unmanned aerial vehicle, if the fact that the horizontal distance is shortened is detected, the current flight state is kept, and the unmanned aerial vehicle is controlled to fly above a landing area of the unmanned aerial vehicle;
and if the fact that the horizontal distance is longer is detected, re-acquiring the target image of the unmanned aerial vehicle, determining a new horizontal distance based on the new target image, and controlling the unmanned aerial vehicle to fly above the unmanned aerial vehicle landing area according to the new horizontal distance.
6. The method of claim 4, wherein said controlling said drone to land to said drone landing area according to said altitude comprises:
in the unmanned aerial vehicle landing process, if the unmanned aerial vehicle is detected to exceed a threshold range corresponding to the central position in the vertical direction, controlling the unmanned aerial vehicle to fly to the threshold range, and controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area;
if the unmanned aerial vehicle is detected not to exceed the threshold range, the current landing state is kept, and the unmanned aerial vehicle is controlled to land to the unmanned aerial vehicle landing area.
7. A device for high-precision autonomous landing of an unmanned aerial vehicle, characterized by comprising:
the acquisition unit is used for acquiring a target image corresponding to the unmanned aerial vehicle to be landed; the target image is obtained by shooting an infrared beacon on the unmanned aerial vehicle through a camera, and the camera is located in a landing area of the unmanned aerial vehicle;
the calculating unit is used for calculating the position of the unmanned aerial vehicle based on the corresponding position of the infrared beacon in the target image and the central position of the unmanned aerial vehicle landing area;
and the control unit is used for controlling the unmanned aerial vehicle to land to the unmanned aerial vehicle landing area from the position of the unmanned aerial vehicle.
8. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 6 when executing the computer program.
9. A high-precision autonomous landing system for unmanned aerial vehicles, comprising the high-precision autonomous landing apparatus for unmanned aerial vehicles according to claim 7, and the terminal according to claim 8.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 6.
CN202010611271.6A 2020-06-30 2020-06-30 System and method for high-precision autonomous landing of unmanned aerial vehicle and storage medium Pending CN111766900A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010611271.6A CN111766900A (en) 2020-06-30 2020-06-30 System and method for high-precision autonomous landing of unmanned aerial vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010611271.6A CN111766900A (en) 2020-06-30 2020-06-30 System and method for high-precision autonomous landing of unmanned aerial vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN111766900A true CN111766900A (en) 2020-10-13

Family

ID=72722951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010611271.6A Pending CN111766900A (en) 2020-06-30 2020-06-30 System and method for high-precision autonomous landing of unmanned aerial vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN111766900A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113109852A (en) * 2021-03-11 2021-07-13 国网江西省电力有限公司电力科学研究院 Path planning method and device for unmanned aerial vehicle to enter narrow space
CN113409485A (en) * 2021-08-03 2021-09-17 广东电网有限责任公司佛山供电局 Inspection data acquisition method and device, computer equipment and storage medium
CN113448351A (en) * 2021-08-30 2021-09-28 广州知行机器人科技有限公司 Method and device for guiding unmanned aerial vehicle to land accurately and unmanned aerial vehicle hangar
CN113899367A (en) * 2021-08-25 2022-01-07 广州优飞智能设备有限公司 Positioning method and device for unmanned aerial vehicle landing, computer equipment and storage medium
CN114261306A (en) * 2021-12-20 2022-04-01 深圳市歌尔泰克科技有限公司 Unmanned aerial vehicle cabin returning charging method, unmanned aerial vehicle, charging cabin and readable storage medium
US11733717B2 (en) 2020-07-10 2023-08-22 Zhuhai Ziyan Uav Co., Ltd. Unmanned aerial vehicle control method and system based on moving base

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105278541A (en) * 2015-09-02 2016-01-27 蔡兆旸 Aircraft auxiliary landing control method and system
CN105501457A (en) * 2015-12-16 2016-04-20 南京航空航天大学 Infrared vision based automatic landing guidance method and system applied to fixed-wing UAV (unmanned aerial vehicle)
CN106020239A (en) * 2016-08-02 2016-10-12 南京奇蛙智能科技有限公司 Precise landing control method for unmanned aerial vehicle
CN107357310A (en) * 2017-07-17 2017-11-17 北京京东尚科信息技术有限公司 Unmanned aerial vehicle flight control device, system and method, and unmanned aerial vehicle control method
CN108750129A (en) * 2018-04-20 2018-11-06 广州亿航智能技术有限公司 Positioning and landing method for a manned unmanned aerial vehicle, and manned unmanned aerial vehicle

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11733717B2 (en) 2020-07-10 2023-08-22 Zhuhai Ziyan Uav Co., Ltd. Unmanned aerial vehicle control method and system based on moving base
CN113109852A (en) * 2021-03-11 2021-07-13 国网江西省电力有限公司电力科学研究院 Path planning method and device for unmanned aerial vehicle to enter narrow space
CN113109852B (en) * 2021-03-11 2024-03-12 国网江西省电力有限公司电力科学研究院 Path planning method and device for entering narrow space of unmanned aerial vehicle
CN113409485A (en) * 2021-08-03 2021-09-17 广东电网有限责任公司佛山供电局 Inspection data acquisition method and device, computer equipment and storage medium
CN113409485B (en) * 2021-08-03 2023-12-12 广东电网有限责任公司佛山供电局 Inspection data acquisition method and device, computer equipment and storage medium
CN113899367A (en) * 2021-08-25 2022-01-07 广州优飞智能设备有限公司 Positioning method and device for unmanned aerial vehicle landing, computer equipment and storage medium
CN113448351A (en) * 2021-08-30 2021-09-28 广州知行机器人科技有限公司 Method and device for guiding unmanned aerial vehicle to land accurately and unmanned aerial vehicle hangar
CN114261306A (en) * 2021-12-20 2022-04-01 深圳市歌尔泰克科技有限公司 Unmanned aerial vehicle return-to-bay charging method, unmanned aerial vehicle, charging bay and readable storage medium

Similar Documents

Publication Publication Date Title
CN111766900A (en) System and method for high-precision autonomous landing of unmanned aerial vehicle and storage medium
US11797009B2 (en) Unmanned aerial image capture platform
US11073389B2 (en) Hover control
KR100842104B1 (en) Guide and control method for automatic landing of UAVs using ADS-B and vision-based information
CN110001980B (en) Aircraft landing method and device
WO2018227350A1 (en) Return-to-home control method for unmanned aerial vehicle, unmanned aerial vehicle and machine-readable storage medium
US20200105150A1 (en) Unmanned aircraft return flight control method, device, and unmanned aerial vehicle
CN105867397B (en) Precise position landing method for unmanned aerial vehicle based on image processing and fuzzy control
US10133929B2 (en) Positioning method and positioning device for unmanned aerial vehicle
AU2019321145A1 (en) Method, device, and equipment for obstacle or ground recognition and flight control, and storage medium
CN106197422A (en) Unmanned aerial vehicle positioning and target tracking method based on two-dimensional tags
WO2020143576A1 (en) Method and apparatus for adjusting main detection direction of airborne radar, and unmanned aerial vehicle
WO2018210078A1 (en) Distance measurement method for unmanned aerial vehicle, and unmanned aerial vehicle
EP3531224A1 (en) Environment-adaptive sense and avoid system for unmanned vehicles
CN110989682B (en) Unmanned aerial vehicle accurate landing method based on single base station
CN110879617A (en) Infrared-guided unmanned aerial vehicle landing method and device
CN111413708A (en) Unmanned aerial vehicle autonomous landing site selection method based on laser radar
WO2018059398A1 (en) Method, apparatus, and system for controlling multi-rotor aircraft
WO2020233682A1 (en) Autonomous circling photographing method and apparatus and unmanned aerial vehicle
CN112947526B (en) Unmanned aerial vehicle autonomous landing method and system
KR20200082234A (en) Indoor Flight System for Unmanned Aerial Vehicle and Method Thereof
WO2021056139A1 (en) Method and device for acquiring landing position, unmanned aerial vehicle, system, and storage medium
WO2020237478A1 (en) Flight planning method and related device
KR101876829B1 (en) Induction control system for indoor flight control of small drones
CN113759940A (en) Unmanned aerial vehicle landing method and device, unmanned aerial vehicle system, airport, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination