CN109460047B - Unmanned aerial vehicle autonomous graded landing method and system based on visual navigation - Google Patents
- Publication number
- CN109460047B CN109460047B CN201811236405.XA CN201811236405A CN109460047B CN 109460047 B CN109460047 B CN 109460047B CN 201811236405 A CN201811236405 A CN 201811236405A CN 109460047 B CN109460047 B CN 109460047B
- Authority
- CN
- China
- Prior art keywords
- aerial vehicle
- unmanned aerial
- image
- landing area
- center
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The invention provides a visual-navigation-based autonomous graded landing method and system for an unmanned aerial vehicle. A user may designate any landing area within the imagery captured by the unmanned aerial vehicle; visual tracking serves as feedback on the flight position of the unmanned aerial vehicle to navigate it; and the landing area is tracked in a graded manner, so that the unmanned aerial vehicle can land accurately.
Description
Technical Field
The invention relates to an unmanned aerial vehicle autonomous graded landing method and system based on visual navigation.
Background
At present, the landing of unmanned aerial vehicles mainly relies on GPS positioning. However, GPS accuracy is affected by signal strength, and ordinary GPS has meter-level error; therefore, precise landing cannot be achieved by relying on GPS signals alone.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle autonomous graded landing method and system based on visual navigation.
To solve the above problem, the invention provides a visual-navigation-based autonomous graded landing method for an unmanned aerial vehicle, comprising the following steps:
step S1, selecting an image of an expected landing area from the images shot by the unmanned aerial vehicle;
step S2, tracking the image of the expected landing area by using an image tracking algorithm, and continuously acquiring the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle;
and step S3, controlling the unmanned aerial vehicle to fly to the expected landing area according to the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle until the unmanned aerial vehicle lands on the ground.
Further, in the above method, step S3 (controlling the unmanned aerial vehicle to fly to the desired landing area according to the position and size of the image of the desired landing area in the image currently captured by the unmanned aerial vehicle until the unmanned aerial vehicle lands on the ground) includes:
step S31, obtaining the deviation between the center of the image of the expected landing area and the center of the image currently captured by the unmanned aerial vehicle from the position and size of the image of the expected landing area in the currently captured image, and, based on this deviation, using a PID algorithm to control the unmanned aerial vehicle to fly toward the expected landing area while keeping the center of the image of the expected landing area at the center of the currently captured image;
step S32, when the unmanned aerial vehicle flies into the sky near the expected landing area, adjusting the camera angle of the unmanned aerial vehicle until the camera points completely vertically downward, wherein during this process the image of the expected landing area is continuously tracked and, based on the deviation between its center and the center of the currently captured image, the PID algorithm keeps the center of the image of the expected landing area at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the expected landing area;
step S33, keeping the camera angle of the unmanned aerial vehicle vertically downward and reducing the flying height of the unmanned aerial vehicle, wherein during this process the image of the expected landing area is continuously tracked and, based on the deviation between its center and the center of the currently captured image, the PID algorithm keeps the center of the image of the expected landing area at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the expected landing area;
step S34, when, as the flying height decreases, the image of the expected landing area comes to occupy 2/3 of the image currently captured by the unmanned aerial vehicle, taking the region centered on the center of the image of the expected landing area, with half the length and half the width of that image, as the image of a new precise landing area;
step S35, keeping the camera angle of the unmanned aerial vehicle vertically downward, continuing to track the image of the new precise landing area while reducing the flying height of the unmanned aerial vehicle, wherein during this process, based on the deviation between the center of the image of the new precise landing area and the center of the currently captured image, the PID algorithm keeps the center of the image of the new precise landing area at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the new precise landing area;
repeating the steps S34 and S35 until the drone lands on the ground.
Further, in the above method, the desired landing area comprises an area with distinct features.
Further, in the above method, the step S2 includes:
and tracking the image of the expected landing area by using a tracking algorithm combining correlation filtering and a target color histogram, and continuously acquiring the position and size of the image of the expected landing area in the image currently captured by the unmanned aerial vehicle.
Further, in the above method, the step S32 includes:
continuously tracking and acquiring the position and size of the image of the expected landing area in the image captured by the unmanned aerial vehicle, and continuously adjusting the attitude and position of the unmanned aerial vehicle using PID control, according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the captured image, so that the center of the expected landing area lies at the center of the captured image;
then gradually adjusting the camera angle of the unmanned aerial vehicle, and, while the camera angle is being adjusted, continuously controlling the attitude and position of the unmanned aerial vehicle so that the center of the image of the expected landing area remains at the center of the currently captured image, until the camera angle points vertically downward;
finally, aiming the camera of the unmanned aerial vehicle vertically downward at the expected landing area.
Further, in the above method, the step S33 includes: keeping the camera angle of the unmanned aerial vehicle vertically downward, continuing to track the image of the expected landing area, acquiring the position and size of the expected landing area, controlling the unmanned aerial vehicle to begin descending, and continuously adjusting the attitude and position of the unmanned aerial vehicle using PID control, according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the image currently captured by the unmanned aerial vehicle, so that the center of the image of the expected landing area is at the center of the currently captured image;
the step S35 includes: keep unmanned aerial vehicle's camera angle is perpendicular downwards, continues to track new accurate image of descending the region, acquires new accurate image of descending the region is in position and size in the image that unmanned aerial vehicle shot at present reduce simultaneously unmanned aerial vehicle's height, according to the deviation of the center of the image of new accurate image of descending the region and the current image of shooting of unmanned aerial vehicle uses PID control, constantly adjusts unmanned aerial vehicle's gesture and position make the center of new accurate image of descending the region is in the center of the current image of shooting of unmanned aerial vehicle.
According to another aspect of the present invention, there is also provided a visual-navigation-based autonomous graded landing system for an unmanned aerial vehicle, including:
the first module is used for selecting an image of an expected landing area from images shot by the unmanned aerial vehicle at present;
the second module is used for tracking the image of the expected landing area by using an image tracking algorithm and continuously acquiring the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle;
and the third module is used for controlling the unmanned aerial vehicle to fly to the expected landing area until the unmanned aerial vehicle lands on the ground according to the position and the size of the image of the expected landing area in the image shot by the unmanned aerial vehicle currently.
Further, in the above system, the third module includes:
a first submodule, configured to obtain the deviation between the center of the image of the expected landing area and the center of the image currently captured by the unmanned aerial vehicle from the position and size of the image of the expected landing area in the currently captured image, and, based on this deviation, to use a PID algorithm to control the unmanned aerial vehicle to fly toward the expected landing area while keeping the center of the image of the expected landing area at the center of the currently captured image;
a second submodule, configured to adjust the camera angle of the unmanned aerial vehicle, when the unmanned aerial vehicle flies into the sky near the expected landing area, until the camera points completely vertically downward, wherein during this process the image of the expected landing area is continuously tracked and the PID algorithm keeps its center at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the expected landing area;
a third submodule, configured to keep the camera angle of the unmanned aerial vehicle vertically downward and reduce the flying height of the unmanned aerial vehicle, wherein during this process the image of the expected landing area is continuously tracked and the PID algorithm keeps its center at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the expected landing area;
a fourth submodule, configured to, when, as the flying height decreases, the image of the expected landing area comes to occupy 2/3 of the image currently captured by the unmanned aerial vehicle, take the region centered on the center of the image of the expected landing area, with half the length and half the width of that image, as the image of a new precise landing area;
a fifth submodule, configured to keep the camera angle of the unmanned aerial vehicle vertically downward and continue to track the image of the new precise landing area while reducing the flying height of the unmanned aerial vehicle, wherein during this process the PID algorithm keeps the center of the image of the new precise landing area at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the new precise landing area;
and a sixth submodule, configured to repeatedly invoke the fourth submodule and the fifth submodule until the unmanned aerial vehicle lands on the ground.
Further, in the above system, the desired landing area comprises a region having a distinct characteristic.
Further, in the above system, the second module is configured to track the image of the expected landing area by using a tracking algorithm of filtering and target color histogram, and continuously obtain a position and a size of the image of the expected landing area in the image currently captured by the unmanned aerial vehicle.
Further, in the above system, the second submodule is configured to continuously track and acquire the position and size of the image of the expected landing area in the image captured by the unmanned aerial vehicle, and to continuously adjust the attitude and position of the unmanned aerial vehicle using PID control, according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the currently captured image, so that the center of the expected landing area lies at the center of the currently captured image; then to gradually adjust the camera angle of the unmanned aerial vehicle, continuously controlling the attitude and position of the unmanned aerial vehicle during the adjustment so that the center of the image of the expected landing area remains at the center of the currently captured image, until the camera angle points vertically downward; and finally to aim the camera of the unmanned aerial vehicle vertically downward at the expected landing area.
Further, in the above system, the third submodule is configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the expected landing area, acquire the position and size of the expected landing area, control the unmanned aerial vehicle to begin descending, and continuously adjust the attitude and position of the unmanned aerial vehicle using PID control, according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the currently captured image, so that the center of the image of the expected landing area is at the center of the currently captured image;
the fifth submodule is configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the new precise landing area, acquire the position and size of that image in the image currently captured by the unmanned aerial vehicle while reducing the height of the unmanned aerial vehicle, and, according to the deviation between the center of the image of the new precise landing area and the center of the currently captured image, use PID control to continuously adjust the attitude and position of the unmanned aerial vehicle so that the center of the image of the new precise landing area is at the center of the currently captured image.
Compared with the prior art, the invention can designate any landing area captured by the unmanned aerial vehicle according to the needs of the user; it uses visual tracking as feedback on the flight position of the unmanned aerial vehicle to navigate it; and it tracks the landing area in a graded manner, so that the unmanned aerial vehicle can land accurately.
Drawings
Fig. 1 is a flowchart of a visual-navigation-based autonomous graded landing method for an unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
As shown in fig. 1, the present invention provides a visual-navigation-based autonomous graded landing method for an unmanned aerial vehicle, including:
step S1, selecting an image of an expected landing area from the image captured by the unmanned aerial vehicle, wherein the unselected remainder of the captured image constitutes the other areas;
here, the expected landing area is selected from the current picture taken by the unmanned aerial vehicle, and an area with obvious features is chosen as the landing area, for example a rooftop or a patch of flat ground;
after the unmanned aerial vehicle captures the real-time current picture with its camera, the picture is sent to a terminal for display through an image transmission system; the image (x, y, w, h) of the desired landing area can then be selected from the picture displayed on the terminal.
Step S2, tracking the image of the expected landing area by using an image tracking algorithm, and continuously acquiring the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle;
here, in order to obtain the position and size of the image (x, y, w, h) of the expected landing area in subsequent video frames, the image of the expected landing area needs to be tracked as a target; in an embodiment, the image of the expected landing area is tracked with a tracking algorithm combining correlation filtering and a target color histogram, and its position and size in the image currently captured by the unmanned aerial vehicle are continuously acquired;
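The patent names a correlation-filtering-plus-color-histogram tracker (in the spirit of trackers such as Staple) but does not spell one out. As a rough illustration of the color-histogram half alone, the toy pure-Python sketch below locates the window whose gray-level histogram best matches the target's; all function names, the 8-bin histogram, and the exhaustive search are hypothetical simplifications, not the patented tracker.

```python
def histogram(patch, bins=8, max_val=256):
    """Normalized gray-level histogram of a 2-D list of pixel values."""
    counts = [0] * bins
    n = 0
    for row in patch:
        for v in row:
            counts[v * bins // max_val] += 1
            n += 1
    return [c / n for c in counts]

def crop(image, x, y, w, h):
    return [row[x:x + w] for row in image[y:y + h]]

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1.0 = identical)."""
    return sum((a * b) ** 0.5 for a, b in zip(p, q))

def track(image, target_hist, w, h, step=1):
    """Exhaustively find the (x, y, w, h) window best matching target_hist."""
    best, best_xy = -1.0, (0, 0)
    height, width = len(image), len(image[0])
    for y in range(0, height - h + 1, step):
        for x in range(0, width - w + 1, step):
            s = bhattacharyya(histogram(crop(image, x, y, w, h)), target_hist)
            if s > best:
                best, best_xy = s, (x, y)
    return (*best_xy, w, h)

# Usage: a bright 2x2 "landing pad" on a dark 6x6 image is located at (3, 2).
img = [[10] * 6 for _ in range(6)]
for yy in (2, 3):
    for xx in (3, 4):
        img[yy][xx] = 200
pad_hist = histogram([[200, 200], [200, 200]])
print(track(img, pad_hist, 2, 2))  # -> (3, 2, 2, 2)
```

A production system would instead update the target model frame to frame and search only near the previous position, which is what makes correlation-filter trackers fast.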
and step S3, controlling the unmanned aerial vehicle to fly to the expected landing area according to the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle until the unmanned aerial vehicle lands on the ground.
In an embodiment of the visual-navigation-based autonomous graded landing method for an unmanned aerial vehicle, step S3, controlling the unmanned aerial vehicle to fly to the desired landing area according to the position and size of the image of the desired landing area in the image currently captured by the unmanned aerial vehicle until the unmanned aerial vehicle lands on the ground, includes:
step S31, obtaining the deviation between the center of the image of the expected landing area and the center of the image currently captured by the unmanned aerial vehicle from the position and size of the image of the expected landing area in the currently captured image, and, based on this deviation, using a PID algorithm to control the unmanned aerial vehicle to fly toward the expected landing area while keeping the center of the image of the expected landing area at the center of the currently captured image;
step S32, when the unmanned aerial vehicle flies into the sky near the expected landing area, adjusting the camera angle of the unmanned aerial vehicle until the camera points completely vertically downward, wherein during this process the image of the expected landing area is continuously tracked and, based on the deviation between its center and the center of the currently captured image, the PID algorithm keeps the center of the image of the expected landing area at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the expected landing area;
after the position and size of the image of the expected landing area in the image captured by the unmanned aerial vehicle are continuously tracked and obtained, the attitude and position of the unmanned aerial vehicle are continuously adjusted using PID control, according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the captured image, so that the center of the expected landing area lies at the center of the captured image; the camera angle of the unmanned aerial vehicle is then adjusted gradually, with the attitude and position of the unmanned aerial vehicle continuously controlled during the adjustment so that the center of the image of the expected landing area remains at the center of the currently captured image, until the camera points vertically downward; finally, the camera of the unmanned aerial vehicle is aimed vertically downward at the expected landing area;
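The PID centering described above can be sketched as follows: the pixel deviation of the landing-area center from the frame center feeds two PID controllers that produce lateral velocity commands. The gains, the 640x480 frame, the sign convention, and the `centering_command` helper are all hypothetical; real values depend on the airframe and camera.

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, err, dt):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def centering_command(box, frame_w, frame_h, pid_x, pid_y, dt=0.05):
    """Map the box-center deviation (pixels) to two velocity commands."""
    x, y, w, h = box
    err_x = (x + w / 2) - frame_w / 2   # > 0: target right of frame center
    err_y = (y + h / 2) - frame_h / 2   # > 0: target below frame center
    return pid_x.update(err_x, dt), pid_y.update(err_y, dt)

# Usage: a 40x40 box at (300, 180) in a 640x480 frame is centered in x
# (center 320) but 40 px above the frame center in y (center 200 vs 240).
pid_x, pid_y = PID(0.004, 0.0, 0.001), PID(0.004, 0.0, 0.001)
vx, vy = centering_command((300, 180, 40, 40), 640, 480, pid_x, pid_y)
print(round(vx, 3), round(vy, 3))  # -> 0.0 -0.16
```

Running the loop each frame drives both deviations toward zero, which is exactly the condition "the center of the image of the expected landing area is at the center of the currently captured image".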
step S33, keeping the camera angle of the unmanned aerial vehicle vertically downward and reducing the flying height of the unmanned aerial vehicle, wherein during this process the image of the expected landing area is continuously tracked and, based on the deviation between its center and the center of the currently captured image, the PID algorithm keeps the center of the image of the expected landing area at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the expected landing area;
step S34, when, as the flying height decreases, the image of the expected landing area comes to occupy 2/3 of the image currently captured by the unmanned aerial vehicle, taking the region centered on the center of the image of the expected landing area, with half the length and half the width of that image, as the image of a new precise landing area;
step S35, keeping the camera angle of the unmanned aerial vehicle vertically downward, continuing to track the image of the new precise landing area while reducing the flying height of the unmanned aerial vehicle, wherein during this process, based on the deviation between the center of the image of the new precise landing area and the center of the currently captured image, the PID algorithm keeps the center of the image of the new precise landing area at the center of the currently captured image while the unmanned aerial vehicle is controlled to fly toward the new precise landing area;
repeating the steps S34 and S35 until the drone lands on the ground.
The invention can designate any landing area captured by the unmanned aerial vehicle according to the needs of the user; it uses visual tracking as feedback on the flight position of the unmanned aerial vehicle to navigate it; and it tracks the landing area in a graded manner, so that the unmanned aerial vehicle can land accurately.
Specifically, in steps S33 to S35, the image of the expected landing area grows larger and larger during the landing process and may even exceed the camera's field of view. For accurate landing, the landing area is therefore tracked in a graded manner, with its center position tracked throughout. The specific steps of graded tracking for precise landing are as follows:
step S33, including: keeping the camera angle of the unmanned aerial vehicle vertically downward, continuing to track the image of the expected landing area, acquiring the position and size of the expected landing area, controlling the unmanned aerial vehicle to begin descending, and continuously adjusting the attitude and position of the unmanned aerial vehicle using PID control, according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the image currently captured by the unmanned aerial vehicle, so that the center of the image of the expected landing area is at the center of the currently captured image;
step S34, including: the image of the expected landing area grows larger as the unmanned aerial vehicle descends; when the image of the expected landing area comes to occupy 2/3 of the image currently captured by the unmanned aerial vehicle, taking the region centered on the center of the image of the expected landing area, with half the length and half the width of that image, as the image of the new precise landing area;
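The refinement rule of step S34 is simple enough to state exactly: once the tracked box covers 2/3 of the frame, replace it with a concentric box of half the length and half the width. The sketch below assumes the (x, y, w, h) box convention used in the description; the function name is hypothetical.

```python
def refine_landing_area(box, frame_w, frame_h, occupancy_threshold=2 / 3):
    """Halve the tracked landing box once it covers 2/3 of the frame."""
    x, y, w, h = box
    if w * h < occupancy_threshold * frame_w * frame_h:
        return box  # not close enough yet; keep tracking the current area
    cx, cy = x + w / 2, y + h / 2          # keep the same center point
    nw, nh = w / 2, h / 2                  # half the length and width
    return (cx - nw / 2, cy - nh / 2, nw, nh)

# Usage in a 640x480 frame: a 480x360 box covers only 56%, so it is kept;
# a 560x412 box covers about 75%, so it is replaced by its half-size box.
print(refine_landing_area((80, 60, 480, 360), 640, 480))
print(refine_landing_area((40, 34, 560, 412), 640, 480))
```

Because the new box shares the old box's center, the PID centering loop keeps working unchanged across each refinement.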
step S35, including: keeping the camera angle of the unmanned aerial vehicle vertically downward, continuing to track the image of the new precise landing area, acquiring its position and size in the image currently captured by the unmanned aerial vehicle while reducing the height of the unmanned aerial vehicle, and continuously adjusting the attitude and position of the unmanned aerial vehicle using PID control, according to the deviation between the center of the image of the new precise landing area and the center of the currently captured image, so that the center of the image of the new precise landing area is at the center of the currently captured image;
repeating the steps S34 and S35 until the drone lands on the ground.
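The repetition of steps S34 and S35 can be sketched as a descent loop: lower the altitude, and whenever the apparent size of the tracked ground region covers 2/3 of the frame, halve the ground region and continue. Everything here is a hypothetical simulation, not the patented controller: the pinhole-style growth model (apparent size inversely proportional to altitude), the 640x480 frame, the multiplicative descent, and all constants are invented for illustration.

```python
FRAME_W, FRAME_H = 640, 480

def box_size_at(ground_w, ground_h, altitude, scale=100.0):
    """Apparent pixel size of a ground_w x ground_h meter region, clipped to the frame."""
    w = min(FRAME_W, ground_w * scale / altitude)
    h = min(FRAME_H, ground_h * scale / altitude)
    return w, h

def staged_descent(ground_w, ground_h, start_alt, factor=0.7, land_alt=0.2):
    """Count how many times the landing area is halved during descent."""
    refinements = 0
    altitude = start_alt
    while altitude > land_alt:
        altitude *= factor                          # step S33/S35: descend
        w, h = box_size_at(ground_w, ground_h, max(altitude, land_alt))
        if w * h >= (2 / 3) * FRAME_W * FRAME_H:    # step S34: 2/3 occupancy
            ground_w, ground_h = ground_w / 2, ground_h / 2
            refinements += 1
    return refinements

# Usage: descending from 20 m onto a 4x3 m pad triggers two refinements
# before the simulated touch-down altitude is reached.
print(staged_descent(ground_w=4.0, ground_h=3.0, start_alt=20.0))  # -> 2
```

The loop shows why the graded scheme works: each halving keeps the tracked target inside the field of view even as the camera gets arbitrarily close to the ground.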
According to another aspect of the present invention, there is also provided a visual-navigation-based autonomous graded landing system for an unmanned aerial vehicle, including:
the first module is used for selecting an image of an expected landing area from images shot by the unmanned aerial vehicle at present;
the second module is used for tracking the image of the expected landing area by using an image tracking algorithm and continuously acquiring the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle;
and the third module is used for controlling the unmanned aerial vehicle to fly to the expected landing area until the unmanned aerial vehicle lands on the ground according to the position and the size of the image of the expected landing area in the image shot by the unmanned aerial vehicle currently.
Further, in the above system, the third module includes:
a third module, configured to obtain a deviation between a center of the image of the expected landing area and a center of the image currently captured by the unmanned aerial vehicle according to the position and the size of the image of the expected landing area in the image currently captured by the unmanned aerial vehicle, and control the center of the image of the expected landing area to be at the center of the image currently captured by the unmanned aerial vehicle when the unmanned aerial vehicle flies to the expected landing area through a PID algorithm according to the deviation between the center of the image of the expected landing area and the center of the image currently captured by the unmanned aerial vehicle;
a third module, configured to adjust a camera angle of the unmanned aerial vehicle until the camera angle of the unmanned aerial vehicle is completely vertical and downward when the unmanned aerial vehicle flies to the sky near the expected landing area, wherein in this process, images of the expected landing area are continuously tracked, and according to a deviation between a center of the image of the expected landing area and a center of an image currently captured by the unmanned aerial vehicle, when the unmanned aerial vehicle is controlled to fly to the expected landing area through a PID algorithm, the center of the image of the expected landing area is controlled to be at the center of the image currently captured by the unmanned aerial vehicle;
a third module, configured to keep the camera angle of the unmanned aerial vehicle vertically downward, and reduce the flying height of the unmanned aerial vehicle, where in this process, the image of the expected landing area is continuously tracked, and according to a deviation between the center of the image of the expected landing area and the center of the image currently taken by the unmanned aerial vehicle, when the unmanned aerial vehicle is controlled to fly to the expected landing area through a PID algorithm, the center of the image of the expected landing area is controlled to be at the center of the image currently taken by the unmanned aerial vehicle;
a third fourth module, configured to, when the unmanned aerial vehicle flies to the expected landing area while the flying height is reduced, take one half of the image of the expected landing area as a new image of the accurate landing area, with a center of the image of the expected landing area as a center and a length and a width of the image of the expected landing area corresponding to each other when the image of the expected landing area occupies 2/3 of the image currently captured by the unmanned aerial vehicle;
a third-fifth module, configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the new accurate landing area, and reduce the flying height of the unmanned aerial vehicle, wherein in this process the image of the new accurate landing area is continuously tracked, and, according to the deviation between the center of the image of the new accurate landing area and the center of the image currently captured by the unmanned aerial vehicle, the center of the image of the new accurate landing area is controlled through a PID algorithm to remain at the center of the image currently captured by the unmanned aerial vehicle while the unmanned aerial vehicle is controlled to fly to the new accurate landing area;
and a third-sixth module, configured to repeatedly execute the third-fourth module and the third-fifth module until the unmanned aerial vehicle lands on the ground.
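The graded refinement described above (keep descending and, whenever the tracked region fills 2/3 of the frame, shrink the tracked region to its central half) can be sketched as a short simulation. The 1/height apparent-size model, the parameter values, and all names below are illustrative assumptions, not values from the patent:

```python
# Toy simulation of the graded refinement: as the drone descends, the
# apparent size of the tracked region grows roughly as 1/height (a
# pinhole-style assumption, not from the patent); whenever it reaches
# 2/3 of the frame, the tracked region is replaced by its central half.

def graded_landing(start_height, region_size, frame_size=1.0,
                   landed_height=0.5, descent_rate=0.9):
    """Return how many times the tracked region was halved before the
    drone reached landed_height. region_size and frame_size are in the
    same (arbitrary) units of apparent width."""
    height, halvings = start_height, 0
    while height > landed_height:
        height *= descent_rate                    # reduce flying height
        apparent = region_size / height           # apparent size in the frame
        if apparent >= (2.0 / 3.0) * frame_size:  # region fills 2/3 of frame
            region_size /= 2.0                    # halve length and width
            halvings += 1
    return halvings
```

Halving keeps the tracked template well inside the field of view all the way down, which is what lets the tracker keep a lock during the final, low-altitude phase.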
Further, in the above system, the expected landing area comprises a region having distinctive features.
Further, in the above system, the second module is configured to track the image of the expected landing area by using a tracking algorithm combining filtering with a target color histogram, and to continuously obtain the position and size of the image of the expected landing area in the image currently captured by the unmanned aerial vehicle.
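The patent names "filtering plus target color histogram" but does not spell the tracker out. As an illustrative sketch of the color-histogram half (function names, bin counts, and the toy frame representation are my own assumptions), candidate windows can be scored by histogram intersection against the selected landing-area template:

```python
# Illustrative sketch only: scores sliding windows against the color
# histogram of the user-selected landing area. A frame is modeled as a
# 2-D grid of (r, g, b) tuples.

def color_histogram(pixels, bins=4):
    """Quantize (r, g, b) pixels into a normalized color histogram."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1.0 means identical color distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def track(template_hist, frame, win):
    """Slide a win-by-win window over the frame and return the top-left
    corner (x, y) whose color histogram best matches the template --
    the estimated position of the landing-area image in this frame."""
    best_score, best_pos = -1.0, (0, 0)
    for y in range(len(frame) - win + 1):
        for x in range(len(frame[0]) - win + 1):
            window = [frame[y + dy][x + dx]
                      for dy in range(win) for dx in range(win)]
            score = histogram_intersection(template_hist,
                                           color_histogram(window))
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos
```

Between frames, the raw position estimate would be smoothed by the "filtering" half of the tracker (for example a correlation or Kalman filter), and the window size rescaled as the apparent size of the landing area changes.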
Further, in the above system, the third-second module is configured to continuously track and acquire the position and size of the image of the expected landing area in the image captured by the unmanned aerial vehicle, and to continuously adjust the attitude and position of the unmanned aerial vehicle by PID control according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the image currently captured by the unmanned aerial vehicle, so that the center of the expected landing area is located at the center of the image currently captured by the unmanned aerial vehicle; then to gradually adjust the camera angle of the unmanned aerial vehicle, continuously controlling the attitude and position of the unmanned aerial vehicle during the adjustment so that the center of the image of the expected landing area remains at the center of the image currently captured by the unmanned aerial vehicle, until the camera angle is vertically downward; finally, the camera of the unmanned aerial vehicle is aimed vertically downward at the expected landing area.
Further, in the above system, the third-third module is configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the expected landing area, obtain the position and size of the expected landing area, control the unmanned aerial vehicle to start descending, and continuously adjust the attitude and position of the unmanned aerial vehicle by PID control according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the image currently captured by the unmanned aerial vehicle, so that the center of the image of the expected landing area remains at the center of the image currently captured by the unmanned aerial vehicle;
the third-fifth module is configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the new accurate landing area, acquire the position and size of the image of the new accurate landing area in the image currently captured by the unmanned aerial vehicle, and reduce the flying height of the unmanned aerial vehicle at the same time; according to the deviation between the center of the image of the new accurate landing area and the center of the image currently captured by the unmanned aerial vehicle, PID control is used to continuously adjust the attitude and position of the unmanned aerial vehicle so that the center of the image of the new accurate landing area remains at the center of the image currently captured by the unmanned aerial vehicle.
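Throughout these modules, the centering behavior is a PID loop driving the pixel deviation between the two image centers toward zero. A minimal sketch (the gains, time step, one-dimensional plant model, and all names are illustrative assumptions; the patent specifies only "PID control"):

```python
# Minimal discrete PID loop on the pixel deviation between the
# landing-area image center and the camera image center.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        """One controller step: returns a velocity-style command."""
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def center_on_target(target_px, drone_px, steps=2000):
    """Toy 1-D plant: each command nudges the projected drone position
    (in pixels) toward the target pixel coordinate, so the tracked
    center converges to the image center."""
    pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=0.1)
    for _ in range(steps):
        error = target_px - drone_px          # deviation of the two centers
        drone_px += pid.update(error) * pid.dt
    return drone_px
```

In the real system two such loops (horizontal image axes) would command attitude or velocity setpoints, and a third would manage the descent rate; here a single axis suffices to show the structure.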
For details of each system embodiment of the present invention, reference may be made to corresponding parts of each method embodiment, and details are not described herein again.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
Claims (8)
1. An unmanned aerial vehicle autonomous graded landing method based on visual navigation is characterized by comprising the following steps:
step S1, selecting an image of an expected landing area from the images shot by the unmanned aerial vehicle;
step S2, tracking the image of the expected landing area by using an image tracking algorithm, and continuously acquiring the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle;
step S3, controlling the unmanned aerial vehicle to fly to the expected landing area according to the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle until the unmanned aerial vehicle lands on the ground;
step S3, controlling the unmanned aerial vehicle to fly to the expected landing area according to the position and size of the image of the expected landing area in the image currently captured by the unmanned aerial vehicle until the unmanned aerial vehicle lands on the ground, including:
step S31, obtaining the deviation between the center of the image of the expected landing area and the center of the image currently shot by the unmanned aerial vehicle according to the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle, and controlling the center of the image of the expected landing area to be at the center of the image currently shot by the unmanned aerial vehicle when the unmanned aerial vehicle flies to the expected landing area through a PID algorithm according to the deviation between the center of the image of the expected landing area and the center of the image currently shot by the unmanned aerial vehicle;
step S32, when the unmanned aerial vehicle flies into the airspace near the expected landing area, adjusting the camera angle of the unmanned aerial vehicle until the camera angle of the unmanned aerial vehicle is completely vertical downwards, wherein in the process, the image of the expected landing area is continuously tracked, and according to the deviation between the center of the image of the expected landing area and the center of the image currently shot by the unmanned aerial vehicle, when the unmanned aerial vehicle is controlled to fly to the expected landing area through a PID algorithm, the center of the image of the expected landing area is controlled to be at the center of the image currently shot by the unmanned aerial vehicle;
step S33, keeping the angle of the camera of the unmanned aerial vehicle vertically downward, and reducing the flying height of the unmanned aerial vehicle, wherein in the process, the image of the expected landing area is continuously tracked, and according to the deviation between the center of the image of the expected landing area and the center of the image currently shot by the unmanned aerial vehicle, when the unmanned aerial vehicle is controlled to fly to the expected landing area through a PID algorithm, the center of the image of the expected landing area is controlled to be at the center of the image currently shot by the unmanned aerial vehicle;
step S34, when the unmanned aerial vehicle flies toward the expected landing area while the flying height is reduced, and when the image of the expected landing area occupies 2/3 of the image currently captured by the unmanned aerial vehicle, taking as a new image of the accurate landing area the region centered on the center of the image of the expected landing area whose length and width are each one half of those of the image of the expected landing area;
step S35, keeping the angle of the camera of the unmanned aerial vehicle vertically downward, continuously tracking the image of the new accurate landing area, and simultaneously reducing the flying height of the unmanned aerial vehicle, wherein in the process, the image of the new accurate landing area is continuously tracked, and according to the deviation between the center of the image of the new accurate landing area and the center of the image currently shot by the unmanned aerial vehicle, when the aircraft is controlled to fly to the new accurate landing area through a PID algorithm, the center of the image of the new accurate landing area is controlled to be at the center of the image currently shot by the unmanned aerial vehicle;
repeating the steps S34 and S35 until the drone lands on the ground.
2. The unmanned aerial vehicle autonomous graded landing method based on visual navigation of claim 1, wherein the step S2 includes:
tracking the image of the expected landing area by using a tracking algorithm combining filtering with a target color histogram, and continuously acquiring the position and size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle.
3. The unmanned aerial vehicle autonomous graded landing method based on visual navigation of claim 1, wherein the step S32 includes:
continuously tracking and acquiring the position and the size of the image of the expected landing area in the image shot by the unmanned aerial vehicle, and continuously adjusting the posture and the position of the unmanned aerial vehicle by using PID control according to the difference between the coordinate of the center of the expected landing area and the coordinate of the center of the image shot by the unmanned aerial vehicle so that the center of the expected landing area is positioned in the center of the image shot by the unmanned aerial vehicle;
then gradually adjusting the camera angle of the unmanned aerial vehicle, and in the process of adjusting the camera angle, continuously controlling the posture and the position of the unmanned aerial vehicle to enable the center of the image of the expected landing area to be located at the center of the image currently shot by the unmanned aerial vehicle until the camera angle is adjusted to be vertically downward;
finally, the camera of the drone is aimed vertically downward at the desired landing area.
4. The unmanned aerial vehicle autonomous graded landing method based on visual navigation of any one of claims 1 to 3, wherein the step S33 includes: keeping the angle of a camera of the unmanned aerial vehicle vertically downward, continuously tracking the image of the expected landing area, acquiring the position and the size of the expected landing area, controlling the unmanned aerial vehicle to start descending, and continuously adjusting the posture and the position of the unmanned aerial vehicle by using PID control according to the difference between the coordinate of the center of the expected landing area and the coordinate of the center of the image currently shot by the unmanned aerial vehicle, so that the center of the image of the expected landing area is at the center of the image currently shot by the unmanned aerial vehicle;
the step S35 includes: keeping the camera angle of the unmanned aerial vehicle vertically downward, continuing to track the image of the new accurate landing area, acquiring the position and size of the image of the new accurate landing area in the image currently captured by the unmanned aerial vehicle, and reducing the flying height of the unmanned aerial vehicle at the same time; according to the deviation between the center of the image of the new accurate landing area and the center of the image currently captured by the unmanned aerial vehicle, using PID control to continuously adjust the attitude and position of the unmanned aerial vehicle so that the center of the image of the new accurate landing area is at the center of the image currently captured by the unmanned aerial vehicle.
5. An unmanned aerial vehicle autonomous graded landing system based on visual navigation, characterized by comprising:
the first module is used for selecting an image of an expected landing area from images shot by the unmanned aerial vehicle at present;
the second module is used for tracking the image of the expected landing area by using an image tracking algorithm and continuously acquiring the position and the size of the image of the expected landing area in the image currently shot by the unmanned aerial vehicle;
the third module is used for controlling the unmanned aerial vehicle to fly to the expected landing area according to the position and the size of the image of the expected landing area in the image shot by the unmanned aerial vehicle at present until the unmanned aerial vehicle lands on the ground;
the third module, comprising:
a third-first module, configured to obtain a deviation between the center of the image of the expected landing area and the center of the image currently captured by the unmanned aerial vehicle according to the position and the size of the image of the expected landing area in the image currently captured by the unmanned aerial vehicle, and, according to that deviation, control the center of the image of the expected landing area to remain at the center of the image currently captured by the unmanned aerial vehicle through a PID algorithm while the unmanned aerial vehicle flies to the expected landing area;
a third-second module, configured to, when the unmanned aerial vehicle flies into the airspace near the expected landing area, adjust the camera angle of the unmanned aerial vehicle until the camera angle is completely vertically downward, wherein in this process the image of the expected landing area is continuously tracked, and, according to the deviation between the center of the image of the expected landing area and the center of the image currently captured by the unmanned aerial vehicle, the center of the image of the expected landing area is controlled through a PID algorithm to remain at the center of the image currently captured by the unmanned aerial vehicle while the unmanned aerial vehicle is controlled to fly to the expected landing area;
a third-third module, configured to keep the camera angle of the unmanned aerial vehicle vertically downward and reduce the flying height of the unmanned aerial vehicle, wherein in this process the image of the expected landing area is continuously tracked, and, according to the deviation between the center of the image of the expected landing area and the center of the image currently captured by the unmanned aerial vehicle, the center of the image of the expected landing area is controlled through a PID algorithm to remain at the center of the image currently captured by the unmanned aerial vehicle while the unmanned aerial vehicle is controlled to fly to the expected landing area;
a third-fourth module, configured to, when the unmanned aerial vehicle flies toward the expected landing area while its flying height is reduced, and when the image of the expected landing area occupies 2/3 of the image currently captured by the unmanned aerial vehicle, take as the new image of the accurate landing area the region centered on the center of the image of the expected landing area whose length and width are each one half of the length and width of the image of the expected landing area;
a third-fifth module, configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the new accurate landing area, and reduce the flying height of the unmanned aerial vehicle, wherein in this process the image of the new accurate landing area is continuously tracked, and, according to the deviation between the center of the image of the new accurate landing area and the center of the image currently captured by the unmanned aerial vehicle, the center of the image of the new accurate landing area is controlled through a PID algorithm to remain at the center of the image currently captured by the unmanned aerial vehicle while the unmanned aerial vehicle is controlled to fly to the new accurate landing area;
and a third-sixth module, configured to repeatedly execute the third-fourth module and the third-fifth module until the unmanned aerial vehicle lands on the ground.
6. The unmanned aerial vehicle autonomous graded landing system based on visual navigation of claim 5, wherein the second module is configured to track the image of the expected landing area by using a tracking algorithm combining filtering with a target color histogram, and to continuously obtain the position and size of the image of the expected landing area in the image currently captured by the unmanned aerial vehicle.
7. The unmanned aerial vehicle autonomous graded landing system based on visual navigation of claim 5, wherein the third-second module is configured to continuously track and acquire the position and size of the image of the expected landing area in the image captured by the unmanned aerial vehicle, and to continuously adjust the attitude and position of the unmanned aerial vehicle by PID control according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the image currently captured by the unmanned aerial vehicle, so that the center of the expected landing area is located at the center of the image currently captured by the unmanned aerial vehicle; then to gradually adjust the camera angle of the unmanned aerial vehicle, continuously controlling the attitude and position of the unmanned aerial vehicle during the adjustment so that the center of the image of the expected landing area remains at the center of the image currently captured by the unmanned aerial vehicle, until the camera angle is vertically downward; finally, the camera of the unmanned aerial vehicle is aimed vertically downward at the expected landing area.
8. The unmanned aerial vehicle autonomous graded landing system based on visual navigation of any one of claims 5 to 7, wherein the third-third module is configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the expected landing area, acquire the position and size of the expected landing area, control the unmanned aerial vehicle to start descending, and continuously adjust the attitude and position of the unmanned aerial vehicle by PID control according to the difference between the coordinates of the center of the expected landing area and the coordinates of the center of the image currently captured by the unmanned aerial vehicle, so that the center of the image of the expected landing area is at the center of the image currently captured by the unmanned aerial vehicle;
the third-fifth module is configured to keep the camera angle of the unmanned aerial vehicle vertically downward, continue to track the image of the new accurate landing area, acquire the position and size of the image of the new accurate landing area in the image currently captured by the unmanned aerial vehicle, and reduce the flying height of the unmanned aerial vehicle at the same time; according to the deviation between the center of the image of the new accurate landing area and the center of the image currently captured by the unmanned aerial vehicle, PID control is used to continuously adjust the attitude and position of the unmanned aerial vehicle so that the center of the image of the new accurate landing area remains at the center of the image currently captured by the unmanned aerial vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811236405.XA CN109460047B (en) | 2018-10-23 | 2018-10-23 | Unmanned aerial vehicle autonomous graded landing method and system based on visual navigation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109460047A CN109460047A (en) | 2019-03-12 |
CN109460047B true CN109460047B (en) | 2022-04-12 |
Family
ID=65608158
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811236405.XA Active CN109460047B (en) | 2018-10-23 | 2018-10-23 | Unmanned aerial vehicle autonomous graded landing method and system based on visual navigation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109460047B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110018692A (en) * | 2019-04-24 | 2019-07-16 | 五邑大学 | A kind of unmanned plane method of guidance, system, device and storage medium |
JP7476660B2 (en) * | 2020-05-19 | 2024-05-01 | マツダ株式会社 | Vehicle-mounted aircraft control system |
JP7363733B2 (en) * | 2020-09-30 | 2023-10-18 | トヨタ自動車株式会社 | Terminal programs, unmanned aerial vehicles, and information processing equipment |
CN113050664A (en) * | 2021-03-24 | 2021-06-29 | 北京三快在线科技有限公司 | Unmanned aerial vehicle landing method and device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101888479A (en) * | 2009-05-14 | 2010-11-17 | 汉王科技股份有限公司 | Method and device for detecting and tracking target image |
CN102096927A (en) * | 2011-01-26 | 2011-06-15 | 北京林业大学 | Target tracking method of independent forestry robot |
CN102831392A (en) * | 2012-07-09 | 2012-12-19 | 哈尔滨工业大学 | Device for remote iris tracking and acquisition, and method thereof |
CN104166854A (en) * | 2014-08-03 | 2014-11-26 | 浙江大学 | Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle |
CN106054903A (en) * | 2016-07-27 | 2016-10-26 | 中南大学 | Multi-rotor unmanned aerial vehicle self-adaptive landing method and system |
CN206411519U (en) * | 2017-01-04 | 2017-08-15 | 四川克瑞斯航空科技有限公司 | A kind of UAS of video control landing |
CN206671898U (en) * | 2017-02-22 | 2017-11-24 | 北京航天军创技术有限公司 | A kind of more rotor Autonomous landing systems of vision guide |
WO2018015959A1 (en) * | 2016-07-21 | 2018-01-25 | Vision Cortex Ltd. | Systems and methods for automated landing of a drone |
Non-Patent Citations (1)
Title |
---|
A visual landing pose estimation method for rotary-wing UAVs; Gao Feifei et al.; Electronics Optics & Control (《电光与控制》); 2017-02-28; Vol. 24, No. 2; pp. 35-38, 80 *
Also Published As
Publication number | Publication date |
---|---|
CN109460047A (en) | 2019-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109460047B (en) | Unmanned aerial vehicle autonomous graded landing method and system based on visual navigation | |
US11120261B2 (en) | Imaging control method and device | |
CN110222581B (en) | Binocular camera-based quad-rotor unmanned aerial vehicle visual target tracking method | |
CN106774431B (en) | Method and device for planning air route of surveying and mapping unmanned aerial vehicle | |
EP3315414B1 (en) | Geo-location or navigation camera, and aircraft and navigation method therefor | |
CN108363946B (en) | Face tracking system and method based on unmanned aerial vehicle | |
US9641810B2 (en) | Method for acquiring images from arbitrary perspectives with UAVs equipped with fixed imagers | |
CN105120146A (en) | Shooting device and shooting method using unmanned aerial vehicle to perform automatic locking of moving object | |
US11073389B2 (en) | Hover control | |
CN108702448B (en) | Unmanned aerial vehicle image acquisition method, unmanned aerial vehicle and computer readable storage medium | |
CA2513514C (en) | Compensation for overflight velocity when stabilizing an airborne camera | |
CN105549605B (en) | A method of it is winged to realize that unmanned plane is stared at | |
CN105487552A (en) | Unmanned aerial vehicle tracking shooting method and device | |
CN106054924A (en) | Unmanned aerial vehicle accompanying method, unmanned aerial vehicle accompanying device and unmanned aerial vehicle accompanying system | |
CN108475075A (en) | A kind of control method, device and holder | |
CN106973221B (en) | Unmanned aerial vehicle camera shooting method and system based on aesthetic evaluation | |
WO2017084240A1 (en) | Target positioning and tracking system, device, and positioning and tracking method | |
CN107783555B (en) | Target positioning method, device and system based on unmanned aerial vehicle | |
CN106094876A (en) | A kind of unmanned plane target locking system and method thereof | |
CN110879617A (en) | Infrared-guided unmanned aerial vehicle landing method and device | |
CN106976561A (en) | A kind of unmanned plane photographic method | |
CN110896331B (en) | Method, device and storage medium for measuring antenna engineering parameters | |
CN106292716A (en) | A kind of rock-climbing tracking system and tracking | |
CN114326771A (en) | Unmanned aerial vehicle shooting route generation method and system based on image recognition | |
WO2019030820A1 (en) | Flying vehicle, flying vehicle control device, flying vehicle control method, and flying vehicle control program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20230707 Address after: Room No. 388, Zhengwei East Road, Jinxi Town, Kunshan City, Suzhou City, Jiangsu Province, 215324 Patentee after: Kunshan Helang Aviation Technology Co.,Ltd. Address before: Room No. 388, Zhengwei East Road, Jinxi Town, Kunshan City, Suzhou City, Jiangsu Province, 215324 Patentee before: Yuneec International (China) Co.,Ltd. |
|
TR01 | Transfer of patent right |