CN110991207B - Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition - Google Patents

Info

Publication number
CN110991207B
CN110991207B (application CN201911136463.XA)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
target
dimensional code
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911136463.XA
Other languages
Chinese (zh)
Other versions
CN110991207A (en)
Inventor
李新
董思远
吴祥雨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University
Priority: CN201911136463.XA
Publication of application: CN110991207A
Application granted; publication of grant: CN110991207B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes

Abstract

The present disclosure provides a UAV precision-landing method that integrates "H" pattern recognition with AprilTag two-dimensional-code recognition. The method includes: making a target identification image, comprising two-dimensional-code tags and a target pattern, and placing it at the target landing point; controlling the UAV to hover above the target landing point and pointing the camera vertically downward so that the target identification image appears in the camera's field of view; and, before and after the UAV descends to a designated height, respectively identifying the UAV's offset relative to the target pattern and the two-dimensional-code tags, judging whether the offset satisfies a preset error range, adjusting the horizontal distance and positive-direction angle, and landing on the target point. The method balances recognition performance during image recognition, solves the wind-disturbance problem during landing, and guarantees high accuracy, stability, and reliability throughout the landing process.

Description

Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
Technical Field
The present disclosure relates to the technical field of UAV precision landing, and in particular to a UAV precision-landing method that integrates "H" pattern recognition with AprilTag two-dimensional-code recognition.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
With the development and popularization of UAV technology, autonomous landing has become a key step toward UAV automation. Precision landing places especially high demands on accuracy, and is correspondingly more difficult.
Before deep-learning-based image recognition, conventional landing methods relied on GPS positioning for assistance. Civil-grade GPS is accurate only to about 10 m, a large error; in areas with many obstructions, such as dense buildings or forests, the GPS error grows and the signal may even be lost, and under malicious interference with satellite-positioning signals the aircraft may land in the wrong place. Professional high-precision GPS equipment is expensive and not economically practical.
Deep-learning image recognition alone, however, cannot solve the problem of the UAV being disturbed by wind. AprilTags is a visual fiducial system suitable for a variety of tasks, including augmented reality, robotics, and camera calibration; its detection software computes the precise 3D position, orientation, and identity of a tag relative to the camera. When the UAV lacks good wind resistance, AprilTag recognition allows it to be fine-tuned against the wind so that the aircraft can hover normally.
Disclosure of Invention
To solve the above problems, the present disclosure provides a UAV precision-landing method integrating "H" pattern recognition and AprilTag two-dimensional-code recognition. By combining deep-learning image recognition with AprilTags, the method reduces the error of landing on civil GPS positioning alone, lightens the load on the mobile device when processing images in real time, and improves landing accuracy; in harsh environments, it further improves the stability and reliability of precision landing by solving the wind-disturbance problem.
To this end, the present disclosure adopts the following technical scheme:
in a first aspect, the present disclosure provides an unmanned aerial vehicle accurate landing method that integrates H pattern recognition and AprilTag two-dimensional code recognition, including:
making a target identification image, and arranging the target identification image at a target landing point, wherein the target identification image comprises a two-dimensional code label and a target graph;
controlling the unmanned aerial vehicle to stop above a target landing point, and adjusting the camera to a downward vertical direction to enable a target recognition image to appear in the visual field range of the camera;
before and after the UAV descends to a designated height, respectively identifying the UAV's offset relative to the target pattern and the two-dimensional-code tags, judging whether the offset satisfies a preset error range, adjusting the horizontal distance and positive-direction angle, and landing on the target landing point.
As some possible implementations, the target pattern includes an "H" tag, a triangle tag, and a circle tag. The target identification image is rectangular and comprises five two-dimensional-code tags placed at the four corners and the center of the rectangle, the "H" tag at the center of the rectangle, the triangle tag above the "H" mark, and the outermost circle tag; the two-dimensional-code tags at the four corners lie outside the circle, and the color of the tags differs from the background color.
As some possible implementations, before the UAV descends to the designated height, image recognition is performed on the target pattern; whether the UAV's horizontal-distance deviation and positive-direction angle deviation relative to the target pattern satisfy a preset first error range is judged, the horizontal distance and positive-direction angle are adjusted, and the UAV descends to the designated height;
after the UAV descends to the designated height, image recognition is performed on the two-dimensional-code tags; the UAV's offset relative to the tags is computed, whether this offset exceeds a preset second error range is judged, and the horizontal distance and positive-direction angle are adjusted until the UAV is over the center point of the target identification image, whereupon it lands on the target landing point.
As some possible implementations, image recognition of the target pattern includes:
presetting a horizontal-distance error value and a positive-direction deviation error value; returning the horizontal-distance deviation from the image shot at the UAV's current position; computing the positive-direction angle deviation from the coordinates of the recognized triangle and "H" patterns; comparing the two deviations with the preset error values; and, if both are within range, executing the landing operation. If the error range is not met, the horizontal position is adjusted with a PID algorithm until it is, after which the landing operation is executed.
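The PID-based horizontal adjustment described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the gains, threshold, and function names are assumptions.

```python
class PID:
    """Textbook PID controller; gains here are illustrative, not from the patent."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def correct_horizontal(offset_px, threshold_px, pid, dt=0.1):
    """Return a horizontal velocity command, or None once the offset is
    within the preset error range (i.e. the UAV may continue descending)."""
    if abs(offset_px) <= threshold_px:
        return None                       # error range satisfied
    return -pid.step(offset_px, dt)       # move opposite to the measured offset
```

In use, the controller would be stepped each frame with the pixel offset returned by recognition, hovering until `correct_horizontal` reports the offset is inside the tolerance.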
As some possible implementations, image recognition of the two-dimensional-code tags includes:
calibrating the camera with a checkerboard method to obtain the camera parameters;
detecting the two-dimensional-code tags in the image shot by the camera;
positioning the UAV relative to the target landing point from the obtained camera parameters and the two-dimensional-code tags, and computing the camera's rotation angle from the tag pose using a matrix-operation library.
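The rotation-angle computation in the last step can be illustrated without a matrix library. The sketch below, a hedged simplification of what the patent delegates to a matrix-operation library, recovers only the yaw (rotation about the camera's optical axis) from a tag's rotation matrix; function names and the yaw-only simplification are assumptions.

```python
import math


def rotation_z(theta):
    """3x3 rotation about the optical (z) axis, as nested lists."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s, c, 0.0],
            [0.0, 0.0, 1.0]]


def yaw_from_rotation(R):
    """Yaw angle (radians) of a rotation matrix: the heading correction the
    UAV would apply about the vertical when the camera looks straight down."""
    return math.atan2(R[1][0], R[0][0])
```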
As possible implementations, in the target-pattern recognition stage the vertical distance is divided into several interval ranges, and the UAV's descent speed is kept consistent across them;
in the two-dimensional-code recognition stage the vertical distance is likewise divided into several interval ranges, and the descent speed decreases as the height decreases.
In a second aspect, the present disclosure provides an unmanned aerial vehicle accurate landing system integrating H pattern recognition and AprilTag two-dimensional code recognition, including,
the image drawing module is used for making a target identification image, and the target identification image comprises a two-dimensional code label and a target graph;
the control module is used for controlling the unmanned aerial vehicle to stop above a target landing point and adjusting the camera to a downward vertical direction so that a target recognition image appears in the visual field range of the camera;
the identification and adjustment module is used for identifying the offset of the unmanned aerial vehicle relative to the target graph and the two-dimensional code label before and after the unmanned aerial vehicle lands at the designated height, judging whether the offset meets a preset error range value, adjusting the horizontal distance and the positive direction angle, and executing landing to a target landing point.
Compared with the prior art, the beneficial effects of the present disclosure are:
The disclosed UAV precision-landing method integrating "H" pattern recognition and AprilTag two-dimensional-code recognition balances recognition performance during image recognition, solves the wind-disturbance problem during landing, and guarantees high accuracy, stability, and reliability throughout the landing process.
The whole landing is divided into several intervals, each with its own error tolerance and descent speed, so that accuracy improves progressively as the height decreases;
at the same time, combining deep-learning image recognition with the advantages of AprilTag tags improves landing accuracy and lightens the mobile device's real-time image-processing load, finally achieving centimeter-level precision landing.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and are not to limit the disclosure.
FIG. 1 is a flow chart of a method of the present disclosure;
FIG. 2 is a schematic illustration of an apron target identification image;
FIG. 3 is a schematic view of a control flow of the precision landing;
FIG. 4 is a disassembled partial apron pattern;
fig. 5 is a schematic diagram of AprilTags two-dimensional code.
Detailed description of embodiments:
the present disclosure is further described with reference to the following drawings and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
Example 1
As shown in fig. 1, the present disclosure provides an unmanned aerial vehicle accurate landing method that integrates H pattern recognition and AprilTag two-dimensional code recognition, including:
the method comprises the following steps: making a target identification image, and arranging the target identification image in a target landing point, namely a parking apron; the target identification image comprises a two-dimensional code label and a target graph;
the target identification image is composed of an outermost circle, a triangle, an 'H' and five two-dimensional code labels. The pattern is black and the background is bright yellow;
wherein, the 'H' mark is positioned in the right center of the image, a triangle is arranged above the 'H' mark, one side marked with the triangle corresponds to the front of the unmanned aerial vehicle, the other side corresponds to the rear of the unmanned aerial vehicle, and two sides of the 'H' shape correspond to two sides of a landing leg of the unmanned aerial vehicle; the color of the parking apron is not unique, the color with strong contrast is selected as much as possible, for example, the parking apron in the embodiment adopts bright yellow, the target identification images of the H-shaped mark, the triangle and the circle adopt black, the detection identification degree of the unmanned aerial vehicle on the positioning groove of the parking box through a camera device in the air is enhanced, and under special conditions, when the unmanned aerial vehicle is required to be manually landed, the design of the obvious parking apron can better assist the aircraft;
one of the five two-dimensional code labels is arranged in the middle of the graph, the other four two-dimensional code labels are arranged at the four corners of the graph respectively, the label sizes at the four corners are smaller than that of the middle label, and the two-dimensional code labels can adopt AprilTags labels. The aprilatas tag in the middle of the apron is a main identification tag, and the tags in the four corners are smaller than those in the middle for two reasons: firstly, provide unmanned aerial vehicle's horizontal offset when solving wind and disturbing, secondly along with the decline of unmanned aerial vehicle height, camera field of vision scope reduces, and the appearance in the field of vision that the label of small size can be complete.
Step 2: control the UAV to hover above the target landing point and point the camera vertically downward so that the target identification image appears in the camera's field of view;
Step 3: before and after the UAV descends to the designated height, respectively identify its offset relative to the target pattern and the two-dimensional-code tags, judge whether the offset satisfies the preset error range, adjust the horizontal distance and positive-direction angle, and land on the target landing point.
The shot images drive a two-stage recognition of the target identification image. In the first stage, the target pattern is recognized; whether the UAV's horizontal-distance deviation and positive-direction angle deviation relative to the pattern satisfy the preset first error range is judged, the horizontal distance and positive-direction angle are adjusted, and the UAV descends to the designated height;
For the first recognition stage, maximum and minimum recognition heights are set and the height range is divided into a first, second, and third interval. The upper limit of the first interval is the maximum recognition height and the lower limit of the third interval is the minimum; the lower limit of the first interval is the upper limit of the second, and the lower limit of the second is the upper limit of the third.
While the UAV is in the first or second interval, the horizontal-distance deviation is returned from the image shot at the current position and compared with the preset horizontal-distance error value; if it is within range, the landing operation is executed. If not, the horizontal position is adjusted with a PID algorithm until the returned deviation is within range, ensuring that the UAV's horizontal offset from the target identification image satisfies the error value at the current height.
A positive-direction deviation error value is preset. While the UAV is in the third interval, the positive-direction angle deviation is computed from the coordinates of the recognized triangle and "H" patterns and compared with the preset value; if within range, the landing operation is executed. If not, the horizontal distance and positive-direction angle are adjusted until the range is satisfied, and the landing operation is then executed.
The descent speed is kept consistent throughout the first recognition stage.
In this embodiment, the first recognition stage is deep-learning image recognition of the circle, triangle, and "H". When the UAV's height is greater than 3 m and less than 20 m it recognizes the pattern shown in fig. 3, and the height is divided into three regions: 20-8 m, 8-5 m, and 5-3 m. Taking, for example, a flat panel with a resolution of 2560x1080 as the apron, the error tolerance in all three regions is 30 px. The permissible offset is larger in the higher regions and shrinks as the height decreases, because the same pixel error value represents a different actual error range at each height; the descent speed is held at about 0.5 m/s.
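The remark that the same 30 px tolerance represents a different actual error range at each height follows directly from the pinhole model; a sketch, with an assumed focal length (the patent does not give one):

```python
def pixel_error_to_metres(error_px, height_m, focal_px):
    """Ground-plane distance covered by a pixel offset at a given height,
    pinhole model: metres per pixel = height / focal length (in pixels)."""
    return error_px * height_m / focal_px
```

With an assumed focal length of 1500 px, a 30 px tolerance is 0.4 m of ground error at 20 m but only 0.06 m at 3 m, so accuracy tightens automatically as the UAV descends.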
When a recognition result is returned, the UAV hovers at the current height, compares the result with the preset horizontal-distance error value, and adjusts its horizontal position with a PID algorithm until the error range is met. In the 5-3 m region the UAV also begins to recognize the triangle, computes an angle from the coordinates returned for the triangle and the "H", and adjusts the positive-direction angle, with the angle tolerance set at 10-30 degrees. It then checks whether the horizontal distance and positive-direction angle obtained from the current image satisfy the error ranges; if so, it executes the landing operation, and if not, it hovers in place and adjusts the horizontal distance and positive-direction angle until the ranges are met.
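The positive-direction angle from the triangle and "H" coordinates can be sketched as below. The sign convention (image y grows downward, the triangle marks the UAV's front) is an assumption for illustration, not specified in the text.

```python
import math


def heading_error_deg(triangle_xy, h_xy):
    """Signed angle, in degrees, between the triangle-to-'H' axis of the
    apron and the image 'up' direction. 0 means the triangle is directly
    'above' the H in the image, i.e. the heading is already aligned."""
    dx = triangle_xy[0] - h_xy[0]
    dy = triangle_xy[1] - h_xy[1]
    # Image coordinates: +y is downward, so 'up' is -dy.
    return math.degrees(math.atan2(dx, -dy))
```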
In the second stage, after the UAV has descended to the designated height, its offset relative to the two-dimensional-code tags is computed; whether this offset exceeds the preset second error range is judged, and the horizontal distance and positive-direction angle are adjusted until the UAV is over the center point of the target identification image, whereupon it lands on the target landing point.
Maximum and minimum recognition heights are set for the second recognition stage; the range below the minimum recognition height of the first stage is designated the fourth interval;
(1) calibrate the camera with a checkerboard method to obtain the camera parameters, including the focal length and distortion parameters;
(2) acquire the identification image containing the two-dimensional-code tags shot by the camera, and detect the tags in it with an image-processing algorithm;
(3) position the UAV relative to the target landing point from the obtained camera parameters and the two-dimensional-code tags, and compute the camera's rotation angle from the tag pose using the Eigen matrix-operation library;
after checkerboard calibration of the camera parameters, the identification image is preprocessed, which includes filtering and denoising the color image;
pixel-gradient clustering is then performed on the preprocessed image, and edges are extracted and fitted into edge lines;
each edge line is assigned a vector pointing from the dark region to the bright region, the lines are connected into a candidate quadrilateral (quad), and the quad is validated and decoded to read the tag;
from the detected tag and its rotation angle, the pose equations are constructed and the pose of the tag's coordinate system in the imaging-plane coordinate system is solved.
The fourth interval may be further divided into several descent regions as needed, but the descent speed decreases as the height decreases. The error ranges preset for the two recognition stages are set to different values at different heights.
In this embodiment, once the UAV descends below 3 m it recognizes the AprilTags tags. This portion of the height is divided into 3-2 m, 2-1 m, and 1-0.5 m, with descent speeds of 0.2 m/s, 0.2 m/s, and 0.1 m/s respectively. Beforehand, the high-definition camera must be calibrated with the checkerboard method to obtain its intrinsic parameters, including the focal length and distortion parameters; the video stream is acquired through the UAV gimbal, and the AprilTags marks in the image are detected and recognized with an image-processing algorithm; the calibrated camera parameters are then combined with the AprilTags marks to position the UAV relative to the apron, and the rotation angle relative to the camera is computed from the returned pose using the Eigen library.
The error range is set within 12 cm and the positive-direction angle tolerance at 10-30 degrees; whether the currently acquired horizontal distance and positive-direction angle satisfy these ranges is then checked. If so, the landing operation is executed; if not, the UAV hovers in place and adjusts the distance and angle until the ranges are met.
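The landing-acceptance check just described can be sketched as a simple predicate; the 12 cm figure follows the text, and taking the strict (10 degree) end of the 10-30 degree band is an assumption.

```python
def ready_to_land(offset_cm, heading_err_deg,
                  max_offset_cm=12.0, max_heading_deg=10.0):
    """True when both the horizontal offset and the positive-direction
    angle error are within the quoted tolerances; otherwise the UAV
    should keep hovering and adjusting."""
    return abs(offset_cm) <= max_offset_cm and abs(heading_err_deg) <= max_heading_deg
```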
Step 4: after executing the landing operation down to the set height, compute the UAV's offset relative to the center point of the target identification image, judge whether it exceeds the preset error range, and adjust the horizontal distance and positive-direction angle until the UAV is over the center point;
from the shot target image, the UAV's offset relative to the center point is computed through the two-dimensional-code tags at the four corners; if the offset exceeds the preset error range, the horizontal position is adjusted with a PID algorithm until the UAV has moved over the center point, and landing is executed.
In this embodiment, suppose the UAV is below 2 m: under suitable conditions, fine adjustment of the aircraft does not require processing the horizontal displacement and pose returned by the AprilTags tags at the four corners of the apron. In theory, when the UAV descends from altitude guided by the returned recognition results, its deviation from the apron stays within a predictable range; if wind has disturbed the fine adjustment and the five AprilTags tags yield UAV coordinates beyond the preset deviation, the UAV can be assumed to have been disturbed by wind or similar factors. To solve the wind-disturbance problem, when the UAV drifts off the middle of the apron, as long as the corner AprilTags tags are still visible in the camera's field of view, the UAV's offset can be computed and the UAV commanded to adjust its horizontal position.
The PID input is recomputed from the difference between the coordinates of a tag in the field of view and the coordinates of the screen center. In particular, if the central AprilTags tag is no longer in the camera's view but one of the corner tags is, the UAV hovers, obtains the current horizontal displacement, and adjusts its horizontal position with the PID algorithm until it has moved back near the central AprilTags tag, then proceeds with the next operation until landing is complete.
Step 5: through the above steps, the horizontal displacement and positive-direction angle are continuously adjusted and different vertical speeds are used according to the height, so that the UAV lands accurately.
Example 2
The present disclosure provides a UAV precision-landing system integrating "H" pattern recognition and AprilTag two-dimensional-code recognition, comprising:
the image drawing module is used for making a target identification image, and the target identification image comprises a two-dimensional code label and a target graph;
the control module is used for controlling the unmanned aerial vehicle to stop above a target landing point and adjusting the camera to a downward vertical direction so that a target recognition image appears in the visual field range of the camera;
the identification and adjustment module is used for identifying the offset of the unmanned aerial vehicle relative to the target graph and the two-dimensional code label before and after the unmanned aerial vehicle lands at the designated height, judging whether the offset meets a preset error range value, adjusting the horizontal distance and the positive direction angle, and executing landing to a target landing point.
The above is merely a preferred embodiment of the present disclosure and is not intended to limit the present disclosure, which may be variously modified and varied by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.
Although the present disclosure has been described with reference to specific embodiments, it should be understood that the scope of the present disclosure is not limited thereto, and those skilled in the art will appreciate that various modifications and changes can be made without departing from the spirit and scope of the present disclosure.

Claims (8)

1. A UAV precision-landing method integrating "H" pattern recognition and AprilTag two-dimensional-code recognition, characterized by comprising:
producing a target identification image and arranging it at a target landing point, the target identification image comprising two-dimensional code tags and a target pattern;
controlling the unmanned aerial vehicle to hover above the target landing point, and adjusting the camera to point vertically downward so that the target identification image appears within the camera's field of view;
before and after the unmanned aerial vehicle descends to a designated height, respectively identifying the offset of the unmanned aerial vehicle relative to the target pattern and the two-dimensional code tags, judging whether the offset satisfies a preset error range, adjusting the horizontal distance and the positive-direction angle, and executing landing to the target landing point;
wherein the target pattern comprises an 'H' tag, a triangle tag, and a circle tag;
the target identification image is rectangular and comprises five two-dimensional code tags arranged at the four corners and the center of the rectangle, the 'H' tag arranged at the center of the rectangle, the triangle tag arranged above the 'H' tag, and the circle tag at the outermost position, wherein the circle tag does not enclose the two-dimensional code tags at the four corners of the rectangle, and the two-dimensional code tags at the four corners are smaller than the two-dimensional code tag at the center;
before the unmanned aerial vehicle descends to the designated height, a horizontal distance error value and a positive-direction deviation error value are preset, the horizontal distance deviation value is returned according to an image captured at the unmanned aerial vehicle's current position, and the positive-direction angle deviation value is calculated from the coordinates of the recognized triangle pattern and 'H' pattern;
after the unmanned aerial vehicle descends to the designated height, the offset of the unmanned aerial vehicle relative to the center point is calculated from the captured target image by recognizing the two-dimensional code tags at the four corners; or, when the central two-dimensional code tag is not within the camera's field of view and only one of the corner tags is visible, the unmanned aerial vehicle hovers, acquires the current horizontal displacement, and performs horizontal position adjustment by a PID algorithm.
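For illustration only (not part of the claims), the positive-direction angle deviation of claim 1 — derived from the coordinates of the recognized triangle and 'H' patterns — could be sketched as follows; the image coordinate convention and the combined threshold check are assumptions:

```python
import math

def yaw_deviation_deg(tri_xy, h_xy):
    """Positive-direction angle deviation from the triangle and 'H' centroids.

    The triangle sits above the 'H' on the landing mark, so the vector from
    the 'H' centroid to the triangle centroid gives the pad's forward
    direction. Image coordinates are assumed x-right, y-down, so the image's
    own forward (up) axis is -y. Returns degrees in (-180, 180]."""
    dx = tri_xy[0] - h_xy[0]
    dy = tri_xy[1] - h_xy[1]
    return math.degrees(math.atan2(dx, -dy))

def within_error_range(offset_xy, yaw_deg, max_offset_px, max_yaw_deg):
    """Check both preset error values before allowing descent."""
    return (math.hypot(offset_xy[0], offset_xy[1]) <= max_offset_px
            and abs(yaw_deg) <= max_yaw_deg)
```

With the triangle directly above the 'H' in the image the deviation is zero; a pad rotated a quarter turn clockwise yields +90 degrees.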
2. The unmanned aerial vehicle precision landing method integrating H pattern recognition and AprilTag two-dimensional code recognition according to claim 1, wherein:
before the unmanned aerial vehicle descends to the designated height, image recognition is performed on the target pattern, whether the horizontal distance deviation value and the positive-direction angle deviation value of the unmanned aerial vehicle relative to the target pattern satisfy a preset first error range is judged, the horizontal distance and the positive-direction angle are adjusted, and descent to the designated height is executed;
after the unmanned aerial vehicle descends to the designated height, image recognition is performed on the two-dimensional code tags, the offset of the unmanned aerial vehicle relative to the two-dimensional code tags is calculated, whether the offset exceeds a preset second error range is judged, and the horizontal distance and the positive-direction angle are adjusted until the unmanned aerial vehicle moves to the center point of the target identification image, whereupon landing to the target landing point is executed.
3. The unmanned aerial vehicle precision landing method integrating H pattern recognition and AprilTag two-dimensional code recognition according to claim 2, wherein the image recognition of the target pattern comprises:
presetting a horizontal distance error value and a positive-direction deviation error value; returning the horizontal distance deviation value according to an image captured at the unmanned aerial vehicle's current position; calculating the positive-direction angle deviation value from the coordinates of the recognized triangle pattern and 'H' pattern; comparing the horizontal distance deviation value and the positive-direction angle deviation value with the preset horizontal distance error value and positive-direction deviation error value, respectively; and, if the error ranges are satisfied, executing the landing operation; if not, performing horizontal position adjustment by a PID algorithm until the error ranges are satisfied, then executing the landing operation.
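As an illustrative sketch of the PID-based horizontal adjustment named in this claim (the gains, time step, and the idealized velocity-command plant below are assumptions, not values from the patent):

```python
class PID:
    """Minimal PID controller for one horizontal axis."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def settle(start_offset_m, tolerance_m, max_steps=200):
    """Drive the horizontal offset toward zero; returns the remaining offset.

    The 'plant' is idealized: the commanded velocity moves the UAV directly
    over one control period, which is enough to show the loop structure."""
    pid = PID(kp=0.8, ki=0.0, kd=0.1, dt=0.1)
    offset = start_offset_m
    for _ in range(max_steps):
        if abs(offset) <= tolerance_m:
            break
        velocity = pid.step(offset)   # command toward the target
        offset -= velocity * pid.dt   # idealized response over one step
    return offset
```

In practice one such controller runs per horizontal axis, and the loop exits (landing proceeds) once the offset enters the preset error range.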
4. The unmanned aerial vehicle precision landing method integrating H pattern recognition and AprilTag two-dimensional code recognition according to claim 2, wherein the image recognition of the two-dimensional code tags comprises:
calibrating the camera by a checkerboard method to obtain the camera parameters;
recognizing the two-dimensional code tags in the image captured by the camera; and
performing relative positioning with respect to the target landing point according to the obtained camera parameters and the two-dimensional code tags, and calculating the rotation angle of the camera from the pose of the two-dimensional code tags using a matrix operation library.
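A minimal sketch of the relative-positioning step, using the pinhole model with the calibrated intrinsics (fx, fy, cx, cy) and the known printed tag size; the parameter names and the size-based depth estimate are illustrative assumptions:

```python
def tag_relative_position(fx, fy, cx, cy, tag_size_m, tag_side_px, tag_center_px):
    """Camera-frame position of the tag from calibrated intrinsics.

    Depth from apparent size (pinhole model): Z = fx * S / s_px, where S is
    the printed tag side length and s_px its apparent side in pixels; the
    lateral offsets then follow from back-projecting the tag's pixel center
    through the principal point (cx, cy)."""
    z = fx * tag_size_m / tag_side_px
    x = (tag_center_px[0] - cx) * z / fx
    y = (tag_center_px[1] - cy) * z / fy
    return x, y, z
```

For example, with fx = 600 px, a 0.2 m tag appearing 120 px wide, and its center at the principal point, the tag lies 1 m straight below the camera.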
5. The unmanned aerial vehicle precision landing method integrating H pattern recognition and AprilTag two-dimensional code recognition according to claim 4, wherein the calibrating of the camera by a checkerboard method comprises:
preprocessing the identification image, the preprocessing comprising filtering and denoising the color image;
performing pixel gradient clustering on the preprocessed image, extracting edges, and fitting edge lines; and
attaching to each edge line a vector pointing from the dark region to the bright region, connecting the edge lines to obtain loops, judging and decoding the loops, and obtaining the camera parameters.
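The gradient step of this pipeline can be illustrated on a grayscale image as below (a pure-Python central-difference sketch; real AprilTag detectors then cluster these gradients into the fitted edge lines the claim describes):

```python
import math

def pixel_gradients(img):
    """Gradient magnitude and direction for each interior pixel.

    img is a row-major list of rows of grayscale intensities. The direction
    points from the dark region toward the bright region, which is the
    vector attached to each fitted edge line in the claim."""
    h, w = len(img), len(img[0])
    grads = {}
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # horizontal central difference
            gy = img[y + 1][x] - img[y - 1][x]   # vertical central difference
            grads[(x, y)] = (math.hypot(gx, gy), math.atan2(gy, gx))
    return grads
```

On a vertical dark-to-bright step edge, the gradient at the edge points in +x with magnitude equal to the intensity step.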
6. The unmanned aerial vehicle precision landing method integrating H pattern recognition and AprilTag two-dimensional code recognition according to claim 4, wherein the pose is obtained by recognizing the two-dimensional code tags and their rotation angles, constructing a pose data equation, and solving for the pose of the two-dimensional code coordinate system within the imaging plane coordinate system.
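The claim solves for the full pose; a much-simplified in-plane version of the rotation part can be sketched from the tag's ordered corners (the corner ordering convention is an assumption, and this recovers only the rotation within the imaging plane, not the complete 3-D pose):

```python
import math

def tag_rotation_deg(corners):
    """In-plane rotation of a detected tag.

    corners: four (x, y) pixel corners ordered top-left, top-right,
    bottom-right, bottom-left in the tag's own frame; the angle of the
    top edge in the image gives the rotation."""
    (x0, y0), (x1, y1) = corners[0], corners[1]
    return math.degrees(math.atan2(y1 - y0, x1 - x0))
```

An axis-aligned tag yields 0 degrees; a tag whose top edge climbs at 45 degrees in the image yields 45.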
7. The unmanned aerial vehicle precision landing method integrating H pattern recognition and AprilTag two-dimensional code recognition according to claim 1, wherein:
in the stage of recognizing the target pattern, the vertical distance is divided into a plurality of interval ranges, and the descent speed of the unmanned aerial vehicle is kept consistent across these interval ranges; and
in the stage of recognizing the two-dimensional code tags, the vertical distance is divided into a plurality of interval ranges, and the descent speed decreases within these interval ranges as the height decreases.
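A sketch of the staged descent profile for the tag-recognition phase; the interval bounds and speeds below are wholly illustrative, not values from the patent:

```python
def descent_speed_m_s(height_m, stages=((5.0, 0.8), (2.5, 0.5), (1.0, 0.2))):
    """Piecewise-constant descent speed that shrinks as height decreases.

    stages: (lower_bound_m, speed_m_s) pairs sorted by descending bound;
    within each interval the speed is constant, and below the last bound a
    small touchdown creep speed is used."""
    for lower_bound, speed in stages:
        if height_m >= lower_bound:
            return speed
    return 0.1
```

The target-pattern phase of claim 7 would instead return the same speed for every interval.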
8. An unmanned aerial vehicle precision landing system integrating H pattern recognition and AprilTag two-dimensional code recognition, characterized by comprising:
an image drawing module for producing a target identification image, the target identification image comprising two-dimensional code tags and a target pattern;
a control module for controlling the unmanned aerial vehicle to hover above a target landing point and adjusting the camera to point vertically downward so that the target identification image appears within the camera's field of view; and
an identification and adjustment module for respectively identifying, before and after the unmanned aerial vehicle descends to a designated height, the offset of the unmanned aerial vehicle relative to the target pattern and the two-dimensional code tags, judging whether the offset satisfies a preset error range, adjusting the horizontal distance and the positive-direction angle, and executing landing to the target landing point; wherein the target pattern comprises an 'H' tag, a triangle tag, and a circle tag;
the target identification image is rectangular and comprises five two-dimensional code tags arranged at the four corners and the center of the rectangle, the 'H' tag arranged at the center of the rectangle, the triangle tag arranged above the 'H' tag, and the circle tag at the outermost position, wherein the circle tag does not enclose the two-dimensional code tags at the four corners of the rectangle, and the two-dimensional code tags at the four corners are smaller than the two-dimensional code tag at the center;
before the unmanned aerial vehicle descends to the designated height, a horizontal distance error value and a positive-direction deviation error value are preset, the horizontal distance deviation value is returned according to an image captured at the unmanned aerial vehicle's current position, and the positive-direction angle deviation value is calculated from the coordinates of the recognized triangle pattern and 'H' pattern; and
after the unmanned aerial vehicle descends to the designated height, the offset of the unmanned aerial vehicle relative to the center point is calculated from the captured target image by recognizing the two-dimensional code tags at the four corners; or, when the central two-dimensional code tag is not within the camera's field of view and only one of the corner tags is visible, the unmanned aerial vehicle hovers, acquires the current horizontal displacement, and performs horizontal position adjustment by a PID algorithm.
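As an illustration of how the offset relative to the center point might be estimated from the corner tags (the tag-ID-to-corner assignment, the rectangular layout dimensions, and the use of pixel units at the current altitude are all assumptions for this sketch):

```python
def center_offset_px(detections, half_w_px, half_h_px, image_center):
    """Estimate the pad-center offset when corner tags are detected.

    detections: {tag_id: (px, py)} pixel centers of detected corner tags;
    tag IDs 0..3 are assumed to sit at top-left, top-right, bottom-right,
    bottom-left of the rectangular mark, at known offsets from its center.
    Each detection yields one estimate of the pad center; the estimates are
    averaged and compared with the image center, which for a nadir-pointing
    camera marks the UAV's position."""
    layout = {0: (-half_w_px, -half_h_px), 1: (half_w_px, -half_h_px),
              2: (half_w_px, half_h_px), 3: (-half_w_px, half_h_px)}
    estimates = [(px - layout[tid][0], py - layout[tid][1])
                 for tid, (px, py) in detections.items()]
    n = len(estimates)
    center_x = sum(e[0] for e in estimates) / n
    center_y = sum(e[1] for e in estimates) / n
    return center_x - image_center[0], center_y - image_center[1]
```

The same function covers the claimed fallback: with only one corner tag in view, the single estimate still locates the pad center for the PID horizontal adjustment.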
CN201911136463.XA 2019-11-19 2019-11-19 Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition Active CN110991207B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911136463.XA CN110991207B (en) 2019-11-19 2019-11-19 Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition


Publications (2)

Publication Number Publication Date
CN110991207A CN110991207A (en) 2020-04-10
CN110991207B true CN110991207B (en) 2021-04-27

Family

ID=70085291

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911136463.XA Active CN110991207B (en) 2019-11-19 2019-11-19 Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition

Country Status (1)

Country Link
CN (1) CN110991207B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111506091A (en) * 2020-05-07 2020-08-07 山东力阳智能科技有限公司 Unmanned aerial vehicle accurate landing control system and method based on dynamic two-dimensional code
CN111397609A (en) * 2020-05-13 2020-07-10 广东星舆科技有限公司 Route planning method, mobile machine and computer readable medium
CN111993421B (en) * 2020-08-11 2022-02-08 苏州瑞得恩光能科技有限公司 Connection system and connection method
CN114647253B (en) * 2020-12-17 2024-04-09 北京三快在线科技有限公司 Unmanned aerial vehicle light-shielding landing method, unmanned aerial vehicle landing platform, unmanned aerial vehicle hangar and building
CN112731442B * 2021-01-12 2023-10-27 桂林航天工业学院 Adjustable surveying instrument for unmanned aerial vehicle surveying and mapping
CN112947526B (en) * 2021-03-12 2022-09-27 华中科技大学 Unmanned aerial vehicle autonomous landing method and system
CN113220020B (en) * 2021-04-30 2023-10-31 西安鲲鹏易飞无人机科技有限公司 Unmanned aerial vehicle task planning method based on graphic labels
CN113593057A (en) * 2021-06-28 2021-11-02 西安坤斗科技有限责任公司 In-road parking space management method based on unmanned aerial vehicle routing inspection
CN113655806B (en) * 2021-07-01 2023-08-08 中国人民解放军战略支援部队信息工程大学 Unmanned aerial vehicle group auxiliary landing method
TWI829005B (en) * 2021-08-12 2024-01-11 國立政治大學 High-altitude positioning center setting method and high-altitude positioning flight control method
CN113821047A (en) * 2021-08-18 2021-12-21 杭州电子科技大学 Unmanned aerial vehicle autonomous landing method based on monocular vision
CN113741496A (en) * 2021-08-25 2021-12-03 中国电子科技集团公司第五十四研究所 Autonomous accurate landing method and landing box for multi-platform unmanned aerial vehicle
CN113955136B * 2021-09-02 2024-04-05 浙江图盛输变电工程有限公司温州科技分公司 Target hanging-point calibration transfer station for automatic unmanned aerial vehicle power grid inspection
CN113534833A (en) * 2021-09-17 2021-10-22 广东汇天航空航天科技有限公司 Visual tag for autonomous landing of aircraft, autonomous landing method and aircraft
CN113759943A (en) * 2021-10-13 2021-12-07 北京理工大学重庆创新中心 Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system
CN114200954B (en) * 2021-10-28 2023-05-23 佛山中科云图智能科技有限公司 Unmanned aerial vehicle landing method and device based on Apriltag, medium and electronic equipment
CN114115318B (en) * 2021-12-01 2023-03-17 山东八五信息技术有限公司 Visual method for unmanned aerial vehicle to land on top of moving vehicle
CN114326757A (en) * 2021-12-03 2022-04-12 国网智能科技股份有限公司 Precise landing control method and system for unmanned aerial vehicle
CN114527792A (en) * 2022-01-25 2022-05-24 武汉飞流智能技术有限公司 Unmanned aerial vehicle landing guiding method, device, equipment and storage medium
CN114782841B (en) * 2022-04-21 2023-12-15 广州中科云图智能科技有限公司 Correction method and device based on landing pattern
CN115924157A (en) * 2022-12-07 2023-04-07 国网江苏省电力有限公司泰州供电分公司 Unmanned aerial vehicle single-person operation equipment capable of accurately landing and using method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105652887A (en) * 2016-03-22 2016-06-08 临沂高新区翔鸿电子科技有限公司 Unmanned aerial vehicle landing method adopting two-level graph recognition
US9663234B1 (en) * 2015-08-26 2017-05-30 Amazon Technologies, Inc. Aerial package delivery system
CN107399440A (en) * 2017-07-27 2017-11-28 北京航空航天大学 Aircraft lands method and servicing unit
CN108873943A (en) * 2018-07-20 2018-11-23 南京奇蛙智能科技有限公司 A kind of image processing method that unmanned plane Centimeter Level is precisely landed
CN108919830A (en) * 2018-07-20 2018-11-30 南京奇蛙智能科技有限公司 A kind of flight control method that unmanned plane precisely lands
CN110047109A (en) * 2019-03-11 2019-07-23 南京航空航天大学 A kind of camera calibration plate and its recognition detection method based on self-identifying label

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110239677A (en) * 2019-06-21 2019-09-17 华中科技大学 A kind of unmanned plane autonomous classification target simultaneously drops to the method on the unmanned boat of movement

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9663234B1 (en) * 2015-08-26 2017-05-30 Amazon Technologies, Inc. Aerial package delivery system
CN105652887A (en) * 2016-03-22 2016-06-08 临沂高新区翔鸿电子科技有限公司 Unmanned aerial vehicle landing method adopting two-level graph recognition
CN107399440A (en) * 2017-07-27 2017-11-28 北京航空航天大学 Aircraft lands method and servicing unit
CN108873943A (en) * 2018-07-20 2018-11-23 南京奇蛙智能科技有限公司 A kind of image processing method that unmanned plane Centimeter Level is precisely landed
CN108919830A (en) * 2018-07-20 2018-11-30 南京奇蛙智能科技有限公司 A kind of flight control method that unmanned plane precisely lands
CN110047109A (en) * 2019-03-11 2019-07-23 南京航空航天大学 A kind of camera calibration plate and its recognition detection method based on self-identifying label

Also Published As

Publication number Publication date
CN110991207A (en) 2020-04-10

Similar Documents

Publication Publication Date Title
CN110991207B (en) Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN110989661B (en) Unmanned aerial vehicle accurate landing method and system based on multiple positioning two-dimensional codes
CN109270953B (en) Multi-rotor unmanned aerial vehicle autonomous landing method based on concentric circle visual identification
CN107240063B (en) Autonomous take-off and landing method of rotor unmanned aerial vehicle facing mobile platform
CN110222612B (en) Dynamic target identification and tracking method for autonomous landing of unmanned aerial vehicle
CN108305264B A kind of unmanned plane precision landing method based on image processing
CN106371447A (en) Controlling method for all-weather precision landing of unmanned aerial vehicle
CN109885086B (en) Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance
CN110221625B (en) Autonomous landing guiding method for precise position of unmanned aerial vehicle
CN102538782B (en) Helicopter landing guide device and method based on computer vision
CN106197422A (en) A kind of unmanned plane based on two-dimensional tag location and method for tracking target
CN106054929A (en) Unmanned plane automatic landing guiding method based on optical flow
CN107063261B (en) Multi-feature information landmark detection method for precise landing of unmanned aerial vehicle
CN106502257B (en) Anti-interference control method for precise landing of unmanned aerial vehicle
CN114415736B (en) Multi-stage visual accurate landing method and device for unmanned aerial vehicle
CN109613926A High-precision automatic landing-zone identification method for automatic landing of multi-rotor unmanned aerial vehicles
CN110879617A (en) Infrared-guided unmanned aerial vehicle landing method and device
CN111221343A (en) Unmanned aerial vehicle landing method based on embedded two-dimensional code
CN109839945A (en) Unmanned plane landing method, unmanned plane landing-gear and computer readable storage medium
Chiu et al. Vision-only automatic flight control for small UAVs
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
Liu et al. Sensor fusion method for horizon detection from an aircraft in low visibility conditions
CN107576329B (en) Fixed wing unmanned aerial vehicle landing guiding cooperative beacon design method based on machine vision
CN117058231A (en) Split type aerocar positioning and docking method based on visual depth information
Lee et al. Safe landing of drone using AI-based obstacle avoidance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant