CN109885086A - UAV vertical landing method guided by a composite polygon marker - Google Patents

UAV vertical landing method guided by a composite polygon marker

Info

Publication number
CN109885086A
CN109885086A (application CN201910179487.7A); granted as CN109885086B
Authority
CN
China
Prior art keywords
UAV
contour
image
polygon
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910179487.7A
Other languages
Chinese (zh)
Other versions
CN109885086B (en)
Inventor
沈沛意
张亮
上官木天
朱光明
宋娟
鲍珂
谷佳铭
曹彦琛
王旭东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University
Priority to CN201910179487.7A
Publication of CN109885086A
Application granted
Publication of CN109885086B
Active legal status
Anticipated expiration


Landscapes

  • Image Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to the fields of UAVs and machine vision, and specifically to a UAV vertical landing method guided by a composite polygon marker, comprising the following steps: (1) setting up the composite polygon marker; (2) the UAV pre-processes the captured image of the landing zone; (3) the UAV obtains contour information from the binarized edge image; (4) the UAV screens the contour information of the binarized edge image; (5) the UAV calculates the pixel side lengths and center-point coordinates of the polygon contours; (6) the UAV determines the contour combination mode and recognizes the composite polygon marker; (7) the UAV calculates its coordinates relative to the marker; (8) UAV landing control; (9) UAV landing adjustment. The invention enables a UAV to lock onto, track, and accurately land vertically on a specific target.

Description

UAV vertical landing method guided by a composite polygon marker
Technical field
The present invention relates to the fields of UAVs and machine vision, and specifically to a UAV vertical landing method guided by a composite polygon marker, which can be used for autonomous UAV landing at a fixed location.
Background art
A UAV is a technology platform that can quickly reach a location by air, and is widely applied in fields such as power-line inspection, environmental monitoring, geographic remote sensing, surveying and mapping, traffic guidance, and communication relay. UAVs offer low operating cost, technical flexibility, and no need for personnel to reach the scene directly, and can perform vertical takeoff and landing, spot hovering, position locking, fixed-location patrol, and similar functions. With the development and popularization of UAV technology, each application field and industry has placed many deeper requirements on UAV technical details and functions; among these, fully automated or programmed operation that frees the UAV from dependence on a human operator is an important direction for future UAV technology. In particular, the demand for autonomous UAV landing and recovery is growing.
Among existing autonomous landing technologies, most UAVs are positioned and navigated by satellite positioning systems, while some are positioned and navigated by machine vision. The flight and positioning accuracy of a UAV that relies on satellite positioning is affected by factors such as satellite-system precision and signal strength, which introduces considerable error into localization and navigation and causes great difficulty and uncertainty in applications such as fixed-point inspection, tracking, and fixed-point landing. Civilian satellite positioning is only accurate to within about 10 meters, which during descent cannot meet the requirements of a UAV precision landing. For UAVs positioned and navigated by machine vision, recognition is affected by factors such as marker size, shape, and color, and is easily disturbed by similar objects in the external environment, so the UAV cannot satisfy practical landing requirements.
Therefore, in view of the above situation, there is an urgent need to develop a UAV vertical landing method guided by a composite polygon marker, to overcome the shortcomings of current practical applications.
Summary of the invention
The purpose of the present invention is to provide a UAV vertical landing method guided by a composite polygon marker, aiming to solve the problems that, in the prior art, satellite positioning is affected by satellite signal strength and satellite count, leaving the position uncertain and subject to possibly large error so that accurate landing cannot be achieved, and that vision-based methods are disturbed by properties of the marker itself and by the external environment.
To achieve the above object, the present invention provides the following technical scheme:
A UAV vertical landing method guided by a composite polygon marker, comprising the following steps:
(1) Setting up the composite polygon marker:
A satellite positioning system guides the UAV to the airspace above the landing zone, where a composite polygon marker is arranged in the UAV's landing zone. The marker consists of three concentric regular polygons nested in order from inside to outside, with adjacent polygons differing in color;
(2) The UAV pre-processes the captured landing-zone image:
The UAV photographs the landing zone containing the composite polygon marker with its onboard camera, and the onboard computer applies a mean filter to the captured image to obtain a denoised landing-zone image;
(3) The UAV obtains contour information from the binarized edge image:
3a) the UAV performs edge detection on the landing-zone image with the onboard computer to obtain a binarized edge image of the landing zone;
3b) the UAV extracts and saves the contour information in the binarized edge image, including the number of corner points and their coordinates;
(4) The UAV screens the contour information of the binarized edge image:
The onboard computer screens the extracted contour information, selecting the contours whose number of corner points matches the number of sides of a polygon in the composite marker, and obtains the polygon contours that match the three polygons of the composite marker;
(5) The UAV calculates the pixel side lengths and center-point coordinates of the polygon contours:
From the corner-point coordinates of each polygon contour, the onboard computer obtains the pixel length of each polygon side and the pixel coordinates of the polygon center point;
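The side-length and center computation of step (5) reduces to simple geometry on the corner points. The sketch below is an illustrative reconstruction, not the patent's code; the function names and the regularity tolerance are assumptions:

```python
import math

def side_lengths(vertices):
    """Pixel length of each side of a polygon, given its corner points in order."""
    n = len(vertices)
    return [math.dist(vertices[i], vertices[(i + 1) % n]) for i in range(n)]

def is_regular(vertices, tol=0.1):
    """Check that all sides are equal within a relative tolerance (assumed value)."""
    lengths = side_lengths(vertices)
    mean = sum(lengths) / len(lengths)
    return all(abs(l - mean) <= tol * mean for l in lengths)

def vertex_center(vertices):
    """Center point as the mean of the corner coordinates."""
    xs = [p[0] for p in vertices]
    ys = [p[1] for p in vertices]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(side_lengths(square))   # four sides of length 10.0
print(is_regular(square))     # True
print(vertex_center(square))  # (5.0, 5.0)
```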
(6) The UAV determines the contour combination mode and recognizes the composite polygon marker:
The UAV obtains its current altitude from the flight controller's flight parameters, determines the polygon-contour combination mode according to that altitude, and performs composite-marker recognition to obtain the recognized polygon-contour combination;
(7) The UAV calculates its coordinates relative to the marker:
Using the pixel side lengths and center-point pixel coordinates of the polygon-contour combination in the binarized edge image, the onboard computer calculates the relative coordinates (X, Y, H) of the UAV and the marker; X and Y are the horizontal coordinates of the UAV relative to the marker, and H is the current height of the UAV above the marker;
(8) UAV landing control:
The UAV takes the relative coordinates obtained in step (7), compares them with the coordinates of the center point of the polygon-contour combination to compute a desired displacement, and passes the desired displacement to the flight controller, which produces rotor angular-velocity outputs through PID control to perform landing control;
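Step (8) maps the desired displacement to rotor commands via PID control. The patent does not specify the controller structure or gains, so the following is only a minimal discrete PID sketch with arbitrary illustrative gains, one such controller per axis:

```python
class PID:
    """Minimal discrete PID controller; the gains here are illustrative, not from the patent."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, error):
        """One control step: returns the command for the current position error."""
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# One controller per axis (X, Y, and altitude H) would be run at the control rate.
pid_x = PID(kp=0.5, ki=0.0, kd=0.1, dt=0.05)
print(pid_x.step(2.0))  # first step has no derivative history: P term only -> 1.0
```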
(9) UAV landing adjustment:
The UAV first adjusts its pose horizontally until its center-point coordinates coincide with the marker-pattern center in the horizontal plane, and then lands.
As a further scheme of the present invention: in the composite polygon marker described in step (1), the three regular polygons therein may be the same kind of polygon or different kinds.
As a further scheme of the present invention: the edge detection performed on the landing-zone image by the onboard computer in step (3) uses the Canny edge-detection method, implemented as follows:
3a1) convert the landing-zone image to a grayscale image, and adjust the two parameters, a low threshold and a high threshold, according to the geometric characteristics of the grayscale image to segment the image;
3a2) apply the Canny edge-detection operator to the segmented image to obtain an image containing only binarized edge information.
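The role of the two thresholds in 3a1) is the standard Canny double-threshold rule: pixels whose gradient magnitude exceeds the high threshold are strong edges, pixels between the two thresholds are kept only as weak candidates (retained if connected to a strong edge), and the rest are suppressed. In practice this whole step would be a single library call such as OpenCV's `cv2.Canny(gray, low, high)`; the sketch below only illustrates the classification rule itself:

```python
def double_threshold(magnitudes, low, high):
    """Classify gradient magnitudes the way Canny's double threshold does."""
    labels = []
    for m in magnitudes:
        if m >= high:
            labels.append("strong")      # definite edge pixel
        elif m >= low:
            labels.append("weak")        # kept only if connected to a strong edge
        else:
            labels.append("suppressed")  # discarded as noise
    return labels

print(double_threshold([10, 60, 120], low=50, high=100))
# ['suppressed', 'weak', 'strong']
```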
As a further scheme of the present invention: the contour-information extraction described in step (4) is implemented as follows:
4a) according to the UAV imaging error that affects the precision of contour extraction, the extraction accuracy is the larger of 0.004 times the polygon perimeter and 5 pixels;
4b) according to this contour-extraction accuracy value, closed edge-segment sets that fit within the accuracy value are taken as contour information.
As a further scheme of the present invention: the contour extraction described in step (4) yields closed-contour information, and contours whose area is less than M pixels, as well as concave contours, are removed, where 2 ≤ M ≤ 20.
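The small-area and concavity filter above can be sketched with the shoelace formula and a cross-product convexity test. This is an illustrative reconstruction (in an OpenCV pipeline one would use `cv2.contourArea` and `cv2.isContourConvex`); the function names are assumptions:

```python
def shoelace_area(vertices):
    """Absolute polygon area in pixels via the shoelace formula."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def is_convex(vertices):
    """True if the cross products of all consecutive edge pairs share one sign."""
    n = len(vertices)
    signs = set()
    for i in range(n):
        ax, ay = vertices[i]
        bx, by = vertices[(i + 1) % n]
        cx, cy = vertices[(i + 2) % n]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        if cross != 0:
            signs.add(cross > 0)
    return len(signs) <= 1

def keep_contour(vertices, min_area=10):
    """The patent's filter: drop contours smaller than M pixels (here M=10) and concave ones."""
    return shoelace_area(vertices) >= min_area and is_convex(vertices)

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
arrow = [(0, 0), (4, 0), (2, 1), (0, 4)]   # concave and small
print(keep_contour(square))  # True
print(keep_contour(arrow))   # False
```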
As a further scheme of the present invention: the center-point pixel coordinates (xo, yo) of a polygon contour described in step (7) are calculated by formulas (1) and (2):
xc = M10 / M00  (1)
yc = M01 / M00  (2)
where M00 = ΣxΣy I(x, y), M10 = ΣxΣy x·I(x, y), M01 = ΣxΣy y·I(x, y), I(x, y) is the pixel value at pixel coordinate (x, y) in the binary image of the current contour, x and y range over the contour region, and xc, yc are the computed coordinates; (xc, yc) is taken as the contour center point (xo, yo).
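Formulas (1)-(2) are the standard image-moment centroid. A sketch over a binary mask stored as a list of rows (OpenCV's `cv2.moments` computes the same M00, M10, M01 in practice; the function name here is an assumption):

```python
def moment_centroid(mask):
    """Centroid (xc, yc) of a binary mask from moments M00, M10, M01 (formulas (1)-(2))."""
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            m00 += v        # M00: total mass
            m10 += x * v    # M10: first moment in x
            m01 += y * v    # M01: first moment in y
    return m10 / m00, m01 / m00

# A 3x3 filled block centered at pixel (2, 1) inside a 5x3 image.
mask = [
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
]
print(moment_centroid(mask))  # (2.0, 1.0)
```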
As a further scheme of the present invention: in the determination of the polygon-contour combination according to the current altitude and the composite-marker recognition described in step (7), when the altitude is within the range at which the onboard computer can fully distinguish the two outer polygons of the marker, polygon contours matching the two outer polygons are selected as the combination figure; when the altitude is within the range at which the onboard computer can fully recognize the two inner polygons of the marker, polygon contours matching the two inner polygons are selected as the combination figure. When the combination figure matches the combination mode of the marker, it is recognized as the composite polygon marker.
As a further scheme of the present invention: in the calculation of the current UAV position coordinates described in step (8), the camera focal length is f, the pixel length of an inner polygon side in the photographed combination figure is h, the actual side length of the marker figure is L, the relative altitude is H, and the size scaling ratio of the image is d; the relative altitude H is then derived by formula (3):
H = f × L / h  (3)
The horizontal pixel offset between the onboard camera center point (xi, yi) and the marker center (xo, yo) is x = xi − xo, and the vertical pixel offset is y = yi − yo; the actual horizontal relative coordinates of the UAV, (X, Y), are derived by formula (4):
X = x × d;  Y = y × d  (4)
where d is the size scaling ratio of the image, and the camera focal length f is calculated from the camera's lateral focal length fx and longitudinal focal length fy, obtained by calibrating the camera; the formula (5) for the camera focal length f is:
f = (fx + fy) / 2  (5).
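Formulas (3)-(5) can be combined into one small routine: the pinhole relation h = f·L/H gives the altitude, and d = L/h converts pixel offsets to meters. A sketch under assumed calibration numbers (the function name and all values are illustrative, not from the patent):

```python
def relative_position(fx, fy, L, h, cam_center, marker_center):
    """Relative coordinates (X, Y, H) from formulas (3)-(5).

    fx, fy: calibrated lateral/longitudinal focal lengths in pixels (formula (5))
    L: actual marker side length in meters; h: its length in the image, in pixels
    """
    f = (fx + fy) / 2            # formula (5)
    d = L / h                    # size scaling ratio: meters per pixel
    H = f * L / h                # formula (3): relative altitude
    x = cam_center[0] - marker_center[0]
    y = cam_center[1] - marker_center[1]
    return x * d, y * d, H       # formula (4)

# Illustrative numbers: a 0.45 m pentagon side seen as 90 px with f = 600 px.
X, Y, H = relative_position(600, 600, L=0.45, h=90,
                            cam_center=(320, 240), marker_center=(300, 260))
print(X, Y, H)  # X ≈ 0.1 m, Y ≈ -0.1 m, H = 3.0 m
```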
As a further scheme of the present invention: the desired displacement described in step (9) is calculated from the coordinates of the pattern center point as S = √(X² + Y²), where X and Y are the actual relative coordinate values of the UAV.
As a further scheme of the present invention: if no pattern meeting the requirements is found by screening, steps (2), (3), and (4) are repeated; if the marker pattern is not found, the UAV hovers in place for t seconds, 1 ≤ t ≤ 5, while performing marker detection; if the marker is still not found within the t seconds of hovering, the UAV climbs vertically while continuing marker detection; the maximum climb altitude is the maximum distance at which the UAV can recognize the outer polygon of the marker.
Compared with the prior art, the beneficial effects of the present invention are:
1) By placing a board printed with a special figure at the target position and applying machine-vision algorithms, feature extraction and computation of centroid, side length, and ratios are performed on the images acquired by the camera, so that the relative altitude and relative position of the UAV are obtained, enabling the UAV to lock onto, track, and accurately land vertically on a specific target;
2) The pattern combining special figures reduces interference from the external environment and improves recognition accuracy, while also reducing the complexity of the image-processing algorithm;
3) Recognizing patterns of different sizes at different distances improves the precision with which the UAV recognizes the marker at high altitude, and also solves the problem that the marker cannot be recognized after the UAV descends to low altitude.
Brief description of the drawings
Fig. 1 is a flowchart of the UAV's descent process.
Fig. 2 shows the composite polygon marker board.
Specific embodiments
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Figs. 1-2, in an embodiment of the present invention, a UAV vertical landing method guided by a composite polygon marker comprises the following steps:
Step 1) According to the satellite-positioning information of the designated landing zone, the satellite positioning system guides the UAV to the airspace above the landing zone; the UAV then descends and hovers 10-15 m above the ground;
Here the landing zone is the region marked with the polygon landing marker. As shown in Fig. 2, the marker consists of a regular hexagon, a regular pentagon, and an equilateral triangle centered on the same point. The hexagon has the largest area, followed by the pentagon, with the triangle smallest; each hexagon side is 60 cm, each pentagon side 45 cm, and each triangle side 26 cm. The hexagon and triangle are filled black, and the pentagon is filled white.
Step 2) The onboard camera captures an image, which is pre-processed with a mean filter; the Canny edge-detection method then extracts the edge information in the image, yielding an image containing only binarized edge information;
Step 3) Contour extraction is performed on the resulting edge image, extracting all contours, both inner and outer rings, each independent of the others; the corner-point coordinate information on each contour is extracted and saved;
Here, contour extraction yields closed-contour information, and contours whose area is less than 10 pixels, as well as concave contours, are removed.
Step 4) The extracted closed-contour information is screened to filter out the regular-hexagon, regular-pentagon, and equilateral-triangle contours, and the side lengths and center-point coordinates of the polygons are combined according to altitude. At altitudes of 4 m and above, the figure combination is the hexagon and pentagon, and the marker side length is the pentagon side length; at altitudes from 1 m to 4 m, the figure combination is the pentagon and triangle, and the marker side length is the triangle side length. A pattern that satisfies the figure-combination conditions is recognized as the marker;
Here, the hexagon, pentagon, and triangle are screened by the number of corner points of each contour extracted in step 3), which are 6, 5, and 3 respectively. There are two conditions for judging whether a contour is a regular polygon: one is whether the side lengths of the contour are consistent; the other is whether the distances from the contour center point to each side are equal. The contour center point (xo, yo) is calculated by formulas (1) and (2):
xc = M10 / M00  (1)
yc = M01 / M00  (2)
where M00 = ΣxΣy I(x, y), M10 = ΣxΣy x·I(x, y), M01 = ΣxΣy y·I(x, y), I(x, y) is the pixel value at pixel coordinate (x, y) in the binary image of the current contour, and x and y range over the contour region; xc and yc are the computed coordinates, and (xc, yc) is taken as the contour center point (xo, yo).
The condition for a figure combination is that the center points of the two figures coincide. When the hexagon is combined with the pentagon, the hexagon side length is greater than the pentagon side length and the pentagon pixel region lies inside the hexagon pixel region; when the pentagon is combined with the triangle, the pentagon side length is greater than the triangle side length and the triangle pixel region lies inside the pentagon pixel region. A detected pattern that satisfies the figure-combination mode is considered the marker;
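The altitude-dependent choice of figure combination in this embodiment can be sketched as a small selector. The function name and return convention are illustrative only; the thresholds and side lengths come from the embodiment (hexagon + pentagon at 4 m and above, pentagon + triangle from 1 m to 4 m):

```python
def figure_combination(altitude_m):
    """Polygon pair to match at the given altitude, plus the marker side length used (m)."""
    if altitude_m >= 4.0:
        return ("hexagon", "pentagon"), 0.45   # marker side = pentagon side (45 cm)
    if altitude_m >= 1.0:
        return ("pentagon", "triangle"), 0.26  # marker side = triangle side (26 cm)
    return None  # below 1 m the embodiment specifies no combination

print(figure_combination(10.0))  # (('hexagon', 'pentagon'), 0.45)
print(figure_combination(2.5))   # (('pentagon', 'triangle'), 0.26)
```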
If no pattern meeting the requirements is found by screening, steps 2), 3), and 4) are repeated; if the marker pattern is not found, the UAV hovers in place for 3 seconds while performing marker detection; if the marker is still not found within the 3 seconds of hovering, the UAV climbs and continues marker detection, with a maximum climb altitude of 15 m.
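The hover-then-climb search policy above can be summarized as a small decision function; the name, the action strings, and the ceiling behavior are assumptions for illustration:

```python
def detection_action(marker_found, hover_elapsed_s, altitude_m,
                     hover_timeout_s=3.0, max_altitude_m=15.0):
    """Next action per the embodiment's search policy: descend once the marker is
    found; otherwise hover up to the timeout, then climb toward the ceiling."""
    if marker_found:
        return "descend"
    if hover_elapsed_s < hover_timeout_s:
        return "hover"
    if altitude_m < max_altitude_m:
        return "climb"
    return "hover"  # at the recognition ceiling, keep searching in place

print(detection_action(True, 0.0, 10.0))    # descend
print(detection_action(False, 1.0, 10.0))   # hover
print(detection_action(False, 4.0, 10.0))   # climb
print(detection_action(False, 4.0, 15.0))   # hover
```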
Step 5) The pixel side length and center-point pixel coordinates of the marker in the current frame are used to calculate the current UAV position coordinates (X, Y, H); X and Y are the horizontal coordinates of the UAV relative to the marker, and H is the relative altitude of the UAV above the marker;
Here the camera focal length is f, the pixel length of the inner polygon in the photographed combination pattern is h, the actual side length of the marker is L, the relative altitude is H, and the size scaling ratio of the image is d; the relative altitude H is derived by formula (3):
H = f × L / h  (3)
The horizontal pixel offset between the onboard camera center point (xi, yi) and the marker center (xo, yo) is x = xi − xo, and the vertical pixel offset is y = yi − yo; the actual horizontal relative coordinates (X, Y) are then derived by formula (4):
X = x × d;  Y = y × d  (4)
where d is the size scaling ratio of the image, and the camera focal length f is calculated from the camera's lateral focal length fx and longitudinal focal length fy, obtained by calibrating the camera. The formula (5) for the camera focal length f is:
f = (fx + fy) / 2  (5)
Step 6) The UAV position coordinates obtained in step 5) are compared with the coordinates of the pattern center point to compute the desired displacement, which is passed to the flight controller; the flight controller produces rotor angular-velocity outputs through PID control to perform landing control;
Step 7) The UAV first adjusts its pose horizontally until its center-point coordinates coincide with the marker-pattern center in the horizontal plane, and then lands.
The concrete application scheme is as follows:
1) The satellite positioning system guides the UAV to the airspace above the landing zone;
2) The camera-captured image is pre-processed with a mean filter, and the edge information in the image is extracted with the Canny edge-detection method to obtain the binarized edge image;
3) Contour extraction is performed on the resulting edge image, extracting all contours, and the corner-point coordinate information on each contour is saved;
4) The contour information is screened to obtain the hexagon, pentagon, and triangle, and the figures are combined according to the recognition conditions and combination mode;
5) The pixel side length and center-point pixel coordinates of the marker in the current frame are used to calculate the current UAV position coordinates (X, Y, H);
6) UAV landing is controlled according to the UAV position coordinates;
7) The UAV is adjusted in the horizontal direction and then lands vertically.
In conclusion the present invention passes through computer vision algorithms make, using the pattern with special geometric figure as unmanned plane The marker of landing guides, and camera acquired image is carried out feature extraction and side length, the ratio at center calculate, from And relative altitude and the relative position of unmanned plane and marker are obtained, then by flow operations of specifically landing, realize nobody Machine is to functions such as the detection of specific objective, tracking and precisely vertical landing.
Practical UAV flight landing test:
This experiment tests the UAV flying a planned route and, after reaching the designated landing point, performing the tasks of target recognition and landing. The UAV's flight controller communicates with a ground control station on a laptop, and the laptop remotely controls the UAV's onboard computer over WiFi. While executing the landing, the flight controller also keeps communicating with the onboard computer over a serial link. Inter-process communication is used between the image-processing process and the data-transfer process; the flight-control side mainly uses five commands: start, stop, end, read_imu, and write_dis. The yaw-angle data measured by the UAV itself is passed to the onboard computer for the image-processing process, and the onboard computer passes the UAV's depth information and NED coordinates to the flight-control side. To keep UAV communication unobstructed during the flight segment, the flight altitude is set to 2 m; to prevent an out-of-control UAV from harming people or objects, the UAV speed is set to 0.3 m/s. The landing site is an unobstructed circular area of 2 m radius centered on the landing point, with no objects nearby; the ground is a red plastic surface, convenient for recognizing the target object. The recognized object is a 13.8 cm × 14 cm marker.
The UAV first flies the planned route, whose end point is the landing point; there the UAV hovers and waits for the landing command to be issued. After the landing command is executed, the UAV hovers at the northeast corner of the target object; after a period of hovering adjustment it accurately recognizes the target and descends, the descent trajectory running from the northeast corner of the target toward the southwest corner. During descent the UAV adjusts its pose according to its relative distance to the target object. The UAV's landing point is 0.92 m from the target object, within the normal landing range.
The on-site landing test shows that after the UAV recognizes the object, the error of the completed landing is within 1 m, meeting the requirement for pinpoint UAV landing. Analysis of the error sources: combining the coordinate data from the onboard computer with the UAV flight video shows a delay between the data sent by the onboard computer and the data received by the flight controller, and the stop command defined in the communication protocol makes the UAV hover as soon as the target object can no longer be detected. So by the time the delayed data is used by the flight controller for adjustment, the target object has already moved out of the UAV's field of view; the UAV then directly executes the stop command, causing it to hover before the pose adjustment has completed.
The above is only a preferred embodiment of the present invention. It should be noted that a person skilled in the art may make several modifications and improvements without departing from the concept of the present invention; these shall also be considered within the protection scope of the present invention, and none of them affect the effect of implementing the present invention or the practicability of the patent.

Claims (10)

1. a kind of unmanned plane vertical landing method based on the guidance of multiple polygonal shape mark, which comprises the following steps:
(1) multiple polygonal shape mark is set:
Global position system guidance unmanned plane enters drop zone overhead, and multiple polygonal shape mark is arranged in the drop zone of unmanned plane Will, successively nesting forms the regular polygon which is overlapped by three central points inside-out, and adjacent just more The color of side shape is different;
(2) unmanned plane pre-processes the drop zone image of shooting:
Unmanned plane shoots the drop zone for including multiple polygonal shape mark by Airborne Camera, and passes through airborne computer Mean filter is carried out to the image of shooting, the drop zone image after obtaining noise reduction;
(3) unmanned plane obtains the profile information of binaryzation marginal information image:
3a) unmanned plane carries out edge detection to drop zone image by airborne computer, obtains the binaryzation side of drop zone Edge information image;
3b) unmanned plane extracts the wheel in binaryzation marginal information image including inflection point number and inflection point coordinate by airborne computer Wide information simultaneously saves;
(4) The unmanned aerial vehicle screens the contour information of the binarized edge-information image:
Through the onboard computer, the unmanned aerial vehicle screens the extracted contour information, selecting the polygonal contours whose number of corner points equals the number of sides of the polygons in the composite polygon marker, and obtains the polygonal contours matching the three polygons in the composite polygon marker;
(5) The unmanned aerial vehicle calculates the pixel length and center-point coordinates of each polygonal contour:
From the corner-point coordinates of each polygonal contour, the onboard computer calculates the pixel length of the polygon sides and the pixel coordinates of the polygon center point;
(6) The unmanned aerial vehicle determines the contour combination mode and recognizes the composite polygon marker:
The unmanned aerial vehicle obtains its current altitude from the flight parameters of the flight controller, determines the polygonal contour combination according to the current altitude, and performs composite polygon marker recognition, obtaining the recognized polygonal contour combination;
(7) The unmanned aerial vehicle calculates its coordinates relative to the marker:
Using the side-length pixel lengths and center-point pixel coordinates of the polygonal contour combination in the binarized edge-information image, the onboard computer calculates the coordinates (X, Y, H) of the unmanned aerial vehicle relative to the marker; X and Y represent the plane coordinates of the unmanned aerial vehicle's current position relative to the marker, and H represents the current altitude of the unmanned aerial vehicle above the marker;
(8) Unmanned aerial vehicle landing control:
The unmanned aerial vehicle compares the relative coordinates obtained in step (7) with the coordinates of the center point of the polygonal contour combination to obtain the desired displacement, and passes the desired displacement to the flight controller; the flight controller obtains the rotor angular-velocity output values through PID control and carries out landing control of the unmanned aerial vehicle;
(9) Unmanned aerial vehicle landing adjustment:
The unmanned aerial vehicle first adjusts its pose horizontally until its center-point coordinates coincide, in the horizontal direction, with the center-point coordinates of the marker pattern, and then lands.
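The landing-control loop of steps (7) to (9) can be sketched as follows. This is a minimal illustration in Python; the PID gains, the 5 cm centring tolerance, and the interface to the flight controller are assumptions for demonstration, not taken from the claims.

```python
# Minimal sketch of the landing-control loop in steps (7)-(9).
# Gains, tolerance and the flight-controller interface are illustrative only.

class PID:
    """Single-axis PID controller producing a velocity command."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv


def landing_step(rel_x, rel_y, rel_h, pid_x, pid_y, pid_z, dt=0.05):
    """One control step: drive the horizontal offsets (X, Y) to zero,
    then command a descent once the vehicle is centred over the marker."""
    vx = pid_x.update(-rel_x, dt)          # desired displacement is the negated offset
    vy = pid_y.update(-rel_y, dt)
    centred = abs(rel_x) < 0.05 and abs(rel_y) < 0.05  # 5 cm tolerance (assumed)
    vz = pid_z.update(-rel_h, dt) if centred else 0.0  # descend only when centred
    return vx, vy, vz
```

In a real system the velocity commands would be converted by the flight controller into rotor angular-velocity outputs, as step (8) describes.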
2. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 1, wherein in the composite polygon marker described in step (1), the three regular polygons are identical or different in shape.
3. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 1 or 2, wherein in step (3) the unmanned aerial vehicle performs edge detection on the landing-zone image through the onboard computer using the Canny edge-detection method, implemented as follows:
3a1) Convert the landing-zone image into a grayscale image, and adjust the two parameters, the low threshold and the high threshold, according to the geometric features of the grayscale image to perform image segmentation;
3a2) Apply the Canny edge-detection operator to the segmented image to obtain an image containing only binarized edge information.
4. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 3, wherein the contour extraction described in step (4) is implemented as follows:
4a) According to the unmanned aerial vehicle imaging error affecting the precision of contour extraction, the contour-extraction accuracy value is the maximum of 0.004 times the polygon perimeter and 5 pixels;
4b) According to the contour-extraction accuracy value, the sets of closed edge line segments that fall within the accuracy value are taken as the contour information.
5. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 4, wherein the contour extraction described in step (4) yields closed-contour information, and contours whose area is less than M pixels, as well as concave contours, are removed, where 2 ≤ M ≤ 20.
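The screening rules of claims 4 and 5 (an approximation tolerance of max(0.004 × perimeter, 5) pixels, a minimum area of M pixels, and rejection of concave contours) can be sketched in pure Python; the helper names and the default M = 10 are illustrative, not the patent's.

```python
# Pure-Python sketch of the contour screening in claims 4 and 5.
# Helper names and the default minimum area are illustrative.

def approx_epsilon(perimeter):
    """Contour-approximation accuracy: max of 0.004 x perimeter and 5 pixels."""
    return max(0.004 * perimeter, 5.0)

def shoelace_area(pts):
    """Polygon area from corner points via the shoelace formula."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def is_convex(pts):
    """True if the cross products of consecutive edges all share one sign."""
    n = len(pts)
    signs = set()
    for i in range(n):
        ax, ay = pts[i]
        bx, by = pts[(i + 1) % n]
        cx, cy = pts[(i + 2) % n]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        if cross != 0:
            signs.add(cross > 0)
    return len(signs) <= 1

def keep_contour(pts, n_sides, min_area=10):
    """Keep a closed contour only if corner count, area and convexity pass."""
    return len(pts) == n_sides and shoelace_area(pts) >= min_area and is_convex(pts)
```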
6. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 1, wherein the polygonal-contour center-point pixel coordinates (x_o, y_o) described in step (7) are calculated by formulas (1) and (2):
x_c = M10 / M00   (1)
y_c = M01 / M00   (2)
where M00 = ∑x∑y I(x, y), M10 = ∑x∑y x·I(x, y), M01 = ∑x∑y y·I(x, y), I(x, y) is the pixel value at pixel coordinates (x, y) in the binary image of the current contour, x and y range over the region of the contour, and x_c, y_c are the calculated coordinates; (x_c, y_c) is taken as the contour center point (x_o, y_o).
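Formulas (1) and (2) compute the standard image-moment centroid; a pure-Python sketch over the contour's binary image:

```python
# Sketch of formulas (1) and (2): the image moments M00, M10, M01 over the
# contour's binary image give the centroid (x_c, y_c) = (M10/M00, M01/M00).

def contour_centroid(binary):
    """binary: 2-D list of pixel values I(x, y), indexed as binary[y][x]."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(binary):
        for x, val in enumerate(row):
            m00 += val          # M00 = sum of I(x, y)
            m10 += x * val      # M10 = sum of x * I(x, y)
            m01 += y * val      # M01 = sum of y * I(x, y)
    return m10 / m00, m01 / m00
```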
7. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 6, wherein determining the polygonal contour combination according to the current altitude and performing composite polygon marker recognition, as described in step (6), is carried out as follows: when the altitude is within the distance range in which the onboard computer can completely distinguish the two types of outer-layer polygons of the marker, the polygonal contours matching the two outer-layer polygon types of the marker are selected as the composite figure; when the altitude is within the distance range in which the onboard computer can completely recognize the two types of inner-layer polygons of the marker, the polygonal contours matching the two inner-layer polygon types of the marker are selected as the composite figure; when the composite figure matches the combination pattern of the marker, it is regarded as the composite polygon marker.
8. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 7, wherein the position coordinates of the current unmanned aerial vehicle described in step (7) are calculated as follows: the camera focal length is f, the pixel length of an inner polygon side in the composite figure captured by the camera is h, the actual length of the marker figure is L, the relative altitude is H, and the size-scaling ratio of the image is d; the relative altitude H is derived from calculation formula (3):
The horizontal pixel offset between the onboard-camera center-point coordinates (x_i, y_i) and the marker center coordinates (x_o, y_o) is x = x_i − x_o, and the vertical pixel offset is y = y_i − y_o; the actual horizontal relative coordinates (X, Y) of the unmanned aerial vehicle are derived from formula (4):
X = x × d;  Y = y × d   (4)
where d is the size-scaling ratio of the image; the camera focal length f is obtained by calibrating the camera, which yields the camera's horizontal focal length fx and vertical focal length fy; the calculation formula (5) for the camera focal length f is:
f = (fx + fy)/2   (5).
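One common pinhole-camera reading of claim 8 can be sketched as follows. It assumes d is the metres-per-pixel scale L/h at the marker, so that H = f × d with f in pixels; this is an interpretation offered for illustration, not necessarily the patent's exact formula (3).

```python
# A pinhole-camera reading of claim 8 (an interpretation for illustration,
# not necessarily the patent's exact formula (3)). Assumes d = L / h is the
# metres-per-pixel scale at the marker, so H = f * d with f in pixels.

def relative_pose(fx, fy, h_px, L_m, cam_cx, cam_cy, mark_cx, mark_cy):
    f = (fx + fy) / 2.0        # formula (5): mean of calibrated focal lengths
    d = L_m / h_px             # assumed scale ratio (metres per pixel)
    H = f * d                  # assumed form of the altitude formula (3)
    x = cam_cx - mark_cx       # horizontal pixel offset x = x_i - x_o
    y = cam_cy - mark_cy       # vertical pixel offset y = y_i - y_o
    return x * d, y * d, H     # formula (4): X = x * d, Y = y * d
```

For example, with fx = fy = 800 px and a 0.5 m marker side imaged at 100 px, d is 5 mm per pixel and the estimated altitude is 4 m.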
9. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 8, wherein the desired displacement described in step (8) is calculated from the coordinates of the pattern center point, where X and Y are the actual relative coordinate values of the unmanned aerial vehicle.
10. The unmanned aerial vehicle vertical landing method based on composite polygon marker guidance according to claim 1, wherein if screening does not yield a qualifying pattern, steps (2), (3) and (4) are repeated; if the marker pattern is not found, the unmanned aerial vehicle hovers in place for t seconds, 1 ≤ t ≤ 5, and performs marker detection; if no marker is found during the t seconds of hovering, the unmanned aerial vehicle climbs vertically and performs marker detection; the maximum climb altitude is the maximum distance at which the unmanned aerial vehicle can recognize the outer-layer polygons of the marker.
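The retry logic of claim 10 (repeat detection, hover for t seconds, then climb toward the maximum recognition altitude) can be sketched as follows; the timings, altitude step, and the detect/hover/climb interfaces are illustrative assumptions.

```python
# Sketch of the claim-10 retry logic: re-run detection, hover for t seconds
# (1 <= t <= 5), then climb toward the maximum recognition altitude.
# Timings, step size and the callback interfaces are illustrative.

def search_for_marker(detect, hover, climb, t=3, max_alt=20.0, alt_step=2.0, alt=5.0):
    """detect() -> bool; hover(seconds); climb(metres). Returns True if found."""
    for _ in range(int(t)):        # hover in place for up to t seconds
        if detect():
            return True
        hover(1)
    while alt < max_alt:           # then climb vertically until the outer
        climb(alt_step)            # polygons would no longer be resolvable
        alt += alt_step
        if detect():
            return True
    return False
```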
CN201910179487.7A 2019-03-11 2019-03-11 Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance Active CN109885086B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910179487.7A CN109885086B (en) 2019-03-11 2019-03-11 Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance

Publications (2)

Publication Number Publication Date
CN109885086A true CN109885086A (en) 2019-06-14
CN109885086B CN109885086B (en) 2022-09-23

Family

ID=66931566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910179487.7A Active CN109885086B (en) 2019-03-11 2019-03-11 Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance

Country Status (1)

Country Link
CN (1) CN109885086B (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110597282A (en) * 2019-09-05 2019-12-20 中国科学院长春光学精密机械与物理研究所 Unmanned aerial vehicle autonomous landing control system
CN110989687A (en) * 2019-11-08 2020-04-10 上海交通大学 Unmanned aerial vehicle landing method based on nested square visual information
CN111105394A (en) * 2019-11-27 2020-05-05 北京华捷艾米科技有限公司 Method and device for detecting characteristic information of luminous ball
CN111307157A (en) * 2020-03-12 2020-06-19 上海交通大学 Navigation information resolving method for unmanned aerial vehicle autonomous landing based on vision
CN112215860A (en) * 2020-09-23 2021-01-12 国网福建省电力有限公司漳州供电公司 Unmanned aerial vehicle positioning method based on image processing
CN112365673A (en) * 2020-11-12 2021-02-12 光谷技术股份公司 Forest fire monitoring system and method
CN112907574A (en) * 2021-03-25 2021-06-04 成都纵横自动化技术股份有限公司 Method, device and system for searching landing point of aircraft and storage medium
CN113949142A (en) * 2021-12-20 2022-01-18 广东科凯达智能机器人有限公司 Inspection robot autonomous charging method and system based on visual identification
CN114460970A (en) * 2022-02-22 2022-05-10 四川通信科研规划设计有限责任公司 Unmanned aerial vehicle warehouse positioning identification display method and unmanned aerial vehicle landing method
WO2022104746A1 (en) * 2020-11-20 2022-05-27 深圳市大疆创新科技有限公司 Return control method and device, unmanned aerial vehicle, and computer readable storage medium
CN115790610A (en) * 2023-02-06 2023-03-14 北京历正飞控科技有限公司 System and method for accurately positioning unmanned aerial vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102967305A (en) * 2012-10-26 2013-03-13 南京信息工程大学 Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square
CN104166854A (en) * 2014-08-03 2014-11-26 浙江大学 Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle
CN106054929A (en) * 2016-06-27 2016-10-26 西北工业大学 Unmanned aerial vehicle automatic landing guidance method based on optical flow
CN107194399A (en) * 2017-07-14 2017-09-22 广东工业大学 Visual calibration method, system and unmanned aerial vehicle
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 Unmanned aerial vehicle autonomous landing method based on visual guidance
CN110989687A (en) * 2019-11-08 2020-04-10 上海交通大学 Unmanned aerial vehicle landing method based on nested square visual information


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
刘全波 et al.: "Vision-based automatic landing and positioning algorithm for unmanned aerial vehicles", 《国防电子》 (National Defense Electronics) *
曹彦琛: "Design and implementation of a UAV landing control system based on PX4", 《中国优秀硕士学位论文全文数据库 工程科技Ⅱ辑》 (China Master's Theses Full-text Database, Engineering Science and Technology II) *
葛致磊 et al. (eds.): 《国防工业出版社》 (National Defense Industry Press), 31 March 2016 *


Also Published As

Publication number Publication date
CN109885086B (en) 2022-09-23

Similar Documents

Publication Publication Date Title
CN109885086A (en) 2019-06-14 Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance
US11693428B2 (en) Methods and system for autonomous landing
CN110991207B (en) Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN107240063B (en) Autonomous take-off and landing method of rotor unmanned aerial vehicle facing mobile platform
CN113597591B (en) Geographic benchmarking for unmanned aerial vehicle navigation
CN107194399B (en) Visual calibration method, system and unmanned aerial vehicle
CN108305264B (en) Unmanned aerial vehicle precision landing method based on image processing
CN107544550B (en) Unmanned aerial vehicle automatic landing method based on visual guidance
CN105318888B (en) Automatic driving vehicle paths planning method based on unmanned plane perception
CN104049641B (en) Automatic landing method and device, and aircraft
CN108153334B (en) Visual autonomous return and landing method and system for unmanned helicopter without cooperative target
Thurrowgood et al. A biologically inspired, vision‐based guidance system for automatic landing of a fixed‐wing aircraft
CN104298248A (en) Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle
CN106127201A (en) Unmanned aerial vehicle landing method based on visual positioning of the landing end
CN109460046B (en) Unmanned aerial vehicle natural landmark identification and autonomous landing method
De Croon et al. Sky segmentation approach to obstacle avoidance
CN109739257A (en) Patrol unmanned aerial vehicle closing method and system fusing satellite navigation and visual perception
CN106326892A (en) Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
CN107063261A (en) Multi-feature-information landmark detection method for precise unmanned aerial vehicle landing
CN104076817A (en) High-definition video aerial photography multimode sensor self-outer-sensing intelligent navigation system and method
CN109521781A (en) Unmanned aerial vehicle positioning system, unmanned aerial vehicle, and unmanned aerial vehicle positioning method
CN110068321B (en) UAV relative pose estimation method of fixed-point landing sign
CN109613926A (en) High-precision automatic landing-zone recognition method for automatic landing of multi-rotor unmanned aerial vehicles
CN113377118A (en) Multi-stage accurate landing method for unmanned aerial vehicle hangar based on vision
CN103697883A (en) Aircraft horizontal attitude determination method based on skyline imaging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant