CN109460046B - Unmanned aerial vehicle natural landmark identification and autonomous landing method - Google Patents


Info

Publication number
CN109460046B
CN109460046B (application CN201811213147.3A)
Authority
CN
China
Prior art keywords
image
unmanned aerial
aerial vehicle
landing
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811213147.3A
Other languages
Chinese (zh)
Other versions
CN109460046A (en)
Inventor
朱航
裴思宇
李宏泽
黄钰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University
Priority to CN201811213147.3A
Publication of CN109460046A
Application granted
Publication of CN109460046B
Legal status: Active

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/04 Control of altitude or depth
    • G05D1/06 Rate of change of altitude or depth
    • G05D1/0607 Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/48 Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757 Matching configurations of points or features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images

Abstract

A method for natural landmark recognition and autonomous landing of an unmanned aerial vehicle belongs to the technical field of machine vision navigation. A landing area is determined on a satellite digital map according to a given pre-landing coordinate, and the unmanned aerial vehicle shoots an aerial image at the pre-landing coordinate; the satellite digital map and the aerial image are each subjected to filtering, graying, binarization, edge feature extraction and Hough transformation to extract continuous geometric curves; the two images are matched with a weighted Hausdorff distance matching algorithm; the coordinates of the centroid of the area in the aerial image relative to the unmanned aerial vehicle are calculated according to Green's theorem, the spatial coordinates of the centroid are calculated from the projection relationship, and the unmanned aerial vehicle is guided to land autonomously at the spatial coordinates of the centroid. The method enables the unmanned aerial vehicle to autonomously identify the optimal landing point within a designated range and land accurately, makes up for the large autonomous-landing error under GPS navigation, and improves the safety and reliability of autonomous landing.

Description

Unmanned aerial vehicle natural landmark identification and autonomous landing method
Technical Field
The invention belongs to the technical field of machine vision navigation, and particularly relates to a natural landmark identification and autonomous landing method for an unmanned aerial vehicle.
Background
In recent years, with the development of micro inertial navigation systems, flight control systems, micro-electromechanical systems and novel materials, research on micro unmanned aerial vehicles has progressed greatly. Rotary-wing micro unmanned aerial vehicles offer good flexibility, compact structure, low cost and fast data acquisition, and their applications cover fields including but not limited to pesticide spraying, geological surveying, search and rescue, cargo transportation and mapping. Because human operators are limited in how quickly they can acquire information and respond, such tasks should be completed autonomously by the unmanned aerial vehicle as far as possible: automatic take-off and landing, path planning, obstacle avoidance, terrain-following flight and similar actions are realized through preset programs or the vehicle's own planning, while the accuracy and reliability of the operation are ensured.
For autonomous landing, most current unmanned aerial vehicles rely on GPS navigation: a GPS sensor on board records the geographic coordinate of the airframe at take-off, or a geographic coordinate is specified manually, and on landing the GPS positioning system guides the vehicle to hover over the recorded coordinate and descend. GPS navigation suffers from strong interference by non-air media and low positioning accuracy, so in remote areas, or areas with many obstructions, the unmanned aerial vehicle lands with a large autonomous-landing error and cannot complete the landing task accurately.
An autonomous landing method based on machine vision is one approach to overcoming inaccurate GPS positioning, and autonomous landing based on artificial landmarks is currently the most common for rotor unmanned aerial vehicles. As unmanned aerial vehicles are applied ever more widely, the requirements on their environmental adaptability also rise. Some tasks require the vehicle to land where artificial landmarks are impractical, or even to find a suitable landing site autonomously within a specified area, which requires the ability to recognize natural landmarks. Therefore, to provide accurate navigation information to the unmanned aerial vehicle and complete such autonomous landing tasks, a natural landmark identification and autonomous landing method is urgently needed.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle natural landmark identification and autonomous landing method based on machine vision and a satellite digital map, so as to solve the problems in the prior art.
The natural landmark identification and autonomous landing method for the unmanned aerial vehicle comprises the following steps:
1.1 According to the given pre-landing coordinate (X_0, Y_0, Z_0), determine a landing area P with a convex-polygon outline on a satellite digital map; first carry out filtering, graying and binarization processing on the image of area P, then further extract edge features, remove some noise points and retain the main region-based edge features, and finally extract a continuous geometric curve through Hough transformation to obtain contour curve I of area P on the satellite digital map and a reference image A, wherein the binarization processing adopts the maximum between-class variance method;
1.2 Fly the unmanned aerial vehicle to the airspace above the given pre-landing coordinate, carry out filtering, graying and binarization processing on the aerial image, further extract edge features, remove some noise points and retain the main region-based edge features, and finally extract a continuous geometric curve through Hough transformation to obtain contour curve II of area P in the aerial image and a measured image B, wherein the binarization processing also adopts the maximum between-class variance method;
1.3 Match the reference image A obtained in step 1.1 with the measured image B obtained in step 1.2, and confirm the landing area P in the aerial image of the unmanned aerial vehicle; the image matching adopts a weighted Hausdorff distance matching algorithm, comprising the following steps:
1.3.1 In the reference image A and the measured image B, carry out distance transformation of the feature point sets in two-dimensional space with the 3-4 DT algorithm to obtain image distance-transform matrices J_A and J_B;
1.3.2 Extract the branch points in reference image A and measured image B, and store them in matrices A and B respectively;
1.3.3 According to J_A, J_B and the matrices A and B, calculate the weighted Hausdorff distance:
$$H(A,B)=\max\left(h_{WHD}(A,B),\;h_{WHD}(B,A)\right)$$

$$h_{WHD}(A,B)=\frac{1}{N_a}\sum_{a\in A}d(a,B),\qquad h_{WHD}(B,A)=\frac{1}{N_b}\sum_{b\in B}d(b,A)$$
wherein: A. b is two point sets; n is a radical ofaIs the total number of feature points in point set a; a is a feature point belonging to A; d (a, B) is the distance from the characteristic point a on the point set A to the point set B; h isWHD(A, B) represents the directed distance from point set A to point set B; h isWHD(B, A) represents the directed distance from point set B to point set A;
the point with the minimum Hausdorff distance is the final matching point, so that the preliminary positioning information is obtained;
1.3.4, utilizing a least square algorithm to carry out one-to-one correspondence on all matching point pairs to obtain more accurate position information;
1.4 Establish a two-dimensional plane rectangular coordinate system with the unmanned aerial vehicle camera as the coordinate origin, and calculate the coordinates (x_c, y_c) of the centroid of area P in the aerial image relative to the unmanned aerial vehicle according to Green's theorem;
1.5 Calculate the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P from the projection relationship, specifically comprising:
1.5.1 calculating ground resolution GSD:
$$GSD=\frac{H\cdot p}{f}$$
wherein: GSD represents ground resolution (m); f is the focal length (mm) of the lens; p is the pixel size (mm) of the imaging sensor; h is the corresponding flight height (m) of the unmanned aerial vehicle;
1.5.2 Calculate the actual ground distance spanned by the image diagonal; from the image width w and height h, the ground distance L along the diagonal is:
$$L=GSD\cdot\sqrt{w^{2}+h^{2}}$$
wherein: GSD represents ground resolution (m); w is the image width; h is the image height;
1.5.3 From the longitude and latitude of the image center point and the distance and bearing angle of the centroid of area P relative to that center point, obtain the geographic coordinates of the centroid of area P:
$$E_c=R_j+\left(R_i-R_j\right)\cdot\frac{90-Lat_a}{90},\qquad E_d=E_c\cos\left(Lat_a\right)$$

$$Lon_c=Lon_a+\frac{D\sin\theta_0}{E_d}\cdot\frac{180}{\pi},\qquad Lat_c=Lat_a+\frac{D\cos\theta_0}{E_c}\cdot\frac{180}{\pi}$$
wherein: θ_0 ∈ (0, 2π) is the bearing angle; D is the ground distance of the centroid from the image center point; Lon_a is the longitude of the image center point; Lat_a is the latitude of the image center point; the equatorial radius R_i is taken as 6378137 m; the polar radius R_j is taken as 6356725 m;
1.5.4 Convert the geographic coordinates into space coordinates to obtain the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P:
$$\begin{aligned}X_c&=(N+H)\cos(Lat)\cos(Lon)\\Y_c&=(N+H)\cos(Lat)\sin(Lon)\\Z_c&=\left[N\left(1-e^{2}\right)+H\right]\sin(Lat)\end{aligned}$$
Wherein: n is the curvature radius; lon is longitude; lat is latitude; h is elevation;
1.6 The unmanned aerial vehicle flies to a position above the spatial coordinates (X_c, Y_c, Z_c) and lands in the vertical direction.
The method can ensure that the unmanned aerial vehicle autonomously identifies the optimal landing point within the designated range, accurately lands, can make up for the defect of large autonomous landing error under GPS navigation, and improves the safety and reliability of autonomous landing.
Drawings
FIG. 1 is a flowchart of a method for identifying natural landmarks and autonomously landing for unmanned aerial vehicles
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below.
Step one: according to the given pre-landing coordinate (X_0, Y_0, Z_0), determine a suitable landing area P (its outline is required to be a convex polygon) on a satellite digital map. First carry out filtering, graying and binarization processing on the image of area P, further extract edge features, remove some noise points and retain the main region-based edge features, and finally extract a continuous geometric curve through Hough transformation to obtain the contour curve of area P on the satellite digital map, yielding reference image A. The binarization adopts the maximum between-class variance method: suppose T is the chosen global threshold, dividing the pixels of the image into background and foreground, and let ω_0 and ω_1 respectively denote the proportions of pixels belonging to the background and the foreground in the whole image; then:
$$\omega_0(T)=\sum_{i=0}^{T}p(i),\qquad \omega_1(T)=\sum_{i=T+1}^{255}p(i)$$
wherein: p (i) represents the probability of the pixel with the pixel value i appearing in the image.
μ_0 and μ_1 respectively denote the mean gray values of the background and foreground pixels, and μ is the mean gray value of all pixels; then:
$$\mu_0(T)=\frac{1}{\omega_0(T)}\sum_{i=0}^{T}i\,p(i),\qquad \mu_1(T)=\frac{1}{\omega_1(T)}\sum_{i=T+1}^{255}i\,p(i)$$

$$\mu=\sum_{i=0}^{255}i\,p(i)=\omega_0(T)\,\mu_0(T)+\omega_1(T)\,\mu_1(T)$$
the variance σ between classes corresponding to the threshold2(T) is defined as:
σ2(T)=ω0(T)[μ0(T)-μ(T)]21(T)[μ1(T)-μ(T)]2=ω0(T)ω1(T)[μ0(T)-μ1(T)]2
and traversing each gray value, and finding out the threshold T corresponding to the maximum inter-class variance, namely the threshold.
Step two: fly the unmanned aerial vehicle to the airspace above the given pre-landing coordinate, carry out filtering, graying and binarization processing on the aerial image, further extract edge features, remove some noise points and retain the main region-based edge features, and finally extract a continuous geometric curve through Hough transformation to obtain the contour curve of area P in the aerial image, yielding measured image B. The binarization again adopts the maximum between-class variance method.
Step three: match the reference image A with the measured image B and confirm the landing area P in the aerial image of the unmanned aerial vehicle. The image matching adopts a weighted Hausdorff distance matching algorithm, with the following specific steps:
(1) In the reference image A and the measured image B, carry out distance transformation of the feature point sets in two-dimensional space with the 3-4 DT algorithm to obtain image distance-transform matrices J_A and J_B;
(2) Extract the branch points in reference image A and measured image B, and store them in matrices A and B respectively;
(3) According to J_A, J_B and the matrices A and B, calculate the weighted Hausdorff distance:
$$H(A,B)=\max\left(h_{WHD}(A,B),\;h_{WHD}(B,A)\right)$$

$$h_{WHD}(A,B)=\frac{1}{N_a}\sum_{a\in A}d(a,B),\qquad h_{WHD}(B,A)=\frac{1}{N_b}\sum_{b\in B}d(b,A)$$
wherein: A. b is two sets of points, NaIs the total number of feature points in the point set A, a is a feature point belonging to A, d (a, B) is the distance from the feature point a to the point set B on the point set A, hWHD(A,B)、hWHD(B, A) represent the directional distances from point set A to point set B and from point set B to point set A, respectively.
The point with the minimum Hausdorff distance is the final matching point to obtain preliminary positioning information.
(4) Put all matching point pairs into one-to-one correspondence with a least-squares algorithm to acquire more accurate position information.
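A concrete sketch of steps (1) and (3), under stated assumptions: feature and branch points are given as (row, col) integer arrays, the 3-4 DT is implemented as the classic two-pass chamfer scan, and the directed distance follows the N_a-normalized mean form of the formula above. The names `chamfer_34_dt`, `directed_whd` and `hausdorff_score` are illustrative, not from the patent.

```python
import numpy as np

def chamfer_34_dt(edge: np.ndarray) -> np.ndarray:
    """Two-pass 3-4 chamfer distance transform (the 3-4 DT of step (1)).

    edge: boolean edge map. Result approximates the Euclidean distance
    from every pixel to the nearest edge pixel.
    """
    INF = 1e9
    d = np.where(edge, 0.0, INF)
    h, w = d.shape
    for y in range(h):                       # forward pass, top-left to bottom-right
        for x in range(w):
            if y > 0:
                d[y, x] = min(d[y, x], d[y - 1, x] + 3)
                if x > 0:
                    d[y, x] = min(d[y, x], d[y - 1, x - 1] + 4)
                if x < w - 1:
                    d[y, x] = min(d[y, x], d[y - 1, x + 1] + 4)
            if x > 0:
                d[y, x] = min(d[y, x], d[y, x - 1] + 3)
    for y in range(h - 1, -1, -1):           # backward pass, mirrored neighbors
        for x in range(w - 1, -1, -1):
            if y < h - 1:
                d[y, x] = min(d[y, x], d[y + 1, x] + 3)
                if x > 0:
                    d[y, x] = min(d[y, x], d[y + 1, x - 1] + 4)
                if x < w - 1:
                    d[y, x] = min(d[y, x], d[y + 1, x + 1] + 4)
            if x < w - 1:
                d[y, x] = min(d[y, x], d[y, x + 1] + 3)
    return d / 3.0                           # rescale so a unit step is ~1 pixel

def directed_whd(pts_a: np.ndarray, dt_b: np.ndarray) -> float:
    """h_WHD(A,B): mean distance from A's feature points to point set B,
    read directly from B's distance-transform image."""
    return float(np.mean(dt_b[pts_a[:, 0], pts_a[:, 1]]))

def hausdorff_score(pts_a, dt_a, pts_b, dt_b) -> float:
    """H(A,B) = max(h_WHD(A,B), h_WHD(B,A)), as in step (3)."""
    return max(directed_whd(pts_a, dt_b), directed_whd(pts_b, dt_a))
```

In matching, reference template A is evaluated at candidate offsets across the measured image's distance transform, and the offset minimizing H(A,B) is taken as the preliminary match, which step (4) then refines by least squares.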
Step four: establish a two-dimensional plane rectangular coordinate system with the unmanned aerial vehicle camera as the coordinate origin, and calculate the coordinates (x_c, y_c) of the centroid of area P in the aerial image relative to the unmanned aerial vehicle.
According to Green's theorem, integrating along the closed contour of region P:

$$x_c=\frac{1}{2A}\oint_{\partial P}x^{2}\,dy,\qquad y_c=-\frac{1}{2A}\oint_{\partial P}y^{2}\,dx,\qquad A=\oint_{\partial P}x\,dy$$

After discretization over the n ordered contour points (x_i, y_i), the above formulas become:

$$A=\frac{1}{2}\sum_{i=0}^{n-1}\left(x_i\,y_{i+1}-x_{i+1}\,y_i\right)$$

$$x_c=\frac{1}{6A}\sum_{i=0}^{n-1}\left(x_i+x_{i+1}\right)\left(x_i\,y_{i+1}-x_{i+1}\,y_i\right),\qquad y_c=\frac{1}{6A}\sum_{i=0}^{n-1}\left(y_i+y_{i+1}\right)\left(x_i\,y_{i+1}-x_{i+1}\,y_i\right)$$
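The discretized formulas map directly onto the ordered contour points of region P; a minimal sketch (function name ours) assuming a non-degenerate, implicitly closed contour:

```python
import numpy as np

def polygon_centroid(vertices: np.ndarray) -> tuple[float, float]:
    """Centroid (xc, yc) of a closed polygon via the discretized
    Green's-theorem (shoelace) formulas above.

    vertices: (n, 2) array of contour points in order; the last vertex
    is implicitly connected back to the first. Area must be non-zero.
    """
    x, y = vertices[:, 0], vertices[:, 1]
    x1, y1 = np.roll(x, -1), np.roll(y, -1)   # shifted copies: x_{i+1}, y_{i+1}
    cross = x * y1 - x1 * y                   # x_i*y_{i+1} - x_{i+1}*y_i
    area = 0.5 * cross.sum()                  # signed area A of region P
    xc = ((x + x1) * cross).sum() / (6.0 * area)
    yc = ((y + y1) * cross).sum() / (6.0 * area)
    return xc, yc
```

The resulting (x_c, y_c) is a pixel offset from the camera-centered origin, which step five then converts to ground coordinates.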
Step five: calculate the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P according to the projection relationship:
(1) Calculate the ground resolution:

$$GSD=\frac{H\cdot p}{f}$$
wherein: GSD represents the ground resolution (m), f is the lens focal length (mm), p is the pixel size of the imaging sensor (mm), and H is the corresponding flight height of the unmanned aerial vehicle (m).
(2) Calculate the actual ground distance spanned by the image diagonal; from the image width w and height h, the ground distance along the diagonal is:

$$L=GSD\cdot\sqrt{w^{2}+h^{2}}$$
(3) From the longitude and latitude of the image center point and the distance and bearing angle of the centroid of area P relative to that center point, obtain the geographic coordinates of the centroid of area P:
$$E_c=R_j+\left(R_i-R_j\right)\cdot\frac{90-Lat_a}{90},\qquad E_d=E_c\cos\left(Lat_a\right)$$

$$Lon_c=Lon_a+\frac{D\sin\theta_0}{E_d}\cdot\frac{180}{\pi},\qquad Lat_c=Lat_a+\frac{D\cos\theta_0}{E_c}\cdot\frac{180}{\pi}$$
wherein: θ_0 ∈ (0, 2π) is the bearing angle, D is the ground distance of the centroid from the image center point, Lon_a and Lat_a are the longitude and latitude of the image center point, the equatorial radius R_i is taken as 6378137 m, and the polar radius R_j is taken as 6356725 m.
(4) Convert the geographic coordinates to the space coordinate system:
$$\begin{aligned}X_c&=(N+H)\cos(Lat)\cos(Lon)\\Y_c&=(N+H)\cos(Lat)\sin(Lon)\\Z_c&=\left[N\left(1-e^{2}\right)+H\right]\sin(Lat)\end{aligned}$$
Wherein: n is curvature radius, Lon, Lat and H are longitude, latitude and elevation respectively, and space coordinates (X) of the centroid of the area P is obtainedc,Yc,Zc)。
Step six: the unmanned aerial vehicle flies to a position above the spatial coordinates (X_c, Y_c, Z_c), hovers, and lands in the vertical direction.

Claims (1)

1. A natural landmark identification and autonomous landing method for an unmanned aerial vehicle is characterized by comprising the following steps:
1.1 According to the given pre-landing coordinate (X_0, Y_0, Z_0), determine a landing area P with a convex-polygon outline on a satellite digital map; perform filtering, graying and binarization processing on the image of area P, further extract edge features, remove some noise points and retain the main region-based edge features, and finally extract a continuous geometric curve through Hough transformation to obtain contour curve I of area P on the satellite digital map and a reference image A, wherein the binarization processing adopts the maximum between-class variance method;
1.2 Fly the unmanned aerial vehicle to the airspace above the given pre-landing coordinate, carry out filtering, graying and binarization processing on the aerial image, further extract edge features, remove some noise points and retain the main region-based edge features, and finally extract a continuous geometric curve through Hough transformation to obtain contour curve II of area P in the aerial image and a measured image B, wherein the binarization processing also adopts the maximum between-class variance method;
1.3 Match the reference image A obtained in step 1.1 with the measured image B obtained in step 1.2, and confirm the landing area P in the aerial image of the unmanned aerial vehicle; the image matching adopts a weighted Hausdorff distance matching algorithm, comprising the following steps:
1.3.1 In the reference image A and the measured image B, carry out distance transformation of the feature point sets in two-dimensional space with the 3-4 DT algorithm to obtain image distance-transform matrices J_A and J_B;
1.3.2 Extract the branch points in reference image A and measured image B, and store them in matrices A and B respectively;
1.3.3 According to J_A, J_B and the matrices A and B, calculate the weighted Hausdorff distance:
$$H(A,B)=\max\left(h_{WHD}(A,B),\;h_{WHD}(B,A)\right)$$

$$h_{WHD}(A,B)=\frac{1}{N_a}\sum_{a\in A}d(a,B),\qquad h_{WHD}(B,A)=\frac{1}{N_b}\sum_{b\in B}d(b,A)$$
wherein: A. b is two point sets; n is a radical ofaThe total number of the characteristic points in the point set A is; a is a feature point belonging to A; d (a, B) is the distance from the characteristic point a on the point set A to the point set B; h isWHD(A, B) represents the directed distance from point set A to point set B; h isWHD(B, A) represents the directed distance from point set B to point set A;
the point with the minimum Hausdorff distance is the final matching point, so that the preliminary positioning information is obtained;
1.3.4, utilizing a least square algorithm to carry out one-to-one correspondence on all matching point pairs to obtain more accurate position information;
1.4 Establish a two-dimensional plane rectangular coordinate system with the unmanned aerial vehicle camera as the coordinate origin, and calculate the coordinates (x_c, y_c) of the centroid of area P in the aerial image relative to the unmanned aerial vehicle according to Green's theorem;
1.5 Calculate the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P according to the projection relationship, specifically comprising:
1.5.1 calculating ground resolution GSD:
$$GSD=\frac{H\cdot p}{f}$$
wherein: GSD represents ground resolution (m); f is the focal length (mm) of the lens; p is the pixel size (mm) of the imaging sensor; h is the corresponding flight height (m) of the unmanned aerial vehicle;
1.5.2 Calculate the actual ground distance spanned by the image diagonal; from the image width w and height h, the ground distance L along the diagonal is:
$$L=GSD\cdot\sqrt{w^{2}+h^{2}}$$
wherein: GSD represents ground resolution (m); w is the image width; h is the image height;
1.5.3 From the longitude and latitude of the image center point and the distance and bearing angle of the centroid of area P relative to that center point, obtain the geographic coordinates of the centroid of area P:
$$E_c=R_j+\left(R_i-R_j\right)\cdot\frac{90-Lat_a}{90},\qquad E_d=E_c\cos\left(Lat_a\right)$$

$$Lon_c=Lon_a+\frac{D\sin\theta_0}{E_d}\cdot\frac{180}{\pi},\qquad Lat_c=Lat_a+\frac{D\cos\theta_0}{E_c}\cdot\frac{180}{\pi}$$
wherein: θ_0 ∈ (0, 2π) is the bearing angle; D is the ground distance of the centroid from the image center point; Lon_a is the longitude of the image center point; Lat_a is the latitude of the image center point; the equatorial radius R_i is taken as 6378137 m; the polar radius R_j is taken as 6356725 m;
1.5.4 Convert the geographic coordinates into space coordinates to obtain the spatial coordinates (X_c, Y_c, Z_c) of the centroid of area P:
$$\begin{aligned}X_c&=(N+H)\cos(Lat)\cos(Lon)\\Y_c&=(N+H)\cos(Lat)\sin(Lon)\\Z_c&=\left[N\left(1-e^{2}\right)+H\right]\sin(Lat)\end{aligned}$$
Wherein: n is the curvature radius; lon is longitude; lat is latitude; h is elevation;
1.6 The unmanned aerial vehicle flies to a position above the spatial coordinates (X_c, Y_c, Z_c) and lands in the vertical direction.
CN201811213147.3A 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method Active CN109460046B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811213147.3A CN109460046B (en) 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811213147.3A CN109460046B (en) 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method

Publications (2)

Publication Number Publication Date
CN109460046A CN109460046A (en) 2019-03-12
CN109460046B (en) 2021-08-06

Family

ID=65607782

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811213147.3A Active CN109460046B (en) 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method

Country Status (1)

Country Link
CN (1) CN109460046B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110968112B (en) * 2019-12-12 2023-08-01 哈尔滨工程大学 Unmanned aerial vehicle autonomous landing method based on monocular vision
CN111324145B (en) * 2020-02-28 2022-08-16 厦门理工学院 Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium
CN111626260A (en) * 2020-06-05 2020-09-04 贵州省草业研究所 Aerial photo ground object feature point extraction method based on unmanned aerial vehicle remote sensing technology
CN112419374B (en) * 2020-11-11 2022-12-27 北京航空航天大学 Unmanned aerial vehicle positioning method based on image registration
CN115526896A (en) * 2021-07-19 2022-12-27 中核利华消防工程有限公司 Fire prevention and control method and device, electronic equipment and readable storage medium
CN114998773B (en) * 2022-08-08 2023-02-17 四川腾盾科技有限公司 Characteristic mismatching elimination method and system suitable for aerial image of unmanned aerial vehicle system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100095665A (en) * 2009-02-12 2010-09-01 한양대학교 산학협력단 Automatic landing method, landing apparatus of scanning probe microscope and scanning probe microscope using the same
CN103424126A (en) * 2013-08-12 2013-12-04 西安电子科技大学 System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle
CN105000194A (en) * 2015-08-13 2015-10-28 史彩成 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark
CN105550994A (en) * 2016-01-26 2016-05-04 河海大学 Satellite image based unmanned aerial vehicle image rapid and approximate splicing method
CN107063261A (en) * 2017-03-29 2017-08-18 东北大学 The multicharacteristic information terrestrial reference detection method precisely landed for unmanned plane

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Vision-based landmark recognition method for autonomous landing of UAVs; 李宇 et al.; Application Research of Computers; 2012-07-31; Vol. 29, No. 7, pp. 2780-2783 *
Design and research of a novel landmark for autonomous landing of UAVs; 陈勇 et al.; Journal of University of Electronic Science and Technology of China; 2016-11-30; Vol. 45, No. 6, pp. 934-938 *

Also Published As

Publication number Publication date
CN109460046A (en) 2019-03-12

Similar Documents

Publication Publication Date Title
CN109460046B (en) Unmanned aerial vehicle natural landmark identification and autonomous landing method
US20200344464A1 (en) Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Defects
Vallet et al. Photogrammetric performance of an ultra light weight swinglet UAV
US20190068829A1 (en) Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Obstructions
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
CN109885086B (en) Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance
Hebel et al. Simultaneous calibration of ALS systems and alignment of multiview LiDAR scans of urban areas
CN106054929A (en) Unmanned plane automatic landing guiding method based on optical flow
Hosseinpoor et al. Pricise target geolocation and tracking based on UAV video imagery
CN111492326A (en) Image-based positioning for unmanned aerial vehicles and related systems and methods
JP2015006874A (en) Systems and methods for autonomous landing using three dimensional evidence grid
US20220074744A1 (en) Unmanned Aerial Vehicle Control Point Selection System
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
CN110570463B (en) Target state estimation method and device and unmanned aerial vehicle
CN111024072B (en) Satellite map aided navigation positioning method based on deep learning
Bao et al. Vision-based horizon extraction for micro air vehicle flight control
CN114089787A (en) Ground three-dimensional semantic map based on multi-machine cooperative flight and construction method thereof
CN114815871A (en) Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform
CN114077249B (en) Operation method, operation equipment, device and storage medium
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN112119428A (en) Method, device, unmanned aerial vehicle, system and storage medium for acquiring landing position
KR102289752B1 (en) A drone for performring route flight in gps blocked area and methed therefor
Kamat et al. A survey on autonomous navigation techniques
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system
CN111089580B (en) Unmanned war chariot simultaneous positioning and map construction method based on covariance intersection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant