CN109460046A - Unmanned aerial vehicle natural landmark identification and autonomous landing method - Google Patents

Unmanned aerial vehicle natural landmark identification and autonomous landing method

Info

Publication number
CN109460046A
CN109460046A (application CN201811213147.3A)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
point
coordinate
image
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811213147.3A
Other languages
Chinese (zh)
Other versions
CN109460046B (en)
Inventor
朱航
裴思宇
李宏泽
黄钰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University filed Critical Jilin University
Priority to CN201811213147.3A priority Critical patent/CN109460046B/en
Publication of CN109460046A publication Critical patent/CN109460046A/en
Application granted granted Critical
Publication of CN109460046B publication Critical patent/CN109460046B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/04Control of altitude or depth
    • G05D1/06Rate of change of altitude or depth
    • G05D1/0607Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/48Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Astronomy & Astrophysics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A UAV natural landmark identification and autonomous landing method belongs to the technical field of machine vision navigation. The method determines a landing region on a satellite digital map from a given pre-landing coordinate, uses the UAV to capture an aerial image above that coordinate, and applies filtering, graying, binarization, edge feature extraction and the Hough transform to both the satellite digital map and the aerial image to extract continuous geometric curves, which are matched with a weighted Hausdorff distance matching algorithm. The coordinate of the region centroid relative to the UAV in the aerial image is computed by Green's theorem, the space coordinate of the region centroid is computed from the projection relation, and the UAV is guided to land autonomously at that space coordinate. The invention enables a UAV to autonomously identify the best landing point within a specified range and land precisely, compensating for the large autonomous-landing error of GPS navigation and improving the safety and reliability of autonomous landing.

Description

Unmanned aerial vehicle natural landmark identification and autonomous landing method
Technical field
The invention belongs to the technical field of machine vision navigation, and in particular relates to a UAV natural landmark identification and autonomous landing method.
Background technique
In recent years, with the development of micro inertial navigation systems, flight control systems, MEMS and new materials, research on miniature UAVs has made great progress. Rotary-wing micro UAVs in particular are flexible, compact, low-cost and quick to acquire data, and their applications now cover many fields, including but not limited to pesticide spraying, geological exploration, search and rescue, cargo transport, and surveying and mapping. Because human reaction speed and working efficiency are limited, these tasks should as far as possible be completed autonomously by the UAV: autonomous landing, path planning, obstacle avoidance, terrain-following flight and similar actions are realized through preset procedures or the UAV's own planning, guaranteeing the accuracy and reliability of the operation.
For UAV autonomous landing, the most common approach at present is based on GPS navigation: the UAV's onboard GPS sensor records the geographic coordinate of the airframe at takeoff, or a geographic coordinate is specified manually, and at landing time the GPS positioning system guides the UAV to hover above the recorded coordinate and descend. However, GPS navigation is strongly disturbed by non-air media and has low positioning accuracy, so the autonomous landing error is large in remote areas or areas with many obstructions, and the landing task cannot be completed accurately.
Machine-vision-based autonomous landing is one way to overcome the positioning inaccuracy of GPS. The methods currently applied to rotary-wing UAVs mostly rely on artificial landmarks. As UAV applications broaden, ever higher environmental adaptability is required: some missions require the UAV to land where an artificial landmark cannot be placed, or even to find a suitable landing site within a specified region on its own, which requires the ability to identify natural landmarks. Therefore, to provide the UAV with accurate navigation information and complete such autonomous landing tasks, a UAV natural landmark identification and autonomous landing method is needed.
Summary of the invention
The object of the invention is to address the above problems in the prior art by proposing a UAV natural landmark identification and autonomous landing method based on machine vision and a satellite digital map. A suitable landing region is found near a given pre-landing coordinate on the satellite digital map, the aerial image from the UAV's monocular vision system is matched against the satellite digital map, and the best landing point coordinate within the landing region is computed by processing the aerial image, eliminating the error of GPS navigation and achieving a precise landing.
The UAV natural landmark identification and autonomous landing method of the invention includes the following steps:
1.1 According to the given pre-landing coordinate (X0, Y0, Z0), determine on the satellite digital map a landing region P whose contour is a convex polygon. First apply filtering, graying and binarization to the image of region P, then extract edge features, reject scattered noise points, and retain the dominant edge features of the region; finally extract continuous geometric curves by the Hough transform, obtaining contour curve I of region P on the satellite digital map and the reference image A. The binarization uses the maximum between-class variance (Otsu) method;
1.2 Fly the UAV above the given pre-landing coordinate and capture an aerial image. Apply filtering, graying and binarization to the aerial image, then extract edge features, reject scattered noise points, and retain the dominant edge features of the region; finally extract continuous geometric curves by the Hough transform, obtaining contour curve II of region P in the aerial image and the measured image B. The binarization likewise uses the maximum between-class variance method;
1.3 Match the reference image A obtained in step 1.1 against the measured image B obtained in step 1.2 to confirm the landing region P in the UAV's aerial image. Image matching uses a weighted Hausdorff distance matching algorithm, with the following steps:
1.3.1 In reference image A and measured image B, apply the 3-4 distance transform (3-4DT) to the feature point sets in two-dimensional space, obtaining the image distance transform matrices J_A and J_B;
1.3.2 Extract the branch points in reference image A and measured image B and store them in matrices A and B respectively;
1.3.3 Compute the weighted Hausdorff distance from J_A, J_B, A and B:
h_WHD(A, B) = (1/N_a) · Σ_{a∈A} d(a, B)
H(A, B) = max(h_WHD(A, B), h_WHD(B, A))
where A and B are the two point sets; N_a is the number of feature points in point set A; a is a feature point belonging to A; d(a, B) is the distance from feature point a of point set A to point set B; h_WHD(A, B) is the directed distance from point set A to point set B; and h_WHD(B, A) is the directed distance from point set B to point set A;
The point with the minimum Hausdorff distance is the final match point, which yields preliminary position information;
1.3.4 Put all matched point pairs into one-to-one correspondence using the least-squares algorithm to obtain more accurate position information;
1.4 Establish a two-dimensional plane rectangular coordinate system with the UAV camera as the origin, and compute by Green's theorem the coordinate (x_c, y_c) of the centroid of region P in the aerial image relative to the UAV;
1.5 Compute the coordinate (X_c, Y_c, Z_c) of the centroid of region P from the projection relation, specifically:
1.5.1 Compute the ground sample distance GSD:
GSD = p · H / f
where GSD is the ground sample distance (m); f is the lens focal length (mm); p is the pixel size of the image sensor (mm); and H is the UAV's flying height (m);
1.5.2 Compute the true ground distance along the image diagonal. From the image width w and height h, the ground distance L along the image diagonal is
L = GSD · √(w² + h²)
where GSD is the ground sample distance (m); w is the image width; and h is the image height;
1.5.3 From the longitude and latitude of the image center point and the distance D and bearing θ0 of the centroid of region P relative to the center point, obtain the geographic coordinate of the centroid of region P:
Lat_c = Lat_a + (D · cos θ0 / R_j) · (180/π)
Lon_c = Lon_a + (D · sin θ0 / (R_i · cos Lat_a)) · (180/π)
where θ0 ∈ (0, 2π); Lon_a is the longitude of the image center point; Lat_a is the latitude of the image center point; R_i is the equatorial radius, taken as 6378137 m; and R_j is the polar radius, taken as 6356725 m;
1.5.4 Convert the geographic coordinate to a space coordinate, obtaining the space coordinate (X_c, Y_c, Z_c) of the centroid of region P:
X_c = (N + H) · cos Lat · cos Lon
Y_c = (N + H) · cos Lat · sin Lon
Z_c = [N(1 − e²) + H] · sin Lat
where N is the radius of curvature in the prime vertical; e is the first eccentricity of the reference ellipsoid; Lon is the longitude; Lat is the latitude; and H is the elevation;
1.6 Fly the UAV above the space coordinate (X_c, Y_c, Z_c) and land vertically.
The invention enables a UAV to autonomously identify the best landing point within a specified range and land precisely, compensating for the large autonomous-landing error of GPS navigation and improving the safety and reliability of autonomous landing.
Detailed description of the invention
Fig. 1 is a flow chart of the UAV natural landmark identification and autonomous landing method.
Specific embodiment
To make the objectives, technical solutions and advantages of the invention clearer, the invention is described in further detail below.
Step 1: According to the given pre-landing coordinate (X0, Y0, Z0), determine a suitable landing region P on the satellite digital map (the region contour is required to be a convex polygon). First apply filtering, graying and binarization to the image of region P, then extract edge features, reject scattered noise points, and retain the dominant edge features of the region; finally extract continuous geometric curves by the Hough transform to obtain the contour curve of region P on the satellite digital map, yielding the reference image A. The binarization uses the maximum between-class variance (Otsu) method. Let T be the chosen global threshold dividing the pixels of the image into background and foreground, and let ω0 and ω1 be the proportions of the image belonging to the background and the foreground respectively:
ω0(T) = Σ_{i=0}^{T−1} p(i),  ω1(T) = Σ_{i=T}^{255} p(i)
where p(i) is the probability that a pixel of value i occurs in the image.
Let μ0 and μ1 be the mean pixel values of the background and foreground pixels, and μ the mean pixel value of all pixels:
μ0(T) = Σ_{i=0}^{T−1} i · p(i) / ω0(T),  μ1(T) = Σ_{i=T}^{255} i · p(i) / ω1(T),  μ = Σ_{i=0}^{255} i · p(i)
The between-class variance σ²(T) corresponding to threshold T is defined as:
σ²(T) = ω0(T)[μ0(T) − μ(T)]² + ω1(T)[μ1(T) − μ(T)]² = ω0(T) · ω1(T) · [μ0(T) − μ1(T)]²
Traverse every gray value and find the threshold T that maximizes the between-class variance; this is the required threshold.
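The threshold search described above can be sketched in a few lines of Python (a minimal NumPy illustration on a made-up toy image, not the patent's implementation):

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the threshold T maximizing the between-class variance
    sigma^2(T) = w0 * w1 * (mu0 - mu1)^2 over 8-bit gray levels."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                      # p(i): probability of gray level i
    best_t, best_var = 0, -1.0
    for t in range(1, 256):                    # pixels < t count as background
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2       # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Toy image with two clearly separated gray populations (an assumption for illustration).
img = np.array([[10, 12, 11, 200],
                [13, 11, 210, 205],
                [12, 220, 215, 208]], dtype=np.uint8)
T = otsu_threshold(img)
binary = (img >= T).astype(np.uint8)
```

Any threshold between the two populations maximizes the variance, so the search lands between the dark and bright clusters.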
Step 2: Fly the UAV to the vicinity of the given pre-landing coordinate and capture an aerial image. Apply filtering, graying and binarization to the aerial image, then extract edge features, reject scattered noise points, and retain the dominant edge features of the region; finally extract continuous geometric curves by the Hough transform to obtain the contour curve of region P in the aerial image, yielding the measured image B. The binarization likewise uses the maximum between-class variance method.
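As an illustration of the Hough step, the sketch below builds a bare (rho, theta) accumulator over a synthetic edge map; the edge map is a toy assumption, and a real pipeline would use an optimized implementation such as OpenCV's:

```python
import numpy as np

def hough_lines(edges: np.ndarray, n_theta: int = 180):
    """Minimal Hough transform: vote each edge pixel into (rho, theta) bins.
    Returns the accumulator, the theta axis, and the rho offset."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))               # max possible |rho|
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta), shifted by diag to stay non-negative
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    return acc, thetas, diag

# Synthetic edge map containing a single horizontal line y = 3.
edges = np.zeros((10, 10), dtype=np.uint8)
edges[3, :] = 1
acc, thetas, diag = hough_lines(edges)
rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
```

All ten edge pixels vote into the bin for theta near 90 degrees and rho = 3, i.e. the line y = 3.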
Step 3: Match the reference image A against the measured image B to confirm the landing region P in the UAV's aerial image. Image matching uses a weighted Hausdorff distance matching algorithm, as follows:
(1) In reference image A and measured image B, apply the 3-4 distance transform (3-4DT) to the feature point sets in two-dimensional space, obtaining the image distance transform matrices J_A and J_B;
(2) Extract the branch points in reference image A and measured image B and store them in matrices A and B respectively;
(3) Compute the weighted Hausdorff distance from J_A, J_B, A and B:
h_WHD(A, B) = (1/N_a) · Σ_{a∈A} d(a, B)
H(A, B) = max(h_WHD(A, B), h_WHD(B, A))
where A and B are the two point sets, N_a is the number of feature points in point set A, a is a feature point belonging to A, d(a, B) is the distance from feature point a of point set A to point set B, and h_WHD(A, B) and h_WHD(B, A) are the directed distances from point set A to point set B and from point set B to point set A respectively.
The point with the minimum Hausdorff distance is the final match point, which yields preliminary position information.
(4) Put all matched point pairs into one-to-one correspondence using the least-squares algorithm to obtain more accurate position information.
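The weighted Hausdorff distance of step (3) can be illustrated as follows; for clarity this sketch computes d(a, B) by brute force rather than by the 3-4DT lookup the method uses, and the two point sets are made up for the example:

```python
import numpy as np

def directed_whd(A: np.ndarray, B: np.ndarray) -> float:
    """h_WHD(A, B) = (1/N_a) * sum_{a in A} d(a, B): the mean distance from
    each point of A to its nearest point in B."""
    # Pairwise Euclidean distances, shape (N_a, N_b).
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

def weighted_hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """H(A, B) = max(h_WHD(A, B), h_WHD(B, A))."""
    return max(directed_whd(A, B), directed_whd(B, A))

# Hypothetical point sets: B has one extra point 3 units from its nearest A point.
A = np.array([[0.0, 0.0], [1.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0], [4.0, 0.0]])
H = weighted_hausdorff(A, B)
```

Here h_WHD(A, B) = 0 (every A point coincides with a B point) while h_WHD(B, A) = (0 + 0 + 3)/3 = 1, so H = 1.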
Step 4: Establish a two-dimensional plane rectangular coordinate system with the UAV camera as the origin, and compute the coordinate (x_c, y_c) of the centroid of region P in the aerial image relative to the UAV.
By Green's theorem, integrating along the closed contour of region P:
A = (1/2) ∮ (x dy − y dx),  x_c = (1/(2A)) ∮ x² dy,  y_c = −(1/(2A)) ∮ y² dx
After discretization over the contour vertices (x_i, y_i), the above become:
A = (1/2) Σ_i (x_i · y_{i+1} − x_{i+1} · y_i)
x_c = (1/(6A)) Σ_i (x_i + x_{i+1})(x_i · y_{i+1} − x_{i+1} · y_i)
y_c = (1/(6A)) Σ_i (y_i + y_{i+1})(x_i · y_{i+1} − x_{i+1} · y_i)
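The discretized Green's-theorem centroid is the standard shoelace centroid; a minimal sketch (the square test polygon is an assumption for illustration):

```python
def polygon_centroid(pts):
    """Centroid of a simple polygon via the discretized Green's theorem
    (shoelace) formulas:
      A  = 1/2 * sum(x_i*y_{i+1} - x_{i+1}*y_i)
      xc = 1/(6A) * sum((x_i + x_{i+1}) * (x_i*y_{i+1} - x_{i+1}*y_i))
      yc = 1/(6A) * sum((y_i + y_{i+1}) * (x_i*y_{i+1} - x_{i+1}*y_i))"""
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]       # wrap around to close the contour
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6.0 * a), cy / (6.0 * a)

# A 2x2 square with one corner at the origin: centroid should be (1, 1).
xc, yc = polygon_centroid([(0, 0), (2, 0), (2, 2), (0, 2)])
```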
Step 5: Compute the coordinate (X_c, Y_c, Z_c) of the centroid of region P from the projection relation:
(1) Compute the ground sample distance:
GSD = p · H / f
where GSD is the ground sample distance (m), f is the lens focal length (mm), p is the pixel size of the image sensor (mm), and H is the UAV's flying height (m).
(2) Compute the true ground distance along the image diagonal. From the image width w and height h, the ground distance along the image diagonal is:
L = GSD · √(w² + h²)
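Steps (1) and (2) reduce to two one-line formulas; a sketch with hypothetical camera parameters (the 0.005 mm pixel size, 25 mm lens and 100 m altitude are assumptions for illustration):

```python
import math

def ground_sample_distance(pixel_mm: float, focal_mm: float, height_m: float) -> float:
    """GSD = p * H / f : metres of ground covered by one pixel."""
    return pixel_mm * height_m / focal_mm

def diagonal_ground_distance(gsd_m: float, w_px: int, h_px: int) -> float:
    """L = GSD * sqrt(w^2 + h^2): ground length of the image diagonal."""
    return gsd_m * math.hypot(w_px, h_px)

gsd = ground_sample_distance(0.005, 25.0, 100.0)   # 0.005 * 100 / 25 = 0.02 m/pixel
L = diagonal_ground_distance(gsd, 4000, 3000)      # 0.02 * 5000 = 100 m
```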
(3) From the longitude and latitude of the image center point and the distance D and bearing θ0 of the centroid of region P relative to the center point, obtain the geographic coordinate of the centroid:
Lat_c = Lat_a + (D · cos θ0 / R_j) · (180/π)
Lon_c = Lon_a + (D · sin θ0 / (R_i · cos Lat_a)) · (180/π)
where θ0 ∈ (0, 2π), Lon_a and Lat_a are the longitude and latitude of the image center point, R_i is the equatorial radius (6378137 m), and R_j is the polar radius (6356725 m).
(4) Convert the geographic coordinate to a space coordinate:
X_c = (N + H) · cos Lat · cos Lon
Y_c = (N + H) · cos Lat · sin Lon
Z_c = [N(1 − e²) + H] · sin Lat
where N is the radius of curvature in the prime vertical, e is the first eccentricity, and Lon, Lat and H are the longitude, latitude and elevation respectively, yielding the space coordinate (X_c, Y_c, Z_c) of the centroid of region P.
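Step (4) is the standard geodetic-to-ECEF conversion; the sketch below uses the equatorial and polar radii quoted in the description, with the eccentricity term in its standard form (an assumption insofar as the patent's exact formula is not reproduced here):

```python
import math

# Radii used in the description: equatorial Ri = 6378137 m, polar Rj = 6356725 m.
RI, RJ = 6378137.0, 6356725.0
E2 = 1.0 - (RJ / RI) ** 2            # first eccentricity squared

def geodetic_to_ecef(lat_deg: float, lon_deg: float, h_m: float):
    """Geodetic -> ECEF with N the prime-vertical radius of curvature:
      X = (N+H) cos(lat) cos(lon)
      Y = (N+H) cos(lat) sin(lon)
      Z = (N(1-e^2)+H) sin(lat)"""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = RI / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + h_m) * math.cos(lat) * math.cos(lon)
    y = (n + h_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h_m) * math.sin(lat)
    return x, y, z

x, y, z = geodetic_to_ecef(0.0, 0.0, 0.0)     # equator, prime meridian
x2, y2, z2 = geodetic_to_ecef(90.0, 0.0, 0.0) # north pole
```

Sanity checks: on the equator the X coordinate equals the equatorial radius, and at the pole the Z coordinate equals the polar radius.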
Step 6: Fly the UAV above the space coordinate (X_c, Y_c, Z_c) and land vertically.

Claims (1)

1. A UAV natural landmark identification and autonomous landing method, characterized by comprising the following steps:
1.1 According to the given pre-landing coordinate (X0, Y0, Z0), determine on the satellite digital map a landing region P whose contour is a convex polygon; first apply filtering, graying and binarization to the image of region P, then extract edge features, reject scattered noise points, and retain the dominant edge features of the region; finally extract continuous geometric curves by the Hough transform, obtaining contour curve I of region P on the satellite digital map and the reference image A, wherein the binarization uses the maximum between-class variance method;
1.2 Fly the UAV above the given pre-landing coordinate and capture an aerial image; apply filtering, graying and binarization to the aerial image, then extract edge features, reject scattered noise points, and retain the dominant edge features of the region; finally extract continuous geometric curves by the Hough transform, obtaining contour curve II of region P in the aerial image and the measured image B, wherein the binarization likewise uses the maximum between-class variance method;
1.3 Match the reference image A obtained in step 1.1 against the measured image B obtained in step 1.2 to confirm the landing region P in the UAV's aerial image; image matching uses a weighted Hausdorff distance matching algorithm, comprising the following steps:
1.3.1 In reference image A and measured image B, apply the 3-4 distance transform (3-4DT) to the feature point sets in two-dimensional space, obtaining the image distance transform matrices J_A and J_B;
1.3.2 Extract the branch points in reference image A and measured image B and store them in matrices A and B respectively;
1.3.3 Compute the weighted Hausdorff distance from J_A, J_B, A and B:
h_WHD(A, B) = (1/N_a) · Σ_{a∈A} d(a, B)
H(A, B) = max(h_WHD(A, B), h_WHD(B, A))
wherein A and B are the two point sets; N_a is the number of feature points in point set A; a is a feature point belonging to A; d(a, B) is the distance from feature point a of point set A to point set B; h_WHD(A, B) is the directed distance from point set A to point set B; and h_WHD(B, A) is the directed distance from point set B to point set A;
the point with the minimum Hausdorff distance is the final match point, which yields preliminary position information;
1.3.4 Put all matched point pairs into one-to-one correspondence using the least-squares algorithm to obtain more accurate position information;
1.4 Establish a two-dimensional plane rectangular coordinate system with the UAV camera as the origin, and compute by Green's theorem the coordinate (x_c, y_c) of the centroid of region P in the aerial image relative to the UAV;
1.5 Compute the coordinate (X_c, Y_c, Z_c) of the centroid of region P from the projection relation, specifically:
1.5.1 Compute the ground sample distance GSD:
GSD = p · H / f
wherein GSD is the ground sample distance (m); f is the lens focal length (mm); p is the pixel size of the image sensor (mm); and H is the UAV's flying height (m);
1.5.2 Compute the true ground distance along the image diagonal; from the image width w and height h, the ground distance L along the image diagonal is
L = GSD · √(w² + h²)
wherein GSD is the ground sample distance (m); w is the image width; and h is the image height;
1.5.3 From the longitude and latitude of the image center point and the distance D and bearing θ0 of the centroid of region P relative to the center point, obtain the geographic coordinate of the centroid of region P:
Lat_c = Lat_a + (D · cos θ0 / R_j) · (180/π)
Lon_c = Lon_a + (D · sin θ0 / (R_i · cos Lat_a)) · (180/π)
wherein θ0 ∈ (0, 2π); Lon_a is the longitude of the image center point; Lat_a is the latitude of the image center point; R_i is the equatorial radius, taken as 6378137 m; and R_j is the polar radius, taken as 6356725 m;
1.5.4 Convert the geographic coordinate to a space coordinate, obtaining the space coordinate (X_c, Y_c, Z_c) of the centroid of region P:
X_c = (N + H) · cos Lat · cos Lon
Y_c = (N + H) · cos Lat · sin Lon
Z_c = [N(1 − e²) + H] · sin Lat
wherein N is the radius of curvature in the prime vertical; e is the first eccentricity of the reference ellipsoid; Lon is the longitude; Lat is the latitude; and H is the elevation;
1.6 Fly the UAV above the space coordinate (X_c, Y_c, Z_c) and land vertically.
CN201811213147.3A 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method Active CN109460046B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811213147.3A CN109460046B (en) 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811213147.3A CN109460046B (en) 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method

Publications (2)

Publication Number Publication Date
CN109460046A true CN109460046A (en) 2019-03-12
CN109460046B CN109460046B (en) 2021-08-06

Family

ID=65607782

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811213147.3A Active CN109460046B (en) 2018-10-17 2018-10-17 Unmanned aerial vehicle natural landmark identification and autonomous landing method

Country Status (1)

Country Link
CN (1) CN109460046B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110968112A (en) * 2019-12-12 2020-04-07 哈尔滨工程大学 Unmanned aerial vehicle autonomous landing system and method based on monocular vision
CN111324145A (en) * 2020-02-28 2020-06-23 厦门理工学院 Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium
CN111626260A (en) * 2020-06-05 2020-09-04 贵州省草业研究所 Aerial photo ground object feature point extraction method based on unmanned aerial vehicle remote sensing technology
CN112419374A (en) * 2020-11-11 2021-02-26 北京航空航天大学 Unmanned aerial vehicle positioning method based on image registration
CN114998773A (en) * 2022-08-08 2022-09-02 四川腾盾科技有限公司 Characteristic mismatching elimination method and system suitable for aerial image of unmanned aerial vehicle system
CN115526896A (en) * 2021-07-19 2022-12-27 中核利华消防工程有限公司 Fire prevention and control method and device, electronic equipment and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100095665A * 2009-02-12 2010-09-01 Hanyang University Industry-University Cooperation Foundation Automatic landing method, landing apparatus of scanning probe microscope and scanning probe microscope using the same
CN103424126A * 2013-08-12 2013-12-04 Xidian University System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle
CN105000194A * 2015-08-13 2015-10-28 史彩成 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark
CN105550994A * 2016-01-26 2016-05-04 Hohai University Satellite-image-based rapid approximate stitching method for unmanned aerial vehicle images
CN107063261A * 2017-03-29 2017-08-18 Northeastern University Multi-feature-information landmark detection method for precise UAV landing

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100095665A * 2009-02-12 2010-09-01 Hanyang University Industry-University Cooperation Foundation Automatic landing method, landing apparatus of scanning probe microscope and scanning probe microscope using the same
CN103424126A * 2013-08-12 2013-12-04 Xidian University System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle
CN105000194A * 2015-08-13 2015-10-28 史彩成 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark
CN105550994A * 2016-01-26 2016-05-04 Hohai University Satellite-image-based rapid approximate stitching method for unmanned aerial vehicle images
CN107063261A * 2017-03-29 2017-08-18 Northeastern University Multi-feature-information landmark detection method for precise UAV landing

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Yu et al.: "Vision-based landmark recognition method for UAV autonomous landing", Application Research of Computers *
Chen Yong et al.: "Design and research of a novel landmark for UAV autonomous landing", Journal of University of Electronic Science and Technology of China *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110968112A (en) * 2019-12-12 2020-04-07 哈尔滨工程大学 Unmanned aerial vehicle autonomous landing system and method based on monocular vision
CN111324145A (en) * 2020-02-28 2020-06-23 厦门理工学院 Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium
CN111324145B (en) * 2020-02-28 2022-08-16 厦门理工学院 Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium
CN111626260A (en) * 2020-06-05 2020-09-04 贵州省草业研究所 Aerial photo ground object feature point extraction method based on unmanned aerial vehicle remote sensing technology
CN112419374A (en) * 2020-11-11 2021-02-26 北京航空航天大学 Unmanned aerial vehicle positioning method based on image registration
CN112419374B (en) * 2020-11-11 2022-12-27 北京航空航天大学 Unmanned aerial vehicle positioning method based on image registration
CN115526896A (en) * 2021-07-19 2022-12-27 中核利华消防工程有限公司 Fire prevention and control method and device, electronic equipment and readable storage medium
CN114998773A (en) * 2022-08-08 2022-09-02 四川腾盾科技有限公司 Characteristic mismatching elimination method and system suitable for aerial image of unmanned aerial vehicle system

Also Published As

Publication number Publication date
CN109460046B (en) 2021-08-06

Similar Documents

Publication Publication Date Title
CN109460046A (en) A kind of unmanned plane identify naturally not with independent landing method
CN106054929B (en) A kind of unmanned plane based on light stream lands bootstrap technique automatically
CN109992006B (en) A kind of accurate recovery method and system of power patrol unmanned machine
CN106647814B (en) A kind of unmanned plane vision auxiliary positioning and flight control system and method based on the identification of two dimensional code terrestrial reference
US10778967B2 (en) Systems and methods for improving performance of a robotic vehicle by managing on-board camera defects
CN107544550B (en) Unmanned aerial vehicle automatic landing method based on visual guidance
Martínez et al. On-board and ground visual pose estimation techniques for UAV control
CN103822635B Real-time computation of UAV flight spatial position based on visual information
CN103411609B Aircraft return-route planning method based on online mapping
WO2018035835A1 (en) Methods and system for autonomous landing
CN105644785B UAV landing method based on optical flow and horizon detection
CN110569838A (en) Autonomous landing method of quad-rotor unmanned aerial vehicle based on visual positioning
CN111492326A (en) Image-based positioning for unmanned aerial vehicles and related systems and methods
CN110221625A Autonomous landing guidance method for precise UAV positioning
CN110222612A Dynamic target recognition and tracking for UAV autonomous landing
CN111024072B (en) Satellite map aided navigation positioning method based on deep learning
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
CN108426576A Aircraft path planning method and system based on marker-point visual navigation and SINS
CN110058604A Computer-vision-based precision UAV landing system
CN110083177A Vision-based landing quadrotor UAV and control method
CN107063261A Multi-feature-information landmark detection method for precise UAV landing
CN114815871A (en) Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform
CN116578035A (en) Rotor unmanned aerial vehicle autonomous landing control system based on digital twin technology
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
Andert et al. Improving monocular SLAM with altimeter hints for fixed-wing aircraft navigation and emergency landing

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant