CN110488848A - Unmanned aerial vehicle vision-guided autonomous landing method and system - Google Patents

Unmanned aerial vehicle vision-guided autonomous landing method and system

Info

Publication number
CN110488848A
CN110488848A (application CN201910783859.7A; granted as CN110488848B)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
pattern
landing point
landing
scene image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910783859.7A
Other languages
Chinese (zh)
Other versions
CN110488848B (en)
Inventor
王荣阳
孙亚
李威
孙红伟
王经典
郭文骏
崔亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Aeronautical Radio Electronics Research Institute
Original Assignee
China Aeronautical Radio Electronics Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Aeronautical Radio Electronics Research Institute
Priority to CN201910783859.7A
Publication of CN110488848A
Application granted
Publication of CN110488848B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/04: Control of altitude or depth
    • G05D 1/06: Rate of change of altitude or depth
    • G05D 1/0607: Rate of change of altitude or depth specially adapted for aircraft
    • G05D 1/0653: Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D 1/0676: Rate of change of altitude or depth specially adapted for aircraft during landing

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a vision-guided autonomous landing method for an unmanned aerial vehicle (UAV), comprising: Step 1: the UAV flies into the effective region above the landing point, and a monocular camera captures target scene images; the target scene image contains an identification pattern, and an inner pattern is located at the center of the identification pattern. Step 2: the target scene images are processed frame by frame, the identification pattern is recognized, features are extracted, and the position of the UAV relative to the landing point is solved; when the computed height of the UAV above the landing point is greater than a threshold height, the feature size of the whole identification pattern is used as the reference for measuring the relative position; when the computed height is lower than the threshold height, the feature size of the inner pattern inside the identification pattern is used as the reference. Step 3: based on the relative position, the UAV is controlled to center itself over the landing point and descend to it at a constant speed. The invention can guide the UAV to land precisely at a designated location.

Description

Unmanned aerial vehicle vision-guided autonomous landing method and system
Technical field
The present invention relates to the technical field of UAV navigation, and in particular to a vision-guided autonomous landing method and system for an unmanned aerial vehicle.
Background art
As UAV technology finds ever wider application in military and civil fields, UAVs with precision landing capability, for example for accurate shipboard landing or fixed-point delivery, have attracted increasing attention. Traditional guidance approaches include inertial guidance, radar guidance and high-precision satellite guidance. The positioning error of inertial navigation accumulates over time, degrading navigation accuracy; radar positioning accuracy is limited and the equipment is complex; high-precision satellite guidance relies on satellite signals, is vulnerable to interference, and its reliability is hard to guarantee in the final descent phase. Vision guidance is a new navigation technique that applies image processing to the images acquired by a camera and, combined with information such as the camera intrinsics and constraint conditions, solves for the attitude and position of the target during motion so as to control the aircraft landing; because of its high accuracy, resistance to interference, passive imaging and low cost, it has received wide attention. Vision measurement is mainly divided into monocular and binocular vision measurement. Binocular vision measurement requires two cameras to be installed on the UAV, places high demands on installation accuracy, is complex to implement, and requires a longer baseline for longer measurement distances, which is difficult to accommodate on small UAVs. Monocular vision measurement needs only one camera and, combined with the dimensional information of the target pattern, can achieve relative positioning; the system structure is simple and the installation requirements are low, which gives it a clear advantage for applications with a designated landing site.
Summary of the invention
To overcome the defects of the prior art, an object of the invention is to provide a vision-guided autonomous landing method for an unmanned aerial vehicle (UAV) and a corresponding vision-guided autonomous landing system. A target scene image is acquired by a monocular camera, and the relative position is solved through image processing, target recognition, accurate feature extraction and state switching control, providing accurate guidance data for the UAV landing.
One object of the invention is achieved through the following technical solution:
A vision-guided autonomous landing method for a UAV comprises the following steps:
Step 1: the UAV flies into the effective region above the landing point, and the monocular camera mounted under the UAV belly captures target scene images; the target scene image contains an identification pattern, and an inner pattern is located at the center of the identification pattern.
Step 2: the target scene images are processed frame by frame, the identification pattern is recognized, features are extracted, and the position of the UAV relative to the landing point is solved. When the computed height of the UAV above the landing point is greater than a threshold height h', the feature size of the whole identification pattern is used as the reference for measuring the relative position when the next target scene frame is recognized; when the computed height of the UAV above the landing point is lower than the threshold height h', the feature size of the inner pattern inside the identification pattern is used as the reference when the next target scene frame is recognized.
Step 3: based on the position of the UAV relative to the landing point, the UAV is controlled to center itself over the landing point and descend to it at a constant speed.
Preferably, the effective region is an inverted truncated cone, whose extent is given by:
h1=r1/tan(θ/2)
h2=r2/tan(θ/2)
where h1 is the height of the upper base of the truncated cone above the landing point, h2 is the height of the lower base above the landing point, r1 is the radius of the upper base, r2 is the radius of the lower base, θ is the angle between the camera's optical axis and the fuselage vertical axis, D is the vertical resolution of the camera, w1 is the side length of the complete identification pattern, w2 is the side length of the inner pattern, and Pmin is the minimum resolution of the identification pattern required for correct vision measurement.
Preferably, the inner pattern carries features that resist shadow occlusion.
Preferably, extracting features means extracting the edge features of the complete identification pattern or of the inner pattern by image processing, and using the actual size of those edge features as the measurement parameter.
Preferably, the threshold height h' is determined as follows:
Another object of the invention is achieved through the following technical solution:
A vision-guided autonomous landing system for a UAV comprises a monocular camera mounted at the bottom of the UAV belly, and an embedded vision processor and a flight control computer mounted inside the UAV, wherein:
after the UAV flies into the effective region above the landing point, the monocular camera captures target scene images and transmits them frame by frame to the embedded vision processor; the target scene image contains an identification pattern, and an inner pattern is located at the center of the identification pattern;
the embedded vision processor processes the target scene images frame by frame, recognizes the identification pattern, extracts features, solves the position of the UAV relative to the landing point, and sends the relative-position data of the UAV and the landing point to the flight control computer in a defined protocol format;
the flight control computer, based on the position of the UAV relative to the landing point, controls the UAV to center itself over the landing point and descend to it at a constant speed.
Preferably, the flight control computer is further configured, under insufficient light, to control a lighting device to illuminate the identification pattern.
The beneficial effect of the invention is that, after the UAV flies above the landing point and enters the effective range, the relative position between the UAV and the landing point is accurately computed by monocular vision guidance and sent to the UAV's flight control computer; under insufficient light the marker pattern is illuminated, so that the UAV can be guided to land precisely at the designated location around the clock.
Brief description of the drawings
Fig. 1 is a flow diagram of the vision-guided autonomous landing method for a UAV according to embodiment one.
Fig. 2 is a schematic diagram of the effective region above the landing point.
Fig. 3 is a schematic layout of the identification pattern.
Fig. 4 is a structural schematic diagram of the vision-guided autonomous landing system for a UAV.
Fig. 5 shows the high-precision satellite and vision measurement data curves for the height Z.
Fig. 6 shows the high-precision satellite and vision measurement data curves in the X direction.
Fig. 7 shows the high-precision satellite and vision measurement data curves in the Y direction.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings and embodiments.
Embodiment one
This embodiment provides a vision-guided autonomous landing method for a UAV, comprising the following steps:
Step 1: the UAV flies into the effective region above the landing point, and the monocular camera mounted under the UAV belly captures target scene images.
As shown in Fig. 2, the effective region is an inverted truncated cone. Let h1 be the height of the upper base of the cone above the landing point, h2 the height of the lower base above the landing point, r1 the radius of the upper base, r2 the radius of the lower base, θ the angle between the camera's optical axis and the fuselage vertical axis, D the vertical resolution of the camera, w1 the side length of the complete identification pattern, w2 the side length of the inner pattern, and Pmin the minimum resolution of the identification pattern required for correct vision measurement. The extent of the effective region is then:
h1=r1/tan(θ/2)
h2=r2/tan(θ/2)
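For illustration only, the following Python sketch evaluates the effective-region boundaries defined by the two formulas above; the numeric values of r1, r2 and θ are made-up examples, not figures taken from the patent.

```python
import math

def effective_region_heights(r1: float, r2: float, theta_deg: float) -> tuple[float, float]:
    """Heights of the upper and lower bases of the inverted truncated cone
    above the landing point, per h = r / tan(theta / 2)."""
    half_angle = math.radians(theta_deg) / 2.0
    h1 = r1 / math.tan(half_angle)   # height of the upper (larger) base
    h2 = r2 / math.tan(half_angle)   # height of the lower (smaller) base
    return h1, h2

# Example with made-up numbers: upper-base radius 5 m, lower-base radius 0.5 m,
# camera angle theta = 60 degrees.
h1, h2 = effective_region_heights(r1=5.0, r2=0.5, theta_deg=60.0)
print(f"effective region: from {h2:.2f} m up to {h1:.2f} m above the landing point")
```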
When the UAV enters the effective region, the target scene image captured by the camera fixed at the bottom of the belly necessarily contains the identification pattern placed at the landing point. As shown in Fig. 3, the identification pattern is a square image with an inner pattern at its center; features resistant to shadow occlusion can be added to the inner pattern, so that recognition and measurement remain correct when an object's shadow falls on the pattern.
Step 2: the target scene images are processed frame by frame, the identification pattern is recognized, features are extracted, and the position of the UAV relative to the landing point is solved.
When the computed height of the UAV above the landing point is greater than the threshold height h', the feature size of the whole identification pattern is used as the reference for measuring the relative position when the next target scene frame is recognized; when the computed height of the UAV above the landing point is lower than the threshold height h', the feature size of the inner pattern inside the identification pattern is used as the reference when the next target scene frame is recognized.
Extracting features means extracting the edge features of the complete identification pattern or of the inner pattern by image processing, and using the actual size of those edge features as the measurement parameter.
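The patent does not spell out the geometric solution, but a standard monocular approach, given the pattern's known physical side length and calibrated camera intrinsics, is to recover depth from the pattern's apparent pixel size and the lateral offsets from the pixel displacement of the pattern center. Below is a minimal sketch under those assumptions (pinhole model, pattern roughly parallel to the image plane); all parameter names and values are hypothetical.

```python
def relative_position(side_px: float, center_px: tuple[float, float],
                      side_m: float, fx: float, fy: float,
                      cx: float, cy: float) -> tuple[float, float, float]:
    """Estimate (X, Y, Z) of the pattern center in the camera frame.

    side_px   -- measured pixel length of one pattern edge
    center_px -- pixel coordinates (u, v) of the pattern center
    side_m    -- known physical side length of the pattern (w1 or w2)
    fx, fy    -- focal lengths in pixels (from calibration)
    cx, cy    -- principal point in pixels (from calibration)
    """
    Z = fx * side_m / side_px        # depth from apparent size
    u, v = center_px
    X = (u - cx) * Z / fx            # lateral offset along the image x axis
    Y = (v - cy) * Z / fy            # lateral offset along the image y axis
    return X, Y, Z

# Example with made-up calibration values: a 0.8 m pattern seen 240 px wide.
print(relative_position(240.0, (700.0, 420.0), 0.8, 1200.0, 1200.0, 640.0, 360.0))
```

The camera-frame result would still have to be rotated into the body or landing-point frame using the known camera mounting angle θ before being used by the flight control computer.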
The threshold height h' is set in advance as the height at which recognition switches to the inner pattern, so that the whole identification pattern does not exceed the camera's field of view as the height decreases. It is computed as follows:
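The patent's formula for h' is not reproduced in this text. Purely to illustrate the switching rule, the sketch below assumes h' is roughly the height at which a pattern of side w1 would fill the camera's vertical field of view, scaled by a safety margin; both that formula and the field-of-view parameter are assumptions, not taken from the patent.

```python
import math

def threshold_height(w1: float, fov_deg: float, margin: float = 1.2) -> float:
    """Assumed form of h': the height at which a pattern of side w1 spans the
    vertical field of view, enlarged by a safety margin (hypothetical)."""
    return margin * w1 / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

def reference_side(height: float, h_prime: float, w1: float, w2: float) -> float:
    """Switching rule from the patent: above h', measure against the whole
    pattern (side w1); at or below h', measure against the inner pattern (w2)."""
    return w1 if height > h_prime else w2

h_prime = threshold_height(w1=0.8, fov_deg=60.0)   # made-up numbers
for h in (6.0, 2.0, 0.5):
    side = reference_side(h, h_prime, w1=0.8, w2=0.2)
    print(f"height {h:.1f} m -> measure against pattern side {side} m")
```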
Step 3: based on the position of the UAV relative to the landing point, the UAV is controlled to center itself over the landing point and descend to it at a constant speed.
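The patent does not specify the control law used for centering and constant-speed descent. The following is only a minimal sketch assuming a simple proportional horizontal correction and a fixed descent rate; all gains, tolerances and behaviors are invented for illustration.

```python
def landing_command(x: float, y: float,
                    kp: float = 0.5, v_descent: float = 0.3,
                    center_tol: float = 0.1) -> tuple[float, float, float]:
    """Velocity command (vx, vy, vz) from the horizontal offsets (x, y) in meters.

    Horizontal axes: proportional correction toward the landing point.
    Vertical axis: descend at a constant rate once roughly centered,
    otherwise hold altitude while centering (assumed behavior).
    """
    vx = -kp * x
    vy = -kp * y
    centered = abs(x) < center_tol and abs(y) < center_tol
    vz = -v_descent if centered else 0.0
    return vx, vy, vz

print(landing_command(0.8, -0.3))    # still centering, no descent yet
print(landing_command(0.05, 0.02))   # centered, constant-rate descent
```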
Embodiment two
As shown in Fig. 4, this embodiment provides a vision-guided autonomous landing system for a UAV, comprising a monocular camera mounted at the bottom of the UAV belly, and an embedded vision processor and a flight control computer mounted inside the UAV. The embedded vision processor is connected to the monocular camera through a video interface (such as USB or DVI) and to the flight control computer through a data interface (such as Ethernet or a serial port).
The focal length of the monocular camera is adjusted to certain fixed values, and the intrinsic parameters are calibrated in advance at these focal lengths. The angle θ between the camera's optical axis and the fuselage vertical axis is known, or can be measured in real time by a mechanical device such as a gimbal and fed back to the UAV flight control equipment. When the UAV flies into the effective region above the landing point, the monocular camera captures target scene images and transmits them frame by frame to the embedded vision processor. The target scene image contains the identification pattern; the effective region and the identification pattern are the same as those described in embodiment one and are not repeated here.
The embedded vision processor processes the target scene images frame by frame, recognizes the identification pattern, extracts features, solves the position of the UAV relative to the landing point, and sends the relative-position data of the UAV and the landing point to the flight control computer in a defined protocol format.
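The protocol format is not specified in the patent; the following is merely a sketch of what such a message from the vision processor to the flight control computer might carry, with all field names and the packing scheme invented for illustration.

```python
import struct
import time
from dataclasses import dataclass

@dataclass
class RelativePositionMsg:
    """Hypothetical message: UAV position relative to the landing point."""
    timestamp_s: float   # time of the measurement
    x_m: float           # lateral offset in meters
    y_m: float           # lateral offset in meters
    z_m: float           # height above the landing point in meters
    valid: bool          # False if the pattern was not recognized in this frame

    def pack(self) -> bytes:
        # Fixed-size little-endian frame: four doubles plus a one-byte flag.
        return struct.pack("<dddd?", self.timestamp_s, self.x_m,
                           self.y_m, self.z_m, self.valid)

msg = RelativePositionMsg(time.time(), 0.12, -0.05, 3.4, True)
print(len(msg.pack()), "bytes on the wire")
```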
When the computed height of the UAV above the landing point is greater than the threshold height h', the feature size of the whole identification pattern is used as the reference for measuring the relative position when the next target scene frame is recognized; when the computed height of the UAV above the landing point is lower than the threshold height h', the feature size of the inner pattern inside the identification pattern is used as the reference when the next target scene frame is recognized.
Extracting features means extracting the edge features of the complete identification pattern or of the inner pattern by image processing, and using the actual size of those edge features as the measurement parameter.
The threshold height h' is set in advance as the height at which recognition switches to the inner pattern, so that the whole identification pattern does not exceed the camera's field of view as the height decreases; it is computed in the same way as in embodiment one.
The flight control computer, based on the position of the UAV relative to the landing point, controls the UAV to center itself over the landing point and descend to it at a constant speed.
Under insufficient light, the flight control computer can also control a lighting device to illuminate the identification pattern, ensuring that the system remains usable at night and in dark environments.
In this example, throughout the descent, with high-precision satellite measurement data as the reference, the vision measurement data are converted through coordinate transformation into the position coordinates (X, Y, Z) of the aircraft's center of mass relative to the landing point, as shown in Figs. 5-7: Fig. 5 gives the high-precision satellite and vision measurement curves for the height Z, Fig. 6 those for the X direction, and Fig. 7 those for the Y direction. As the figures show, the centering accuracy (X and Y directions) is at the centimeter level, and the accuracy in height Z is at the decimeter level.
In summary, the present invention provides a simple, reliable and high-accuracy vision guidance means for the autonomous landing of rotor UAVs, and can support accurate landing of UAVs around the clock.
The above is only a preferred embodiment of the present invention and is not intended to limit it; those skilled in the art may make various modifications and variations. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (7)

1. A vision-guided autonomous landing method for an unmanned aerial vehicle (UAV), comprising the following steps:
Step 1: the UAV flies into the effective region above the landing point, and a monocular camera mounted under the UAV belly captures target scene images; wherein the target scene image contains an identification pattern, and an inner pattern is located at the center of the identification pattern;
Step 2: the target scene images are processed frame by frame, the identification pattern is recognized, features are extracted, and the position of the UAV relative to the landing point is solved; wherein, when the computed height of the UAV above the landing point is greater than a threshold height h', the feature size of the whole identification pattern is used as the reference for measuring the relative position when the next target scene frame is recognized; when the computed height of the UAV above the landing point is lower than the threshold height h', the feature size of the inner pattern inside the identification pattern is used as the reference for measuring the relative position when the next target scene frame is recognized;
Step 3: based on the position of the UAV relative to the landing point, the UAV is controlled to center itself over the landing point and descend to it at a constant speed.
2. The vision-guided autonomous landing method for a UAV according to claim 1, wherein the effective region is an inverted truncated cone whose extent is given by:
h1=r1/tan(θ/2)
h2=r2/tan(θ/2)
wherein h1 is the height of the upper base of the truncated cone above the landing point, h2 is the height of the lower base above the landing point, r1 is the radius of the upper base, r2 is the radius of the lower base, θ is the angle between the camera's optical axis and the fuselage vertical axis, D is the vertical resolution of the camera, w1 is the side length of the complete identification pattern, w2 is the side length of the inner pattern, and Pmin is the minimum resolution of the identification pattern required for correct vision measurement.
3. The vision-guided autonomous landing method for a UAV according to claim 1, wherein the inner pattern carries features that resist shadow occlusion.
4. The vision-guided autonomous landing method for a UAV according to claim 1, wherein extracting features means extracting the edge features of the complete identification pattern or of the inner pattern by image processing, and using the actual size of those edge features as the measurement parameter.
5. The vision-guided autonomous landing method for a UAV according to claim 1, wherein the threshold height h' is given by:
6. A vision-guided autonomous landing system for a UAV, comprising a monocular camera mounted at the bottom of the UAV belly, and an embedded vision processor and a flight control computer mounted inside the UAV, wherein:
after the UAV flies into the effective region above the landing point, the monocular camera captures target scene images and sends them frame by frame to the embedded vision processor; wherein the target scene image contains an identification pattern, and an inner pattern is located at the center of the identification pattern;
the embedded vision processor processes the target scene images frame by frame, recognizes the identification pattern, extracts features, solves the position of the UAV relative to the landing point, and sends the relative-position data of the UAV and the landing point to the flight control computer in a defined protocol format;
the flight control computer, based on the position of the UAV relative to the landing point, controls the UAV to center itself over the landing point and descend to it at a constant speed.
7. The vision-guided autonomous landing system for a UAV according to claim 6, wherein the flight control computer is further configured, under insufficient light, to control a lighting device to illuminate the identification pattern.
CN201910783859.7A 2019-08-23 2019-08-23 Unmanned aerial vehicle vision-guided autonomous landing method and system Active CN110488848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910783859.7A CN110488848B (en) 2019-08-23 2019-08-23 Unmanned aerial vehicle vision-guided autonomous landing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910783859.7A CN110488848B (en) 2019-08-23 2019-08-23 Unmanned aerial vehicle vision-guided autonomous landing method and system

Publications (2)

Publication Number Publication Date
CN110488848A true CN110488848A (en) 2019-11-22
CN110488848B CN110488848B (en) 2022-09-06

Family

ID=68553221

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910783859.7A Active CN110488848B (en) 2019-08-23 2019-08-23 Unmanned aerial vehicle vision-guided autonomous landing method and system

Country Status (1)

Country Link
CN (1) CN110488848B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111412898A (en) * 2020-04-16 2020-07-14 中国建筑股份有限公司 Large-area deformation photogrammetry method based on ground-air coupling
CN113759943A (en) * 2021-10-13 2021-12-07 北京理工大学重庆创新中心 Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system
CN113867387A (en) * 2021-09-27 2021-12-31 中国航空无线电电子研究所 Unmanned aerial vehicle autonomous landing course identification method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011178186A (en) * 2010-02-26 2011-09-15 Mitsubishi Heavy Ind Ltd Landing guide device and method
CN103226356A (en) * 2013-02-27 2013-07-31 广东工业大学 Image-processing-based unmanned plane accurate position landing method
CN106371447A (en) * 2016-10-25 2017-02-01 南京奇蛙智能科技有限公司 Controlling method for all-weather precision landing of unmanned aerial vehicle
CN106502257A (en) * 2016-10-25 2017-03-15 南京奇蛙智能科技有限公司 A kind of unmanned plane precisely lands jamproof control method
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 A kind of Autonomous Landing of UAV method of view-based access control model guiding
CN108919830A (en) * 2018-07-20 2018-11-30 南京奇蛙智能科技有限公司 A kind of flight control method that unmanned plane precisely lands
CN109885084A (en) * 2019-03-08 2019-06-14 南开大学 A kind of multi-rotor unmanned aerial vehicle Autonomous landing method based on monocular vision and fuzzy control
CN113448345A (en) * 2020-03-27 2021-09-28 北京三快在线科技有限公司 Unmanned aerial vehicle landing method and device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011178186A (en) * 2010-02-26 2011-09-15 Mitsubishi Heavy Ind Ltd Landing guide device and method
CN103226356A (en) * 2013-02-27 2013-07-31 广东工业大学 Image-processing-based unmanned plane accurate position landing method
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 A kind of Autonomous Landing of UAV method of view-based access control model guiding
CN106371447A (en) * 2016-10-25 2017-02-01 南京奇蛙智能科技有限公司 Controlling method for all-weather precision landing of unmanned aerial vehicle
CN106502257A (en) * 2016-10-25 2017-03-15 南京奇蛙智能科技有限公司 A kind of unmanned plane precisely lands jamproof control method
CN108919830A (en) * 2018-07-20 2018-11-30 南京奇蛙智能科技有限公司 A kind of flight control method that unmanned plane precisely lands
CN109885084A (en) * 2019-03-08 2019-06-14 南开大学 A kind of multi-rotor unmanned aerial vehicle Autonomous landing method based on monocular vision and fuzzy control
CN113448345A (en) * 2020-03-27 2021-09-28 北京三快在线科技有限公司 Unmanned aerial vehicle landing method and device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
VIDYA SUDEVAN et al.: "Vision based autonomous landing of an Unmanned Aerial Vehicle on a stationary target", 2017 17th International Conference on Control, Automation and Systems (ICCAS) *
吴益超 et al.: "Design and test of a vision guidance system for unmanned helicopter shipboard landing", Electronics Optics & Control (《电光与控制》) *
邢伯阳 et al.: "Autonomous optimized landing of a quadrotor on a moving platform based on composite landmark navigation", Acta Aeronautica et Astronautica Sinica (《航空学报》) *
魏祥灰: "Research on visual detection of the landing area and autonomous landing guidance for UAVs", China Master's Theses Full-text Database, Engineering Science and Technology II (《中国优秀博硕士学位论文全文数据库(硕士)工程科技Ⅱ辑》) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111412898A (en) * 2020-04-16 2020-07-14 中国建筑股份有限公司 Large-area deformation photogrammetry method based on ground-air coupling
CN113867387A (en) * 2021-09-27 2021-12-31 中国航空无线电电子研究所 Unmanned aerial vehicle autonomous landing course identification method
CN113867387B (en) * 2021-09-27 2024-04-12 中国航空无线电电子研究所 Unmanned aerial vehicle autonomous landing course recognition method
CN113759943A (en) * 2021-10-13 2021-12-07 北京理工大学重庆创新中心 Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system

Also Published As

Publication number Publication date
CN110488848B (en) 2022-09-06

Similar Documents

Publication Publication Date Title
CN106054929B (en) A kind of unmanned plane based on light stream lands bootstrap technique automatically
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN106774386B (en) Unmanned plane vision guided navigation landing system based on multiple dimensioned marker
CN105335733B (en) Unmanned aerial vehicle autonomous landing visual positioning method and system
CN110488848A (en) Unmanned plane vision guide it is autonomous drop method and system
CN105501457A (en) Infrared vision based automatic landing guidance method and system applied to fixed-wing UAV (unmanned aerial vehicle)
US20190197908A1 (en) Methods and systems for improving the precision of autonomous landings by drone aircraft on landing targets
WO2010108301A1 (en) Ground-based videometrics guiding method for aircraft landing or unmanned aerial vehicles recovery
WO2012081755A1 (en) Automatic recovery method for an unmanned aerial vehicle
CN104298248A (en) Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle
EP2987001A1 (en) Landing system for an aircraft
EP3392153B1 (en) Method and system for providing docking guidance to a pilot of a taxiing aircraft
CN109581456A (en) Unmanned plane Laser navigation system based on Position-Sensitive Detector
CN109035294B (en) Image extraction system and method for moving target
CN107576329B (en) Fixed wing unmanned aerial vehicle landing guiding cooperative beacon design method based on machine vision
CN108955685A (en) A kind of tanker aircraft tapered sleeve pose measuring method based on stereoscopic vision
CN113759943A (en) Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system
US20110262008A1 (en) Method for Determining Position Data of a Target Object in a Reference System
JP6791387B2 (en) Aircraft, air vehicle control device, air vehicle control method and air vehicle control program
Pollini et al. Experimental evaluation of vision algorithms for formation flight and aerial refueling
KR20190097350A (en) Precise Landing Method of Drone, Recording Medium for Performing the Method, and Drone Employing the Method
KR101537324B1 (en) Automatic carrier take-off and landing System based on image processing
CN113436276B (en) Visual relative positioning-based multi-unmanned aerial vehicle formation method
CN115755950A (en) Unmanned aerial vehicle fixed-point landing method based on laser radar and camera data fusion
CN108319287A (en) A kind of UAV Intelligent hides the system and method for flying object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant