GB2302318A - Aircraft landing procedure - Google Patents

Aircraft landing procedure

Info

Publication number
GB2302318A
GB2302318A (Application GB9611536A)
Authority
GB
United Kingdom
Prior art keywords
landing
aircraft
field
sensor
marks
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9611536A
Other versions
GB2302318B (en)
GB9611536D0 (en)
Inventor
Uwe Georg Hingst
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bodenseewerk Geratetechnik GmbH
Original Assignee
Bodenseewerk Geratetechnik GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bodenseewerk Geratetechnik GmbH filed Critical Bodenseewerk Geratetechnik GmbH
Publication of GB9611536D0 publication Critical patent/GB9611536D0/en
Publication of GB2302318A publication Critical patent/GB2302318A/en
Application granted granted Critical
Publication of GB2302318B publication Critical patent/GB2302318B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/02 Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
    • G08G 5/025 Navigation or guidance aids
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64F GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F 1/00 Ground or aircraft-carrier-deck installations
    • B64F 1/18 Visual or acoustic landing aids
    • B64F 1/20 Arrangement of optical beacons
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/04 Control of altitude or depth
    • G05D 1/06 Rate of change of altitude or depth
    • G05D 1/0607 Rate of change of altitude or depth specially adapted for aircraft
    • G05D 1/0653 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D 1/0676 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0047 Navigation or guidance aids for a single aircraft
    • G08G 5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0073 Surveillance aids
    • G08G 5/0078 Surveillance aids for monitoring traffic from the aircraft

Description

Landing Procedure for Unmanned Aircraft

The invention relates to a landing procedure for aircraft which comprise a flight guidance system, an image resolving sensor and an image processing system.
Unmanned aircraft are used for the purpose of aerial reconnaissance. Such unmanned reconnaissance aircraft are often equipped with an image resolving sensor and an image processing system, in order to be able to recognize targets at night or in poor visibility as well.
One problem is the landing of such unmanned aircraft. Usually, no normal airfield is available for this purpose. According to the prior art, landing of such unmanned aircraft is effected by means of a parachute or of impact-damping air cushions.
More complex, vertical take-off, unmanned aircraft are known. Such aircraft require so much weight and space for the propulsion system that the range and payload of the aircraft are substantially limited.
Other unmanned aircraft are landed by means of radio remote control. This requires highly qualified ground personnel.
The landing requires a sufficiently large landing strip.
Furthermore, landing can take place in daylight only.
It is an object of the invention to provide a landing procedure for aircraft, in particular unmanned aircraft of the type described hereinbefore, which permits landing of the aircraft on an easily prepared landing field and proceeds automatically and without high requirements on the qualification of the ground personnel.
According to the invention, this object is achieved with an aircraft of the type mentioned at the beginning by the following steps:
(a) marking a landing field at well-defined relative positions by marks which can be recognized by the sensor,
(b) determining the positions of these marks within the field of view detected by the sensor,
(c) determining approach parameters by means of the image processing system from the positions of the marks within the field of view, and
(d) applying the approach parameters thus determined to the flight guidance system for guiding the aircraft to land on the landing field.
Thus, according to the invention, the aircraft lands normally, for example on a meadow. The aircraft may land on skids. The landing field is marked, and the marking is detected by the image resolving sensor, such as an IR video camera.
By means of the image processing system, the position and attitude of the aircraft (altitude, elevation and azimuth) relative to the landing field and relative to a touch-down point can be determined. The flight guidance system controls the azimuth angle to zero. Then the elevation relative to the touch-down point provides the glide path angle. This elevation angle is then maintained by the control as the glide path angle, whereby the aircraft is guided to the touch-down point. In many cases, the unmanned aircraft is provided with an image resolving sensor anyway for aerial reconnaissance purposes. This sensor can then also be utilized for the landing procedure.
If the landing field is marked by infrared-emitting marks and the sensor of the aircraft responds to infrared radiation, it is possible to land in darkness. The landing field is not visible to the enemy, unless the enemy also has an infrared-sensitive sensor.
Modifications of the invention are the subject matter of the sub-claims.
An embodiment of the invention is described hereinbelow with reference to the accompanying drawings.
Fig. 1 is a schematic-perspective illustration and shows the guiding of the unmanned aircraft towards the landing direction.
Fig. 2 is a plan view of the landing field and shows the meaning of the various designations.
Fig. 3 shows an unmanned aircraft approaching at an angle with respect to the longitudinal direction of the landing field.
Fig. 4 shows the image detected by the infrared-sensitive sensor arranged in the aircraft when the aircraft has the attitude and position illustrated in Fig. 3, with the quantities used in the image processing added to the picture.
Fig. 5 shows a side elevation of the landing field and of the aircraft during its approach.
Fig. 6 is a block diagram and illustrates schematically the sensor, the image processing system and the flight guidance system.
Fig. 7 shows the image of the landing field detected by the sensor at a distance of 900 meters.
Fig. 8 shows the image of the landing field detected by the sensor at a distance of 200 meters.
Fig. 9 shows the image of the landing field detected by the sensor at a distance of 25 meters.
Fig. 1 shows a landing field 10 embedded in a landscape. The landing field is an elongated rectangle and is marked out by four marks 14, 16, 18 and 20 on a substantially plane area 12, for example a meadow or a field. The marks 14, 16, 18 and 20 are heated plates or hemispheres, which emit infrared radiation but no visible light. The two marks 14 and 16 define the approach-side narrow side of the landing field. Numeral 22 designates a longitudinal center line of the landing field 10. A fifth mark 24 in the form of a landing-T or a landing cross is placed on the longitudinal center line halfway between the two approach-side marks 14 and 16.
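To make this geometry concrete, the following sketch (Python; the class name, the field-fixed frame convention and all identifiers are illustrative assumptions, not part of the patent) shows how the field layout might be parameterized, using the 15 m x 100 m field of Figs. 7 to 9 as an example.

```python
from dataclasses import dataclass

@dataclass
class LandingField:
    """Rectangular landing field marked by four infrared-emitting marks plus
    a landing-T, in a field-fixed frame: x along the longitudinal center
    line 22 (positive in the landing direction), y to the right, origin at
    the landing-T (mark 24)."""
    B_L: float  # width of the approach-side narrow side (marks 14-16), metres
    b_L: float  # width of the far narrow side (marks 18-20), metres
    L_L: float  # length of the field along the center line, metres

    def mark_positions(self):
        """Positions of the five marks in the field-fixed frame."""
        return {
            14: (0.0, -self.B_L / 2),
            16: (0.0, +self.B_L / 2),
            18: (self.L_L, -self.b_L / 2),
            20: (self.L_L, +self.b_L / 2),
            24: (0.0, 0.0),  # landing-T on the center line, touch-down point
        }

# The 15 m x 100 m field used for the images of Figs. 7 to 9 (B_L = b_L here):
field = LandingField(B_L=15.0, b_L=15.0, L_L=100.0)
```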
An unmanned aircraft 26, for example an unmanned aerial reconnaissance aircraft, flies along a trajectory 28. The aircraft 26 is steered up to a point "X" by the navigation means provided for its mission such as satellite navigation (GPS), inertial navigation or directional radio beacon. At the point "X", an image resolving sensor provided in the aircraft detects the landing field 10. By means of an image processing system also provided in the aircraft, the aircraft 26 is guided to the point "Y" approximately in the direction of landing. The touch-down point can be recognized by means of the landing-T 24, which is arranged at the approach-side narrow side of the landing field 10. The landing direction results from the extension of the longitudinal center line passing through the landing-T.
Fig. 2 shows a plan view of the landing field 10. The length A - B of the approach-side narrow side of the landing field between the marks 14 and 16 is BL. The length D - C of the narrow side of the landing field remote from the approach side, between the marks 18 and 20, is bL. In the present case, BL = bL. The length of the longitudinal sides of the landing field 10 is LL.
Fig. 3 is a perspective illustration of the landing field 10 and of the unmanned aircraft 26. The aircraft 26 is located on a line of sight 30 as viewed from the landing-T 24, this line of sight forming an angle α with the longitudinal center line 22. The projection of the line of sight 30 onto the horizontal plane forms an angle α' with the longitudinal center line 22.
Fig. 4 shows the image which is "seen" by the image resolving sensor of the aircraft 26 in this configuration. This image is a perspective one: the two parallel longitudinal sides A - D and B - C of the rectangle and the longitudinal center line 22 intersect, in the perspective, at a vanishing point 32. The image of the approach-side narrow side A - B is longer than the image of the narrow side D - C remote from the approach side. The length of the former image is Bi, the length of the latter image is bi. The distance of the images of the two narrow sides, which appear substantially parallel, measured normal to the direction of these images, is designated by hi.
Furthermore, the line of sight from the aircraft to the landing-T 24 has been added in Fig. 4 as a dotted line.
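Assuming the image processing system has already located the four corner marks as pixel coordinates, the image quantities Bi, bi and hi of Fig. 4 could be measured as in the following sketch (function and variable names are assumptions for illustration).

```python
import numpy as np

def image_measurements(p14, p16, p18, p20):
    """Derive the image-plane quantities of Fig. 4 from the pixel positions
    of the four corner marks (each given as a numpy array [u, v]).

    Returns B_i (apparent width of the approach-side narrow side A-B),
    b_i (apparent width of the far narrow side D-C) and h_i (distance
    between the two narrow sides, measured normal to them)."""
    near = p16 - p14                      # image of side A - B
    far = p20 - p18                       # image of side D - C
    B_i = float(np.linalg.norm(near))
    b_i = float(np.linalg.norm(far))

    # Midpoints of the two (substantially parallel) narrow sides; their
    # separation projected onto the unit normal of the near side gives h_i.
    mid_near = 0.5 * (p14 + p16)
    mid_far = 0.5 * (p18 + p20)
    normal = np.array([-near[1], near[0]]) / B_i
    h_i = float(abs(np.dot(mid_far - mid_near, normal)))
    return B_i, b_i, h_i
```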
It can be recognized from the oblique position of the longitudinal center line 22 in the image of Fig. 4 that the unmanned aircraft 26 is located laterally of this longitudinal center line 22. From this, the image processing system can derive a steering signal which is applied to the flight guidance system and which takes the aircraft into a vertical plane containing the longitudinal center line 22 and keeps it in this plane.
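One possible way to turn the oblique center line into a numerical steering cue is sketched below; the blend of line tilt and landing-T offset, and the gains, are illustrative assumptions rather than the patent's actual signal.

```python
import numpy as np

def lateral_cue(p_T, p_vanishing, image_width):
    """Steering cue from the image: zero when the center line 22 runs
    vertically through the middle of the image, i.e. when the aircraft
    lies in the vertical plane containing the center line.

    p_T          -- pixel position [u, v] of the landing-T (mark 24)
    p_vanishing  -- pixel position of the vanishing point 32
    image_width  -- horizontal image size in pixels
    """
    d = p_vanishing - p_T
    # Tilt of the imaged center line; 0 when vertical (image v grows downward,
    # so the vanishing point lies above the landing-T in the image).
    tilt = np.arctan2(d[0], -d[1])
    # Normalised horizontal offset of the landing-T from the image center.
    offset = (p_T[0] - image_width / 2.0) / (image_width / 2.0)
    return 0.7 * tilt + 0.3 * offset      # blend and gains are illustrative
```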
This yields a situation as illustrated in Fig.5.
The aircraft 26 is located substantially in a vertical plane passing through the longitudinal center line 22 of the landing field 10. Its altitude above ground is H. The horizontal distance of the aircraft 26 from the landing-T 24 is Xu. The elevation angle at which the aircraft is seen from the landing-T, and thus the glide path angle of the glide path 34 which the aircraft 26 has to follow in order to touch down at the landing-T 24, is designated by β. These parameters can be determined from the image (Fig. 4) of the known landing field 10 as received by the sensor. They are given by:
$$X_u = \frac{b_i\,L_L}{B_i - b_i}, \qquad H = \frac{h_i\,B_L\,(X_u + L_L)}{B_i\,L_L}, \qquad \beta = \arctan\!\left(\frac{h_i\,B_L}{b_i\,L_L}\right)$$

These values are applied to the flight guidance system of the unmanned aircraft 26. The flight guidance system keeps the glide path angle β constant through the thrust setting and/or spoilers. Influences of head wind or tail wind are thereby automatically taken into account by the thrust setting or the extension of spoilers. In this way, the aircraft 26 is guided to the landing-T 24, where it touches down on its skids. Prior to reaching the landing-T 24, a flaring procedure under the control of altitude and distance information can be initiated in order to achieve a softer touch-down of the aircraft on the landing field.
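Read together with the definitions of Fig. 2 and Fig. 4, these relations can be evaluated directly. The sketch below implements the reconstructed formulas under the embodiment's assumption BL = bL; the pixel values in the example are invented for illustration.

```python
import math

def approach_parameters(B_i, b_i, h_i, B_L, L_L):
    """Horizontal distance to the touch-down point, altitude and glide path
    angle from the image quantities of Fig. 4 and the pre-mission field
    dimensions (assumes B_L = b_L as in the described embodiment)."""
    X_u = b_i * L_L / (B_i - b_i)                 # distance to the landing-T
    H = h_i * B_L * (X_u + L_L) / (B_i * L_L)     # altitude above the field
    beta = math.atan(h_i * B_L / (b_i * L_L))     # glide path angle
    return X_u, H, beta

# Invented pixel measurements roughly matching the 900 m geometry of Fig. 7:
X_u, H, beta = approach_parameters(B_i=10.0, b_i=9.0, h_i=6.3, B_L=15.0, L_L=100.0)
print(f"X_u = {X_u:.1f} m, H = {H:.1f} m, glide path angle = {math.degrees(beta):.1f} deg")
```

With these numbers the aircraft would be roughly at the 900 m geometry of Fig. 7. Note that only bi, hi and the pre-mission constants BL and LL enter the glide path angle, which is why the relation remains usable in the end phase of the approach shown in Fig. 9.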
Fig. 6 is a block diagram and illustrates the sensor 38, the image processing system 40 and the flight guidance system 42.
The sensor 38 is an image resolving sensor responding to infrared radiation, similar to a video camera. The image processing system 40 is designed to recognize and identify the marks in the field of view and to determine in the image both the distances of the marks thus recognized and the direction of the longitudinal center line 22. The direction of the longitudinal center line provides a measure of the lateral deviation of the aircraft 26 from the vertical plane passing through the longitudinal center line 22. The flight guidance system, through the rudder, controls the trajectory and the attitude of the aircraft such that, in the image detected by the sensor, the angle α becomes substantially zero and the longitudinal center line 22 extends in the middle of the image of Fig. 4 and substantially vertically. Furthermore, the image processing system 40 provides the glide path angle β. The flight guidance system then keeps this angle β constant through the thrust setting and the spoilers.
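A minimal sketch of this control logic follows; the proportional laws, gains and sign conventions are assumptions for illustration only, not the actual control laws of the flight guidance system 42.

```python
def guidance_commands(alpha, beta, beta_ref, k_rudder=0.8, k_thrust=0.5):
    """One step of the guidance logic sketched in Fig. 6.

    alpha    -- angle between the line of sight to the landing-T and the
                center line 22, as measured in the image (rad)
    beta     -- current glide path angle from the image processing system (rad)
    beta_ref -- glide path angle captured when the approach was established (rad)

    The rudder command drives alpha towards zero (aircraft into the vertical
    plane of the center line); the thrust/spoiler command holds the glide
    path angle constant, which also absorbs head or tail wind."""
    rudder_cmd = -k_rudder * alpha
    thrust_cmd = -k_thrust * (beta - beta_ref)   # signs and gains illustrative only
    return rudder_cmd, thrust_cmd
```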
Figs. 7 to 9 show the images of a landing field having dimensions of 15 meters x 100 meters consecutively detected by the image resolving sensor during the landing at distances of 900 meters, 200 meters and 25 meters, respectively. It can be seen from Fig. 9 that during the end phase of the landing approach, the sensor no longer detects the whole landing field 10, but that the front corners are cut off by the frame limits. In the above equation for the elevation angle β, however, only the image quantities bi and hi appear, i.e. the width of the rear narrow side of the landing field as it appears in the image and the length of the landing field, measured along the longitudinal center line 22, as it appears in the image. These quantities can still be gathered even from Fig. 9. BL and LL are known parameters inputted prior to the mission. Therefore, the landing procedure described operates also during the end phase of the landing approach, when the sensor no longer "sees" all of the landing field.
This indicates that only three marks are sufficient for marking the landing field to guide the aircraft 26 along the glide path 34, namely the two rear marks 18 and 20 and the landing-T 24. Instead of the landing-T, an ordinary mark may be used which is identical in design with the marks 14 to 20.
The described landing procedure permits the use of unmanned aircraft also in darkness or poor visibility.

Claims (8)

Claims
1. A landing procedure for aircraft (26) which comprise a flight guidance system (42), an image resolving sensor (38) and an image processing system (40), comprising the steps of
(a) marking a landing field (10) at well-defined relative positions by marks (14,16,18,20,24) which can be recognized by the sensor (38),
(b) determining the positions of these marks (14,16,18,20,24) within the field of view detected by the sensor (38),
(c) determining approach parameters by means of the image processing system (40) from the positions of the marks (14,16,18,20,24) within the field of view, and
(d) applying the approach parameters thus determined to the flight guidance system (42) for guiding the aircraft (26) to land on the landing field.
2. A landing procedure as claimed in claim 1, characterized in that the landing field (10) is marked by infrared-emitting marks (14,16,18,20,24), and the sensor (38) of the aircraft (26) responds to infrared radiation.
3. A landing procedure as claimed in claim 1 or 2, characterized in that a rectangle is marked as landing field (10) by four marks (14,16,18,20,24), the dimensions of the rectangle being inputted as parameters into the image processing system (40) of the aircraft (26) prior to starting on the mission.
4. A landing procedure as claimed in claim 3, characterized in that a fifth mark is placed in the middle between the marks (14,16) determining the approach-side narrow side of the rectangle.
5. A landing procedure as claimed in claim 3, characterized in that the fifth mark (24) is designed to form a landing cross or landing-T.
6. A landing procedure as claimed in claim 4 or 5, characterized in that the distance Xu of the aircraft (26) from the touch-down point and the glide path angle β are determined in accordance with the relations

$$X_u = \frac{b_i\,L_L}{B_i - b_i}, \qquad \beta = \arctan\!\left(\frac{h_i\,B_L}{b_i\,L_L}\right)$$

wherein BL is the width of the landing field (10), LL is the length of the landing field (10), Bi is the width of the approach-side narrow side of the landing field (10) in the image detected by the sensor (38), bi is the width of the narrow side of the landing field (10) remote from the direction of approach in the image detected by the sensor, and hi is the distance of the narrow sides in the image detected by the sensor (38).
7. A landing procedure as claimed in any one of claims 1 to 6, characterized in that the aircraft is unmanned.
8. A landing procedure substantially as hereinbefore described with reference to the accompanying drawings.
GB9611536A 1995-06-14 1996-06-03 Landing procedure for unmanned aircraft Expired - Fee Related GB2302318B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE19521600A DE19521600A1 (en) 1995-06-14 1995-06-14 Landing procedures for unmanned aerial vehicles

Publications (3)

Publication Number Publication Date
GB9611536D0 GB9611536D0 (en) 1996-08-07
GB2302318A true GB2302318A (en) 1997-01-15
GB2302318B GB2302318B (en) 1999-09-08

Family

ID=7764342

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9611536A Expired - Fee Related GB2302318B (en) 1995-06-14 1996-06-03 Landing procedure for unmanned aircraft

Country Status (3)

Country Link
DE (1) DE19521600A1 (en)
FR (1) FR2735445B1 (en)
GB (1) GB2302318B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2325578A (en) * 1997-04-04 1998-11-25 Evans & Sutherland Computer Co Camera/lens calibration
GB2329292A (en) * 1997-09-12 1999-03-17 Orad Hi Tec Systems Ltd Camera position sensing system
CN101244765B (en) * 2008-03-14 2010-06-02 南京航空航天大学 Visual guidance for takeoff and landing of airplane in low visibility condition, monitor system and technique thereof
CN101923789A (en) * 2010-03-24 2010-12-22 北京航空航天大学 Safe airplane approach method based on multisensor information fusion
CN104340371A (en) * 2013-07-24 2015-02-11 空中客车营运有限公司 Autonomous and automatic landing method and system
FR3024127A1 (en) * 2014-07-25 2016-01-29 Airbus Operations Sas AUTONOMOUS AUTOMATIC LANDING METHOD AND SYSTEM
WO2017009471A1 (en) * 2015-07-16 2017-01-19 Safran Electronics & Defense Method for automatically assisting with the landing of an aircraft
US9857178B2 (en) 2014-03-05 2018-01-02 Airbus Ds Gmbh Method for position and location detection by means of virtual reference images
WO2018024069A1 (en) * 2016-08-04 2018-02-08 北京京东尚科信息技术有限公司 Method, device and system for guiding unmanned aerial vehicle to land
CN112455705A (en) * 2020-12-04 2021-03-09 南京晓飞智能科技有限公司 Unmanned aerial vehicle autonomous accurate landing system and method

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2835314B1 (en) * 2002-01-25 2004-04-30 Airbus France METHOD FOR GUIDING AN AIRCRAFT IN THE FINAL LANDING PHASE AND CORRESPONDING DEVICE
DE10305993B4 (en) * 2003-02-12 2006-01-05 Deutsches Zentrum für Luft- und Raumfahrt e.V. Surveying procedure for flight and vehicle guidance
DE102014014446A1 (en) 2014-09-26 2016-03-31 Airbus Defence and Space GmbH Redundant determination of position data for an automatic landing system
CN104309811B (en) * 2014-09-28 2016-08-17 中国船舶工业系统工程研究院 A kind of aircraft helps fall system and centering guidance method
CN105068542A (en) * 2015-07-15 2015-11-18 北京理工大学 Rotor unmanned aerial vehicle guided flight control system based on vision
US10220964B1 (en) 2016-06-21 2019-03-05 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration validation before flight
US9972212B1 (en) 2016-06-21 2018-05-15 Amazon Technologies, Inc. Unmanned aerial vehicle camera calibration as part of departure or arrival at a materials handling facility
US9823089B1 (en) 2016-06-21 2017-11-21 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration as part of departure from a materials handling facility
US9969486B1 (en) * 2016-06-21 2018-05-15 Amazon Technologies, Inc. Unmanned aerial vehicle heat sensor calibration
US10032275B1 (en) 2016-06-21 2018-07-24 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration during flight
CN107424440B (en) * 2017-06-06 2023-07-18 中国民用航空总局第二研究所 Aircraft approach landing monitoring system
CN107146475B (en) * 2017-06-06 2023-07-18 中国民用航空总局第二研究所 Ground service system, airborne guiding system and aircraft approach landing guiding system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3610930A (en) * 1968-12-31 1971-10-05 Texas Instruments Inc Independent infrared landing monitor
FR2028600A5 (en) * 1969-01-03 1970-10-09 Thomson Brandt Csf
US4210930A (en) * 1977-11-18 1980-07-01 Henry Richard D Approach system with simulated display of runway lights and glide slope indicator
DE2938853A1 (en) * 1979-09-26 1981-04-09 Vereinigte Flugtechnische Werke Gmbh, 2800 Bremen AREA NAVIGATION SYSTEM FOR AIRCRAFT
DE2944337A1 (en) * 1979-11-02 1982-06-03 Vereinigte Flugtechnische Werke Gmbh, 2800 Bremen ARRANGEMENT FOR THE AUTOMATIC LANDING OF AN AIRCRAFT
US4748569A (en) * 1983-10-17 1988-05-31 Bristow Helicopters Limited Helicopter navigation and location system
US4792904A (en) * 1987-06-17 1988-12-20 Ltv Aerospace And Defense Company Computerized flight inspection system
US4866626A (en) * 1987-09-18 1989-09-12 Egli Werner H Navigation by a video-camera sensed ground array
IL88263A (en) * 1988-11-02 1993-03-15 Electro Optics Ind Ltd Navigation system
GB2272343A (en) * 1992-11-10 1994-05-11 Gec Ferranti Defence Syst Automatic aircraft landing system calibration
ES2194027T3 (en) * 1994-08-19 2003-11-16 Safyan Anatolii Dimitrivich NAVIGATION SYSTEM FOR HELICOPTERS.

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2325578A (en) * 1997-04-04 1998-11-25 Evans & Sutherland Computer Co Camera/lens calibration
GB2325578B (en) * 1997-04-04 2001-11-07 Evans & Sutherland Computer Co Camera/lens calibration apparatus and method
GB2329292A (en) * 1997-09-12 1999-03-17 Orad Hi Tec Systems Ltd Camera position sensing system
CN101244765B (en) * 2008-03-14 2010-06-02 南京航空航天大学 Visual guidance for takeoff and landing of airplane in low visibility condition, monitor system and technique thereof
CN101923789A (en) * 2010-03-24 2010-12-22 北京航空航天大学 Safe airplane approach method based on multisensor information fusion
CN101923789B (en) * 2010-03-24 2011-11-16 北京航空航天大学 Safe airplane approach method based on multisensor information fusion
US9260180B2 (en) 2013-07-24 2016-02-16 Airbus Operations (S.A.S.) Autonomous and automatic landing method and system
CN104340371A (en) * 2013-07-24 2015-02-11 空中客车营运有限公司 Autonomous and automatic landing method and system
CN104340371B (en) * 2013-07-24 2018-04-03 空中客车营运有限公司 Autonomous and automatic landing concept and system
US9857178B2 (en) 2014-03-05 2018-01-02 Airbus Ds Gmbh Method for position and location detection by means of virtual reference images
FR3024127A1 (en) * 2014-07-25 2016-01-29 Airbus Operations Sas AUTONOMOUS AUTOMATIC LANDING METHOD AND SYSTEM
US9939818B2 (en) 2014-07-25 2018-04-10 Airbus Operations (S.A.S.) Method and system for automatic autonomous landing of an aircraft
WO2017009471A1 (en) * 2015-07-16 2017-01-19 Safran Electronics & Defense Method for automatically assisting with the landing of an aircraft
FR3038991A1 (en) * 2015-07-16 2017-01-20 Sagem Defense Securite AUTOMATIC ASSISTANCE METHOD FOR LANDING AN AIRCRAFT
US10175699B2 (en) 2015-07-16 2019-01-08 Safran Electronics And Defense Method for automatically assisting with the landing of an aircraft
WO2018024069A1 (en) * 2016-08-04 2018-02-08 北京京东尚科信息技术有限公司 Method, device and system for guiding unmanned aerial vehicle to land
US11383857B2 (en) 2016-08-04 2022-07-12 Beijing Jingdong Qianshi Technology Co., Ltd. Method, device and system for guiding unmanned aerial vehicle to land
CN112455705A (en) * 2020-12-04 2021-03-09 南京晓飞智能科技有限公司 Unmanned aerial vehicle autonomous accurate landing system and method

Also Published As

Publication number Publication date
GB2302318B (en) 1999-09-08
FR2735445B1 (en) 2000-03-17
DE19521600A1 (en) 1996-12-19
FR2735445A1 (en) 1996-12-20
GB9611536D0 (en) 1996-08-07

Similar Documents

Publication Publication Date Title
GB2302318A (en) Aircraft landing procedure
US7113202B2 (en) Autotiller control system for aircraft utilizing camera sensing
US9401094B2 (en) Method for assisting the piloting of an aircraft on the ground and system for its implementation
EP3078988B1 (en) Flight control system with dual redundant lidar
US5235513A (en) Aircraft automatic landing system
US5716032A (en) Unmanned aerial vehicle automatic landing system
US7818127B1 (en) Collision avoidance for vehicle control systems
US10878709B2 (en) System, method, and computer readable medium for autonomous airport runway navigation
US8494760B2 (en) Airborne widefield airspace imaging and monitoring
US10384801B2 (en) Device for assisting the piloting of a rotorcraft, associated display, and a corresponding method of assisting piloting
EP3274779B1 (en) Path-based flight maneuvering system
EP3077760B1 (en) Payload delivery
EP3077880B1 (en) Imaging method and apparatus
US6965342B2 (en) Method for recognizing and identifying objects
CN111498127A (en) Directional lighting system mounted on an aircraft and associated lighting method
KR102079727B1 (en) Automatic drones landing gear and method using vision recognition
CN204297108U (en) Helicopter obstacle avoidance system
KR102391067B1 (en) Fast autonomous UAV landing system with upright structures and its control strategy
KR102375458B1 (en) Autonomous landing system of UAV on moving platform
KR20180017256A (en) System for combination aviation of manned airplane and uav with portable uav control device
RU2578202C1 (en) Method for helicopter navigation, takeoff and landing
JPH0516894A (en) Landing aid system for unmanned aircraft
CN113295164B (en) Unmanned aerial vehicle visual positioning method and device based on airport runway
US20230196931A1 (en) Method for identifying a landing zone, computer program and electronic device therefor
Hebel et al. Imaging sensor fusion and enhanced vision for helicopter landing operations

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20130603