WO2014127607A1 - Vision-based aircraft landing assistance device - Google Patents

Vision-based aircraft landing assistance device

Info

Publication number
WO2014127607A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
runway
angle
aircraft
further characterized
Prior art date
Application number
PCT/CN2013/080265
Other languages
English (en)
French (fr)
Inventor
张国飙
Original Assignee
Zhang Guobiao
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhang Guobiao filed Critical Zhang Guobiao
Publication of WO2014127607A1 publication Critical patent/WO2014127607A1/zh

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 Aircraft indicators or protectors not otherwise provided for
    • B64D45/04 Landing aids; Safety measures to prevent collision with earth's surface
    • B64D45/08 Landing aids; Safety measures to prevent collision with earth's surface, optical
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C5/00 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • G01C5/005 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels; altimeters for aircraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02 Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
    • G08G5/025 Navigation or guidance aids
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems for receiving images from a single remote source, from a mobile camera, e.g. for remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Definitions

  • the present invention relates to the field of aviation and, more particularly, to landing aids for aircraft.
  • Landing is the most challenging part of the flight.
  • When the aircraft enters the ground-effect region, the pilot pulls the nose up to reduce the aircraft's descent rate.
  • This operation is called the flare (leveling off), and the moment and height at which the flare is begun are referred to as the flare time and the flare height, respectively.
  • For a small aircraft, the flare height is generally within 5 m to 10 m above the ground. Because flight students often find it difficult to judge the flare height, they need hundreds of practice landings to master it. Such a large number of landing exercises increases training time, wastes a great deal of fuel, and has a negative impact on the environment.
  • Although radar altimeters or laser altimeters can be used to help with the flare, they are relatively expensive. A low-cost landing aid is preferable for helping flight students master landing skills.
  • US Patent 8,315,748 (inventor: Lee; granted November 20, 2012) proposed a vision-based height-measurement method. It uses a circular marker as a reference for the vertical takeoff and landing of a VTOL aircraft. A camera on the aircraft first acquires an image of the circular marker and then measures the marker's horizontal and vertical diameters in that image. The aircraft's height is then computed from these diameters, the actual diameter of the marker, the distance between the marker and the aircraft's takeoff/landing point, and the aircraft's attitude (i.e., heading angle, pitch angle, and roll angle). For a fixed-wing aircraft, the distance between the circular marker and the aircraft's ground-projection point varies, so this method is not applicable.
  • VTOL: vertical takeoff and landing aircraft
  • the invention proposes a vision based aircraft landing aid. It consists of a camera and a processor.
  • the camera is mounted on the front of the aircraft, facing the runway and capturing a series of original runway images.
  • the processor extracts the roll angle γ from the original runway image. Once γ is obtained, the original runway image is rotated by -γ about its optical origin for γ correction, so that the horizon of the γ-corrected runway image (corrected runway image) becomes horizontal (if the horizon is visible).
  • the image processing after this is performed in the corrected runway image.
  • the horizontal line passing through the optical origin is referred to as the main horizontal line H.
  • the vertical line passing through the optical origin is referred to as the main vertical line V.
  • f is the focal length of the camera.
  • the aircraft landing aid may also include a sensor, such as an inertial sensor (e.g., a gyroscope) or a magnetic-field sensor (e.g., a magnetometer). It can be used to measure attitude angles (such as the pitch angle ρ, the heading angle α, and the roll angle γ).
  • Using attitude angles measured directly by the sensor simplifies the height calculation. For example, the measured roll angle γ can be used directly to rotate the original runway image, and the measured pitch angle ρ and heading angle α can be used directly to compute the height.
  • Using sensor data reduces the processor's workload and speeds up image processing.
  • Vision-based height measurement is especially suitable for installation on a smartphone as an application (app).
  • the smartphone contains all the components needed for this height measurement (camera, sensors, and processor). Since smartphones are ubiquitous, a vision-based aircraft landing aid requires no additional hardware: it suffices to install a 'landing aid' app on the smartphone. Such a software-based aircraft landing aid has the lowest cost.
  • the main beneficial effect of the present invention is to provide a low cost aircraft landing assist device.
  • Another beneficial effect of the present invention is to help flight students master the landing skills.
  • Another beneficial effect of the present invention is to conserve energy resources and improve environmental quality.
  • Figure 1 shows the relative position of an airplane and a runway.
  • Figure 2A - Figure 2C are functional block diagrams of three vision-based aircraft landing aids.
  • Figure 3 illustrates the definition of the roll angle (γ).
  • Figure 4 shows an original runway image.
  • Figure 5 shows a corrected runway image.
  • Figure 6 illustrates the definition of the pitch angle (ρ).
  • Figure 7 illustrates the definition of the heading angle (α).
  • Figure 8 shows a vision-based height-measurement method.
  • Figures 9A-9B show an aircraft landing aid with an orienting function.
  • aircraft 10 is equipped with a vision based landing assist device 20.
  • the device 20 is mounted behind the windshield of the aircraft 10 and faces forward. It can be a camera, a computer or computer-like device with a camera function, or a smartphone. Its optical origin is marked O'.
  • the landing aid 20 uses computer vision to measure its height A above the ground 0. The runway 100 lies on the ground 0 in front of the aircraft. It has a length L and a width W.
  • the ground coordinates are defined as follows: the origin o is the projection of O' onto the ground 0, the x-axis is parallel to the longitudinal axis of the runway 100 (the runway length direction), the y-axis is parallel to the transverse axis of the runway (the runway width direction), and the z-axis is perpendicular to the x-y plane.
  • the z-axis is defined solely by the runway surface and is shared by many of the coordinate systems in this specification.
  • Figures 2A-2C show three vision-based aircraft landing aids 20.
  • the embodiment of Figure 2A contains a camera 30 and a processor 70. It computes the height A from the runway width W and the runway images acquired by the camera 30. The user can obtain the runway width W from an airport information table (Airport Directory) and enter it manually; the landing aid 20 can also retrieve the runway width W electronically from an airport database. The aircraft landing aid 20 can measure the height, predict the aircraft's future height, and give the pilot an indication (e.g., visual and/or audible) before the decision point. For example, two short beeps and one long beep may be sounded two seconds before a landing operation (such as the flare or a pre-landing action). The pilot should get ready during the first two short beeps and perform the operation during the final long beep.
  • Compared with Figure 2A, the embodiment of Figure 2B further includes a sensor 40, such as an inertial sensor (e.g., a gyroscope) or a magnetic-field sensor (e.g., a magnetometer). It can be used to measure attitude angles (such as the pitch angle ρ, the heading angle α, and the roll angle γ).
  • Using attitude angles measured directly by the sensor simplifies the height calculation. For example, the measured roll angle γ can be used directly to rotate the original runway image, and the measured pitch angle ρ and heading angle α can be used directly to compute the height (see Figure 8).
  • Using sensor data reduces the processor's workload and speeds up image processing.
  • the embodiment of Figure 2C is a smartphone 80. It further includes a memory 50, which stores an 'aircraft landing' application (app) 60. By running the 'aircraft landing' app 60, the smartphone 80 can measure the height, predict the aircraft's future height, and instruct the pilot before the decision point.
  • the smartphone contains all the components needed for the height measurement (camera, sensors, and processor) and can easily assist the aircraft in landing. Since smartphones are ubiquitous, a vision-based aircraft landing aid requires no additional hardware: it suffices to install a 'landing aid' app on the smartphone. Such a software-based aircraft landing aid has the lowest cost.
  • Figures 3-5 depict a method of obtaining the roll angle (γ).
  • Figure 3 defines the roll angle (γ) of the camera 30. Since the image sensor 32 of the camera 30 (such as a CCD or CMOS sensor) is rectangular in the image plane 36, the original image coordinates XYZ can be defined as follows: O is the optical origin of the image sensor 32, the X and Y axes are the two centerlines of the rectangle, and the Z axis is perpendicular to the X-Y plane. Here, the line N is perpendicular to both the z-axis and the Z-axis and is always parallel to the runway plane.
  • the roll angle (γ) is defined as the angle between the Y axis and the line N.
  • Rotating the image coordinates XYZ by -γ about the Z axis yields the corrected image coordinates X*Y*Z*.
  • the line N is also the Y* axis of the corrected image coordinates.
  • Figure 4 shows the original runway image 100i acquired by the camera 30. Since the camera 30 has a roll angle γ, the image 120i of the horizon is tilted and makes an angle γ with the Y axis.
  • the image 100i can be γ-corrected by rotating it by -γ about the origin O.
  • Figure 5 shows the γ-corrected runway image (corrected runway image) 100*, whose horizon 120* is horizontal, i.e., parallel to the Y* axis.
  • the horizontal line through its optical origin O (the Y* axis) is called the main horizontal line H, and the vertical line through its optical origin O (the X* axis) is called the main vertical line V.
  • Figures 6-8 analyze the corrected runway image 100* further.
  • Figure 6 defines the pitch angle (ρ) of the camera 30.
  • the optical coordinates X'Y'Z' are formed by translating the corrected image coordinates X*Y*Z* along the Z* axis by the distance f, where f is the focal length of the lens 38.
  • the α-corrected ground coordinates (corrected ground coordinates) x*y*z* are also defined here: their origin o* and z*-axis are the same as those of the ground coordinates xyz, and the x*-axis lies in the same plane as the X' axis.
  • the distance from the optical origin O' of the lens to the ground is the height A.
  • the pitch angle (ρ) is the angle between the Z' axis and the x* axis.
  • Figure 7 defines the heading angle (α) of the camera 30.
  • the figure shows the ground coordinates xyz and the corrected ground coordinates x*y*z*; they are rotated by α about the z-axis. Note that α is defined relative to the longitudinal (length) axis of the runway 100. Although the x-axis is parallel to the runway's longitudinal axis, the corrected ground coordinates x*y*z* are computationally more convenient, so this specification analyzes the runway image in that coordinate system.
  • Figure 8 shows the steps of a height measurement.
  • First, the roll angle γ is extracted from the horizon 120i of the original runway image (Fig. 4, step 210).
  • the original runway image is then rotated by -γ about the optical origin for γ correction (Fig. 5, step 220).
  • the intersection of the left and right runway-edge extensions 160*, 180* is marked P; its coordinates (XP, YP) (XP is the distance from the intersection point P to the main horizontal line H, and YP is the distance from P to the main vertical line V) give the pitch angle ρ = atan(XP/f) (step 230) and the heading angle α = atan[(YP/f)*cos(ρ)] (step 240). The distance Δ between the points at which the edge extensions cross the main horizontal line H then gives the height A = W*sin(ρ)/cos(α)/(Δ/f) (step 250).
  • For those skilled in the art, the steps in Figure 8 may be skipped or reordered.
  • For example, the measured roll angle γ can be used directly to rotate the original runway image (skipping step 210), and the measured pitch angle ρ and heading angle α can be used directly to compute the height (skipping steps 230 and 240).
  • Using sensor data reduces the processor's workload and speeds up image processing.
  • Figures 9A-9B show an aircraft landing aid 20 with an orienting function. It ensures that the horizon in the runway image is always level, eliminating the need for γ correction of the runway image, which simplifies the height calculation.
  • an aircraft landing aid (such as a mobile phone) 20 is placed in an orienter 19.
  • the orienter 19 consists of a cradle 18, a weight 14, and a phone base 12. A bracket 17 is fixed to the aircraft 10, and the cradle 18 is supported on the bracket 17 by a ball bearing 16.
  • the weight 14 ensures that the longitudinal axis of the phone 20 always lies along the direction of gravity (the z direction).
  • the weight 14 preferably contains a metallic material so as to form a damper pair with the magnet 15, helping to stabilize the cradle 18.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention proposes a vision-based aircraft landing assistance device. The device acquires a series of runway images, performs a roll-angle (γ) correction on the original runway image, and computes the aircraft height (A) from the γ-corrected runway image and the runway width (W). The device may also contain a sensor that measures at least one attitude angle and uses that attitude angle to speed up image processing. A smartphone is very well suited for use as the aircraft landing aid.

Description

Vision-based aircraft landing assistance device
Technical Field
The present invention relates to the field of aviation and, more particularly, to landing aids for aircraft.
Background Art
Landing is the most challenging part of a flight. When the aircraft enters the ground-effect region, the pilot pulls the nose up to reduce the aircraft's descent rate. This operation is called the flare (leveling off); the moment and height at which the flare is begun are referred to as the flare time and the flare height, respectively. For a small aircraft, the flare height is generally within 5 m to 10 m above the ground. Because flight students usually find it difficult to judge the flare height, they need hundreds of practice landings to master it. Such a large number of landing exercises increases training time, wastes a great deal of fuel, and has a negative impact on the environment. Although a radar altimeter or a laser altimeter can be used to help with the flare, they are relatively expensive. A low-cost landing aid is preferable for helping flight students master landing skills.
Technical Problem
Prior art has also used computer vision to assist aircraft landing. US Patent 8,315,748 (inventor: Lee; granted November 20, 2012) proposed a vision-based height-measurement method. It uses a circular marker as a reference for the vertical takeoff and landing of a VTOL aircraft. A camera on the aircraft first acquires an image of the circular marker and then measures the marker's horizontal and vertical diameters in that image; the aircraft's height is finally computed from these diameters, the actual diameter of the marker, the distance between the marker and the aircraft's takeoff/landing point, and the aircraft's attitude (i.e., heading angle, pitch angle, and roll angle). For a fixed-wing aircraft, the distance between the circular marker and the aircraft's ground-projection point varies, so this method is not applicable.
Technical Solution
The present invention proposes a vision-based aircraft landing assistance device. It consists of a camera and a processor. The camera is mounted at the front of the aircraft, faces the runway, and acquires a series of original runway images. The processor extracts the roll angle γ from the original runway image. Once γ is obtained, the original runway image is rotated by -γ about its optical origin for γ correction, so that the horizon of the γ-corrected runway image (corrected runway image) becomes horizontal (if the horizon is visible). All subsequent image processing is performed on the corrected runway image. The horizontal line through the optical origin is referred to as the main horizontal line H, and the vertical line through the optical origin is referred to as the main vertical line V. The intersection of the extensions of the left and right runway edges is marked P; its coordinate XP (the distance from the intersection point P to the main horizontal line H) can be used to compute the pitch angle ρ = atan(XP/f), and its coordinate YP (the distance from P to the main vertical line V) can be used to compute the heading angle α = atan[(YP/f)*cos(ρ)]. Here, f is the focal length of the camera. Finally, the distance Δ between the points A and B at which the left and right runway-edge extensions cross the main horizontal line H can be used to compute the aircraft height A = W*sin(ρ)/cos(α)/(Δ/f), where W is the runway width. In addition, the angles θA and θB between the runway-edge extensions and the main horizontal line H can also be used to compute A = W*cos(ρ)/cos(α)/[cot(θA) - cot(θB)].
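The relations above translate directly into a few lines of code. The following is a minimal Python sketch, assuming that XP, YP, Δ, and f are expressed in the same image units (for example, pixels), that W is in metres, and that all angles are in radians; the function names are illustrative and are not taken from the patent.

    import math

    def pitch_from_vanishing_point(x_p, f):
        # rho = atan(XP / f): XP is the distance from the vanishing point P
        # (intersection of the runway-edge extensions) to the main horizontal line H.
        return math.atan(x_p / f)

    def heading_from_vanishing_point(y_p, f, rho):
        # alpha = atan[(YP / f) * cos(rho)]: YP is the distance from P to the main vertical line V.
        return math.atan((y_p / f) * math.cos(rho))

    def height_from_edge_spacing(w_runway, rho, alpha, delta, f):
        # A = W * sin(rho) / cos(alpha) / (Delta / f), where Delta is the spacing of the
        # two runway-edge extensions measured along the main horizontal line H.
        return w_runway * math.sin(rho) / math.cos(alpha) / (delta / f)

    def height_from_edge_angles(w_runway, rho, alpha, theta_a, theta_b):
        # Alternative: A = W * cos(rho) / cos(alpha) / [cot(theta_a) - cot(theta_b)],
        # where theta_a and theta_b are the angles between the edge extensions and H.
        cot_a, cot_b = 1.0 / math.tan(theta_a), 1.0 / math.tan(theta_b)
        return w_runway * math.cos(rho) / math.cos(alpha) / (cot_a - cot_b)

The two height formulas rely on the same measurements of the corrected image, so they should agree up to measurement noise; when a sensor supplies ρ and α directly (as in the Figure 2B embodiment described below), only Δ or (θA, θB) still needs to be taken from the image.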
The aircraft landing aid may also include a sensor, such as an inertial sensor (e.g., a gyroscope) or a magnetic-field sensor (e.g., a magnetometer). It can be used to measure attitude angles (such as the pitch angle ρ, the heading angle α, and the roll angle γ). Using attitude angles measured directly by the sensor simplifies the height calculation. For example, the measured roll angle γ can be used directly to rotate the original runway image, and the measured pitch angle ρ and heading angle α can be used directly to compute the height. Using sensor data reduces the processor's workload and speeds up image processing.
Vision-based height measurement is especially suited to being installed on a smartphone as an application (app). A smartphone contains all the components needed for this height measurement (camera, sensors, and processor). Since smartphones are ubiquitous, a vision-based aircraft landing aid requires no additional hardware: it suffices to install a 'landing aid' app on the smartphone. Such a software-based aircraft landing aid has the lowest cost.
Advantageous Effects
The main advantageous effect of the present invention is to provide a low-cost aircraft landing assistance device.
Another advantageous effect of the present invention is to help flight students master landing skills.
Another advantageous effect of the present invention is to save energy resources and improve environmental quality.
Brief Description of the Drawings
Figure 1 shows the relative positions of an aircraft and a runway.
Figures 2A-2C are functional block diagrams of three vision-based aircraft landing aids.
Figure 3 illustrates the definition of the roll angle (γ).
Figure 4 is an original runway image.
Figure 5 is a corrected runway image.
Figure 6 illustrates the definition of the pitch angle (ρ).
Figure 7 illustrates the definition of the heading angle (α).
Figure 8 shows a vision-based height-measurement method.
Figures 9A-9B show an aircraft landing aid with an orienting function.
Note that these drawings are only schematic and are not drawn to scale. For clarity and convenience, some dimensions and structures in the figures may be exaggerated or reduced. In different embodiments, the same reference numerals generally denote corresponding or similar structures.
Embodiments of the Invention
In the embodiment of Figure 1, the aircraft 10 carries a vision-based landing aid 20. The device 20 is mounted behind the windshield of the aircraft 10 and faces forward. It can be a camera, a computer or computer-like device with a camera function, or a smartphone. Its optical origin is marked O'. The landing aid 20 uses computer vision to measure its height A above the ground 0. The runway 100 lies on the ground 0 in front of the aircraft; its length is L and its width is W. Here, the ground coordinates are defined as follows: the origin o is the projection of O' onto the ground 0, the x-axis is parallel to the longitudinal axis of the runway 100 (the runway length direction), the y-axis is parallel to the transverse axis of the runway (the runway width direction), and the z-axis is perpendicular to the x-y plane. The z-axis is defined solely by the runway surface and is shared by many of the coordinate systems in this specification.
Figures 2A-2C show three vision-based aircraft landing aids 20. The embodiment of Figure 2A contains a camera 30 and a processor 70. It computes the height A from the runway width W and the runway images acquired by the camera 30. The user can obtain the runway width W from an airport information table (Airport Directory) and enter it manually; the landing aid 20 can also retrieve the runway width W electronically from an airport database. The aircraft landing aid 20 can measure the height, predict the aircraft's future height, and give the pilot an indication (e.g., a visual and/or audible indication) before the decision point. For example, two short beeps and one long beep may be sounded two seconds before a landing operation (such as the flare or a pre-landing action). The pilot should get ready during the first two short beeps and perform the operation during the final long beep.
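The specification does not say how the future height is predicted; one simple possibility, sketched below purely for illustration, is to extrapolate the recent descent linearly and to issue the two-short-one-long cue about two seconds before the predicted flare height is reached. The 6 m flare height, the constant-sink-rate assumption, and the function names are assumptions of this sketch, not part of the patent.

    def predict_time_to_height(samples, target_height):
        # samples: list of (time_s, height_m) pairs, most recent last.
        # Returns the estimated time (in seconds) until the aircraft descends to
        # target_height, assuming the sink rate stays roughly constant.
        (t0, h0), (t1, h1) = samples[-2], samples[-1]
        sink_rate = (h0 - h1) / (t1 - t0)        # m/s, positive while descending
        if sink_rate <= 0:
            return None                          # not descending: no prediction
        return (h1 - target_height) / sink_rate

    def maybe_cue_pilot(samples, flare_height=6.0, lead_time=2.0, beep=print):
        # Illustrative cue logic: two short beeps and one long beep, issued once the
        # predicted time to the flare height falls below the lead time.
        eta = predict_time_to_height(samples, flare_height)
        if eta is not None and 0.0 <= eta <= lead_time:
            beep("short"); beep("short"); beep("long")

    # Example: heights measured at 1 Hz while descending at about 1.9 m/s.
    maybe_cue_pilot([(0.0, 11.5), (1.0, 9.6)])   # fires the cue: ETA is about 1.9 s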
Compared with Figure 2A, the embodiment of Figure 2B further includes a sensor 40, such as an inertial sensor (e.g., a gyroscope) or a magnetic-field sensor (e.g., a magnetometer). It can be used to measure attitude angles (such as the pitch angle ρ, the heading angle α, and the roll angle γ). Using attitude angles measured directly by the sensor simplifies the height calculation. For example, the measured roll angle γ can be used directly to rotate the original runway image, and the measured pitch angle ρ and heading angle α can be used directly to compute the height (see Figure 8). Using sensor data reduces the processor's workload and speeds up image processing.
The embodiment of Figure 2C is a smartphone 80. It further includes a memory 50, which stores an 'aircraft landing' application (app) 60. By running the 'aircraft landing' app 60, the smartphone 80 can measure the height, predict the aircraft's future height, and instruct the pilot before the decision point. A smartphone contains all the components needed for the height measurement (camera, sensors, and processor) and can easily assist the aircraft in landing. Since smartphones are ubiquitous, a vision-based aircraft landing aid requires no additional hardware: it suffices to install a 'landing aid' app on the smartphone. Such a software-based aircraft landing aid has the lowest cost.
Figures 3-5 describe a method of obtaining the roll angle (γ). Figure 3 defines the roll angle (γ) of the camera 30. Since the image sensor 32 of the camera 30 (such as a CCD or CMOS sensor) is rectangular in the image plane 36, the original image coordinates XYZ can be defined as follows: the origin O is the optical origin of the image sensor 32, the X and Y axes are the two centerlines of the rectangle, and the Z axis is perpendicular to the X-Y plane. Here, the line N is perpendicular to both the z-axis and the Z-axis and is always parallel to the runway plane. The roll angle (γ) is defined as the angle between the Y axis and the line N. Rotating the image coordinates XYZ by -γ about the Z axis yields the corrected image coordinates X*Y*Z*. Here, the line N is also the Y* axis of the corrected image coordinates.
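A minimal sketch of this γ extraction and γ correction follows, assuming the horizon 120i has already been detected as two image points and adopting the axis convention of Figure 3 (γ is the angle between the imaged horizon and the Y axis); the exact sign convention and the function names are assumptions made here, not specified by the patent.

    import math
    import numpy as np

    def roll_from_horizon(p_left, p_right):
        # p_left, p_right: two points on the imaged horizon, given as (X, Y) in the
        # original image coordinates of Figure 3. A level horizon gives gamma = 0.
        dX = p_right[0] - p_left[0]
        dY = p_right[1] - p_left[1]
        return math.atan2(dX, dY)    # angle of the horizon image relative to the Y axis

    def gamma_correct(points, gamma):
        # Rotate (X, Y) image points by -gamma about the optical origin O, giving
        # coordinates in the corrected image frame X*Y*, in which the horizon is level.
        c, s = math.cos(-gamma), math.sin(-gamma)
        rot = np.array([[c, -s],
                        [s,  c]])
        return np.asarray(points, dtype=float) @ rot.T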
Figure 4 is an original runway image 100i acquired by the camera 30. Because the camera 30 has a roll angle γ, the image 120i of the horizon is tilted and makes an angle γ with the Y axis. Rotating the image 100i by -γ about the origin O performs the γ correction. Figure 5 is the γ-corrected runway image (corrected runway image) 100*, whose horizon 120* is horizontal, i.e., parallel to the Y* axis. In the corrected runway image 100*, the horizontal line through its optical origin O (the Y* axis) is called the main horizontal line H, and the vertical line through its optical origin O (the X* axis) is called the main vertical line V. Figures 6-8 analyze the corrected runway image 100* further.
Figure 6 defines the pitch angle (ρ) of the camera 30. The optical coordinates X'Y'Z' are formed by translating the corrected image coordinates X*Y*Z* along the Z* axis by the distance f, where f is the focal length of the lens 38. An α-corrected (see Figure 7) ground coordinate system, the corrected ground coordinates x*y*z*, is also defined here: its origin o* and z* axis are the same as those of the ground coordinates xyz, and its x* axis lies in the same plane as the X' axis. The distance from the optical origin O' of the lens to the ground (i.e., to the origin o*) is the height A. The pitch angle (ρ) is the angle between the Z' axis and the x* axis. For a point R on the ground 0 with coordinates (x*, y*, 0) in the corrected ground coordinates x*y*z*, the coordinates (X*, Y*, 0) of its image on the image sensor 32 (in the corrected image coordinates X*Y*Z*) can be expressed as: δ = ρ - atan(A/x*), X* = -f*tan(δ), Y* = f*y*/sqrt(x*^2 + A^2)/cos(δ).
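These three relations can be checked with a short forward-projection routine. The sketch below assumes that x*, y*, and A share one unit (for example, metres), that f and the returned image coordinates share another (for example, pixels), and that the point lies ahead of the aircraft (x* > 0); the function name is illustrative only.

    import math

    def project_ground_point(x_star, y_star, A, rho, f):
        # Image of a ground point R = (x*, y*, 0), given in the corrected ground
        # coordinates x*y*z*, on the image sensor 32 (corrected image coordinates):
        #   delta = rho - atan(A / x*)
        #   X*    = -f * tan(delta)
        #   Y*    =  f * y* / sqrt(x*^2 + A^2) / cos(delta)
        delta = rho - math.atan(A / x_star)
        X = -f * math.tan(delta)
        Y = f * y_star / math.sqrt(x_star ** 2 + A ** 2) / math.cos(delta)
        return X, Y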
Figure 7 defines the heading angle (α) of the camera 30. The figure shows the ground coordinates xyz and the corrected ground coordinates x*y*z*; they are rotated by α about the z axis. Note that α is defined relative to the longitudinal (length) axis of the runway 100. Although the x-axis is parallel to the longitudinal axis of the runway 100, the corrected ground coordinates x*y*z* are computationally more convenient, so this specification analyzes the runway image in that coordinate system.
Figure 8 shows the steps of a height measurement. First, the roll angle γ is extracted from the horizon 120i of the original runway image (Figure 4, step 210). Once γ is obtained, the original runway image is rotated by -γ about the optical origin for γ correction (Figure 5, step 220). In the corrected runway image 100*, the intersection of the extensions 160*, 180* of the left and right runway edges is marked P; its coordinates (XP, YP) (XP is the distance from the intersection point P to the main horizontal line H, and YP is the distance from P to the main vertical line V) can be expressed as XP = f*tan(ρ) and YP = f*tan(α)/cos(ρ), from which the pitch angle ρ = atan(XP/f) (Figure 5, step 230) and the heading angle α = atan[(YP/f)*cos(ρ)] (Figure 5, step 240) can be computed.
Finally, the distance Δ between the points A and B at which the left and right runway-edge extensions 160*, 180* cross the main horizontal line H is measured (Figure 5, step 250) and used to compute the aircraft height A = W*sin(ρ)/cos(α)/(Δ/f). In addition, the angles θA and θB between the runway-edge extensions and the main horizontal line H can also be used to compute A = W*cos(ρ)/cos(α)/[cot(θA) - cot(θB)].
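As a quick worked example with made-up numbers (not taken from the patent): for a 30 m wide runway, a focal length of 1000 pixels, a vanishing point at (XP, YP) = (52, 35) pixels, and an edge spacing of Δ = 610 pixels along H, the formulas give roughly the following.

    import math

    f, W = 1000.0, 30.0                               # pixels, metres (illustrative)
    rho   = math.atan(52.0 / f)                       # about 0.052 rad (3.0 degrees)
    alpha = math.atan((35.0 / f) * math.cos(rho))     # about 0.035 rad (2.0 degrees)
    A = W * math.sin(rho) / math.cos(alpha) / (610.0 / f)
    print(round(A, 1))                                # about 2.6 m above the runway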
For those skilled in the art, the steps in Figure 8 may be skipped or reordered. For example, when the sensor 40 is used to measure at least one attitude angle (such as the pitch angle ρ, the heading angle α, or the roll angle γ), the measured roll angle γ can be used directly to rotate the original runway image (skipping step 210), and the measured pitch angle ρ and heading angle α can be used directly to compute the height (skipping steps 230 and 240). Using sensor data reduces the processor's workload and speeds up image processing.
Figures 9A-9B show an aircraft landing aid 20 with an orienting function. It ensures that the horizon in the runway image is always level, so that no γ correction of the runway image is needed, which simplifies the height calculation. Specifically, the aircraft landing aid (such as a mobile phone) 20 is placed in an orienter 19. The orienter 19 consists of a cradle 18, a weight 14, and a phone base 12. A bracket 17 is fixed to the aircraft 10, and the cradle 18 is supported on the bracket 17 by a ball bearing 16. Whether the aircraft 10 is level (Figure 9A) or has a pitch angle ρ (Figure 9B), the weight 14 ensures that the longitudinal axis of the phone 20 always lies along the direction of gravity (the z direction). The weight 14 preferably contains a metallic material so as to form a damper pair with the magnet 15, helping to stabilize the cradle 18.
It should be understood that the form and details of the present invention may be changed without departing from its spirit and scope, and such changes do not prevent application of the spirit of the invention. For example, although the embodiments herein are applied to fixed-wing aircraft, the invention can also be used in rotary-wing aircraft (such as helicopters) or unmanned aerial vehicles (UAVs). Accordingly, the invention should not be limited except in accordance with the spirit of the appended claims.

Claims (10)

  1. A vision-based aircraft landing assistance device, characterized by comprising:
    an image unit, the image unit acquiring at least one original runway image;
    a processing unit, the processing unit measuring, in a corrected runway image, characteristics of the extensions of the left and right runway edges and calculating the aircraft height (A) from said characteristics and the runway width (W), the corrected runway image being obtained by rotating the original runway image.
  2. The device according to claim 1, further characterized in that: said characteristics include the distance (Δ) between the intersections of the left and right runway-edge extensions with the main horizontal line, and the height (A) is calculated by the formula A = W*sin(ρ)/cos(α)/(Δ/f), where f is the focal length of the lens of the image unit, ρ is the pitch angle, and α is the heading angle.
  3. The device according to claim 1, further characterized in that: said characteristics include the angles (θA, θB) between the left and right runway-edge extensions and the main horizontal line, and the height (A) is calculated by the formula A = W*cos(ρ)/cos(α)/[cot(θA) - cot(θB)], where ρ is the pitch angle and α is the heading angle.
  4. The device according to claim 1, further characterized in that: said characteristics include the distance (XP) between the intersection of the left and right runway-edge extensions and the main horizontal line of the rotated runway image, and the pitch angle (ρ) is calculated by the formula ρ = atan(XP/f), where f is the focal length of the lens of the image unit.
  5. The device according to claim 1, further characterized in that: said characteristics include the distance (YP) between the intersection of the left and right runway-edge extensions and the main vertical line of the rotated runway image, and the heading angle (α) is calculated by the formula α = atan[(YP/f)*cos(ρ)], where f is the focal length of the lens of the image unit and ρ is the pitch angle.
  6. The device according to claim 1, further characterized in that: when the horizon of the original runway image is not horizontal, the processing unit rotates the original runway image to obtain the corrected runway image, the horizon of which is horizontal.
  7. The device according to claim 1, further characterized by comprising: a sensor, the sensor measuring at least one attitude angle.
  8. The device according to claim 1, further characterized by comprising: an orienting unit, the orienting unit ensuring that the original runway image has no roll angle.
  9. The device according to claim 1, further characterized in that: the aircraft is a fixed-wing aircraft, a rotary-wing aircraft, or an unmanned aerial vehicle.
  10. The device according to claim 1, further characterized in that: the device is a smartphone.
PCT/CN2013/080265 2013-02-21 2013-07-29 基于视觉的飞机降落辅助装置 WO2014127607A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361767792P 2013-02-21 2013-02-21
US61/767,792 2013-02-21

Publications (1)

Publication Number Publication Date
WO2014127607A1 true WO2014127607A1 (zh) 2014-08-28

Family

ID=51351825

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/080265 WO2014127607A1 (zh) 2013-02-21 2013-07-29 基于视觉的飞机降落辅助装置

Country Status (3)

Country Link
US (2) US20140236398A1 (zh)
CN (1) CN104006790A (zh)
WO (1) WO2014127607A1 (zh)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9208689B2 (en) 2011-08-19 2015-12-08 Aerovironment Inc. Deep stall aircraft landing
US9511858B2 (en) 2011-08-19 2016-12-06 Aerovironment, Inc. Inverted-landing aircraft
FR3018383B1 (fr) * 2014-03-07 2017-09-08 Airbus Operations Sas Procede et dispositif de determination de parametres de navigation d'un aeronef lors d'une phase d'atterrissage.
CN104503459A (zh) * 2014-11-25 2015-04-08 深圳市鸣鑫航空科技有限公司 多旋翼无人机回收系统
CN106448275B (zh) * 2014-12-30 2023-03-17 大连现代高技术集团有限公司 基于可视化的飞机泊位实时引导系统
WO2016187760A1 (en) * 2015-05-23 2016-12-01 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
US10104282B2 (en) * 2015-09-30 2018-10-16 Ricoh Co., Ltd. Yaw user interface
CN105513106B (zh) * 2015-12-05 2018-08-17 中国航空工业集团公司洛阳电光设备研究所 一种平显等角跑道符号绘制方法
CN114476105A (zh) 2016-08-06 2022-05-13 深圳市大疆创新科技有限公司 自动着陆表面地形评估以及相关的系统和方法
IL249870B (en) * 2016-12-29 2022-02-01 Israel Aerospace Ind Ltd Autonomous landing with the help of an image
CN106628211B (zh) * 2017-03-16 2019-02-26 山东大学 基于led点阵的地面引导式无人机飞行降落系统及方法
CN108540731A (zh) * 2018-04-17 2018-09-14 北京艾沃次世代文化传媒有限公司 实拍摄影机与虚拟场景实时同步显示方法
CN110456804A (zh) * 2018-05-07 2019-11-15 北京林业大学 一种手机app控制的飞行摄影测图方法
CN110968107A (zh) * 2019-10-25 2020-04-07 深圳市道通智能航空技术有限公司 一种降落控制方法、飞行器及存储介质
CN111220132B (zh) * 2019-11-13 2021-07-06 中国电子科技集团公司第二十研究所 一种基于图像匹配的飞行器离地高度测量方法
CN110796660B (zh) * 2020-01-04 2020-04-07 成都科睿埃科技有限公司 用于机场跑道的图像清晰度评价方法
CN112198902A (zh) * 2020-11-18 2021-01-08 普宙飞行器科技(深圳)有限公司 一种无人机降落控制方法、系统、存储介质及电子设备
CN112797982A (zh) * 2020-12-25 2021-05-14 中国航空工业集团公司沈阳飞机设计研究所 一种基于机器视觉的无人机自主着陆测量方法
CN113295164B (zh) * 2021-04-23 2022-11-04 四川腾盾科技有限公司 一种基于机场跑道的无人机视觉定位方法及装置
FR3122408B1 (fr) * 2021-04-29 2023-06-09 Airbus Sas Syteme et procede d’assistance d’approche d’aeroports
CN115019827B (zh) * 2021-09-15 2024-06-18 杭州爱华智能科技有限公司 飞机噪声的自动监测方法及飞机噪声的自动监测系统

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007080313A1 (fr) * 2006-01-11 2007-07-19 Airbus France Procede et dispositif d'aide au pilotage d'un aeronef lors d'une approche autonome
CN101109640A (zh) * 2006-07-19 2008-01-23 北京航空航天大学 基于视觉的无人驾驶飞机自主着陆导航系统
CN101976278A (zh) * 2010-09-29 2011-02-16 南京信息工程大学 基于虚拟现实技术的飞机降落辅助系统及方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL88263A (en) * 1988-11-02 1993-03-15 Electro Optics Ind Ltd Navigation system
GB2233527B (en) * 1989-06-23 1993-05-26 Marconi Gec Ltd Aircraft landing system
US5716032A (en) * 1996-04-22 1998-02-10 United States Of America As Represented By The Secretary Of The Army Unmanned aerial vehicle automatic landing system
US6157876A (en) * 1999-10-12 2000-12-05 Honeywell International Inc. Method and apparatus for navigating an aircraft from an image of the runway
FR2835314B1 (fr) * 2002-01-25 2004-04-30 Airbus France Procede de guidage d'un aeronef en phase finale d'atterrissage et dispositif correspondant

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007080313A1 (fr) * 2006-01-11 2007-07-19 Airbus France Procede et dispositif d'aide au pilotage d'un aeronef lors d'une approche autonome
CN101109640A (zh) * 2006-07-19 2008-01-23 北京航空航天大学 基于视觉的无人驾驶飞机自主着陆导航系统
CN101976278A (zh) * 2010-09-29 2011-02-16 南京信息工程大学 基于虚拟现实技术的飞机降落辅助系统及方法

Also Published As

Publication number Publication date
US20140236398A1 (en) 2014-08-21
CN104006790A (zh) 2014-08-27
US20150314885A1 (en) 2015-11-05

Similar Documents

Publication Publication Date Title
WO2014127607A1 (zh) 基于视觉的飞机降落辅助装置
JP7226487B2 (ja) 制御装置および制御方法
US20230106526A1 (en) Control device, control method, and computer program
US20220107643A1 (en) Control device, imaging device, control method, imaging method, and computer program
CN104298248B (zh) 旋翼无人机精确视觉定位定向方法
Thurrowgood et al. A biologically inspired, vision‐based guidance system for automatic landing of a fixed‐wing aircraft
Kim et al. Feasibility of employing a smartphone as the payload in a photogrammetric UAV system
CN106124517A (zh) 检测结构件表面裂缝的多旋翼无人机检测平台系统及其用于检测结构件表面裂缝的方法
CN104007766A (zh) 无人飞行器飞行控制方法及装置
CN205920057U (zh) 检测结构件表面裂缝的多旋翼无人机检测平台系统
CN111670339A (zh) 用于无人飞行器和地面载运工具之间的协作地图构建的技术
CN106647785B (zh) 无人机停机坪控制方法及装置
CN109839943A (zh) 一种基于rtk差分定位的无人机系统
JP7501535B2 (ja) 情報処理装置、情報処理方法、情報処理プログラム
CN112797982A (zh) 一种基于机器视觉的无人机自主着陆测量方法
WO2020062024A1 (zh) 基于无人机的测距方法、装置及无人机
CN104833338A (zh) 基于视觉的飞机降落辅助装置
KR102578260B1 (ko) 헬리캠과 시뮬레이터를 이용한 간접여행 시스템
WO2022202429A1 (ja) 情報処理装置、情報処理方法、プログラム及び情報処理システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13875431

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13875431

Country of ref document: EP

Kind code of ref document: A1