CN108225273B - Real-time runway detection method based on sensor priori knowledge - Google Patents

Real-time runway detection method based on sensor priori knowledge Download PDF

Info

Publication number
CN108225273B
CN108225273B
Authority
CN
China
Prior art keywords
runway
image
point
template
search area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611153162.4A
Other languages
Chinese (zh)
Other versions
CN108225273A (en)
Inventor
程岳
李亚晖
谢建春
张磊
文鹏程
白林亭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Aeronautics Computing Technique Research Institute of AVIC
Original Assignee
Xian Aeronautics Computing Technique Research Institute of AVIC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Aeronautics Computing Technique Research Institute of AVIC filed Critical Xian Aeronautics Computing Technique Research Institute of AVIC
Priority to CN201611153162.4A priority Critical patent/CN108225273B/en
Publication of CN108225273A publication Critical patent/CN108225273A/en
Application granted granted Critical
Publication of CN108225273B publication Critical patent/CN108225273B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/182 Network patterns, e.g. roads or rivers

Abstract

The invention belongs to the technical field of embedded computer image processing, and particularly relates to a real-time runway detection method based on sensor priori knowledge. Existing runway detection algorithms mainly target satellite images and downward-looking high-altitude aerial images, which are inconsistent with the forward-looking aerial images required by an airborne synthetic vision system; they also generally detect on static images, so the computation time is long and the real-time requirement cannot be met. The invention therefore provides a real-time runway detection algorithm based on sensor priori knowledge: the flight position, attitude and other information provided by the airborne sensors are used to greatly reduce the search area for runway detection, so that the runway position in the image is detected quickly and accurately.

Description

Real-time runway detection method based on sensor priori knowledge
Technical Field
The invention belongs to the technical field of embedded computer image processing, and particularly relates to a real-time runway detection method based on sensor priori knowledge.
Background
With the development of airborne avionics technology, many flight assistance systems have emerged. The enhanced vision system improves the pilot's perception of the external environment and enhances flight safety, and runway detection techniques play an important role in such systems. However, conventional runway detection methods mainly target satellite images and downward-looking high-altitude aerial images, which are inconsistent with the forward-looking aerial images required by an airborne synthetic vision system; moreover, they generally detect on static images, so the computation time is long and the real-time requirement cannot be met. A real-time and efficient runway detection method that can accurately and quickly detect the runway in a real-time image is therefore needed.
The invention provides a real-time runway detection method based on sensor priori knowledge to solve the problems.
Disclosure of Invention
The purpose of the invention is:
the runway detection problem in the airborne real-time image is solved.
The technical solution of the invention is as follows:
Most prior runway detection methods treat the runway as one of many targets and adopt general target-recognition methods, with two main shortcomings: 1) the computational complexity is high, because the searching, matching and other stages of the algorithm are carried out over the whole image; 2) in template-based methods, template acquisition is either inefficient or inaccurate: templates are mostly obtained by manual calibration, which is inefficient, or from a template library, which is not accurate enough.
In an airborne system, however, a great deal of prior knowledge is available, such as the camera attitude, the camera position and the imaging parameters. This information can be used to generate the template shape of the runway and to locate the template position in the image, thereby narrowing the search range and reducing the computation.
In the enhanced vision system, there are different kinds of sensors that provide flight information including attitude, position, etc. The aircraft attitude sensors can provide the pitch, roll and yaw angles of the aircraft, and the GPS can provide the coordinates of the aircraft in the three-dimensional world. In addition, the parameters of the onboard camera and the coordinates of the corner points of the airport runway are also fixed and known when taking images.
In summary, the algorithm provided in this patent first calculates the ideal template shape and position of the airport runway, so as to determine a search range and reduce the computation required for runway detection. The specific calculation process is as follows:
a real-time runway detection method based on sensor priori knowledge comprises the following steps:
The first step is as follows: suppose a runway corner point P_W = (x_W, y_W, z_W)^T in the world coordinate system needs to be converted to a point P_C = (x_C, y_C, z_C)^T in the camera coordinate system. The conversion requires one rotation R_W^C plus one translation T_W, and the corner point position in the camera coordinate system is obtained as:

P_C = R_W^C (P_W - T_W)

where T_W can be obtained directly from the GPS data:

T_W = (x_W, y_W, z_W)^T

The calculation of the rotation matrix R_W^C is slightly more involved. It is determined by the three attitude angles of the aircraft (roll φ, pitch θ and yaw ψ); the complete rotation is accomplished after three successive single-axis rotations, so R_W^C is obtained as the product of three rotation matrices:

R_W^C = R(φ) R(θ) R(ψ)
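As an illustration of this first step, a minimal Python sketch is given below. It is only a sketch under stated assumptions: the single-axis composition order (roll about x, pitch about y, yaw about z, applied as Rx·Ry·Rz) and the reading of T_W as the aircraft position reported by GPS are assumptions, since the exact conventions are not fixed in the text.

```python
import numpy as np

def rotation_world_to_camera(roll, pitch, yaw):
    """Rotation matrix R_W^C built from the aircraft attitude angles (radians).
    The composition order Rx(roll) @ Ry(pitch) @ Rz(yaw) is an assumption; the
    text only states that R_W^C is a product of three single-axis rotations."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    return Rx @ Ry @ Rz

def world_to_camera(P_W, T_W, roll, pitch, yaw):
    """P_C = R_W^C (P_W - T_W): runway corner P_W (world frame) expressed in the
    camera frame, with T_W the aircraft position taken from the GPS data."""
    R = rotation_world_to_camera(roll, pitch, yaw)
    return R @ (np.asarray(P_W, dtype=float) - np.asarray(T_W, dtype=float))
```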
the second step is that: according to the imaging principle of the camera and the internal and external parameters of the shooting camera, the coordinate position of the runway corner point on the final image is calculated, and the specific imaging calculation process is as follows according to the equivalent geometric relationship;
one point P in the target plane is projected to the point P' of the imaging plane and finally converges with other points at the focusPolymerizing; p calculated from the first stepC=(xC,yC,zC)TFrom the similarity relationship of triangles:
Figure GDA0002478801930000025
wherein x isimgAnd yimgIs the coordinate of the target point on the imaging plane, PsIs the pixel side length of the imaging plane, H0And W0The number of pixel points in the height direction and the width direction of the imaging plane respectively; thereby solving for ximgAnd yimg
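A short sketch of this projection step follows; placing the principal point at the image centre (W_0/2, H_0/2) and treating f as the known focal length from the intrinsic parameters are assumptions of the sketch.

```python
def project_to_image(P_C, f, Ps, W0, H0):
    """Project a camera-frame point P_C = (x_C, y_C, z_C) onto the image plane by
    similar triangles, then convert to pixel coordinates using the pixel side
    length Ps; the principal point is assumed to lie at the image centre."""
    x_C, y_C, z_C = P_C
    x_img = (f * x_C) / (z_C * Ps) + W0 / 2.0
    y_img = (f * y_C) / (z_C * Ps) + H0 / 2.0
    return x_img, y_img
```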
The third step: determining the shape and position of a runway template according to the coordinate position of the runway corner point on the image, and defining a search area according to the shape and position of the runway template; the specific calculation process is as follows:
respectively calculating coordinates of four corner points of the airport runway on the final image, and sequentially connecting the coordinates corresponding to the four corner points to obtain an airport runway template in an ideal state; in practical application, data obtained by an airplane attitude sensor and a GPS have certain errors, a search area is defined outside the calculated position of an airport runway template, and two parameters HE and WE are used for controlling the size of the search area, wherein the HE and the WE respectively take 1/2 lengths of the height and the width of the runway template;
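The search-area construction can be sketched as follows; using the axis-aligned bounding box of the projected corners as the template extent is an assumption of the sketch.

```python
def search_area(corners_img):
    """Bounding box of the four projected runway corners, enlarged by HE and WE
    (each 1/2 of the template height and width).  Returns (x0, y0, x1, y1)."""
    xs = [c[0] for c in corners_img]
    ys = [c[1] for c in corners_img]
    WE = 0.5 * (max(xs) - min(xs))   # half of the template width
    HE = 0.5 * (max(ys) - min(ys))   # half of the template height
    return (min(xs) - WE, min(ys) - HE, max(xs) + WE, max(ys) + HE)
```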
the fourth step: in the search area, carrying out straight line extraction on the image to obtain a straight line image with more obvious characteristics; specifically, the LSD algorithm is adopted for straight line extraction, and the steps are as follows: 1) calculating image gradient; 2) pseudo-ordering of gradients; 3) growing or merging gradient neighborhoods; 4) fitting a straight line;
the fifth step: matching the runway in the search area by using the generated line graph and the runway template, and finally calculating the position of the runway; specifically, a directional Chamfer Matching method is adopted to match a runway in a search area, and the calculation formula is as follows:
Figure GDA0002478801930000031
wherein U is { U ═iIs the edge line image of the template, V ═ VjThe directional Chamfer distance d between U and V is the edge straight line image of the image to be detectedDCN(U, V) is defined as the average distance of each point in U to the nearest edge in V, [ phi ] (x) represents the edge direction at the edge point x, and [ lambda ] is the weight parameter between the position term and the direction term; dDCMAnd the position with the minimum (U, V) is the final search result.
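A brute-force sketch of this distance is shown below; edge points are given as (x, y, direction) triples, and the O(|U|·|V|) loop stands in for the optimised distance-transform implementation an embedded system would normally use.

```python
import numpy as np

def directional_chamfer_distance(U, V, lam=1.0):
    """d_DCM(U, V): for each template edge point u_i, the smallest combined
    position-plus-direction cost to any image edge point v_j, averaged over U.
    U, V: arrays of shape (n, 3) and (m, 3) with rows (x, y, phi);
    lam: weight between the position term and the direction term."""
    U, V = np.asarray(U, dtype=float), np.asarray(V, dtype=float)
    pos = np.linalg.norm(U[:, None, :2] - V[None, :, :2], axis=2)   # ||u_i - v_j||
    ang = np.abs(U[:, None, 2] - V[None, :, 2]) % np.pi
    ang = np.minimum(ang, np.pi - ang)                              # undirected angle difference
    return float((pos + lam * ang).min(axis=1).mean())
```

Sliding the template over the candidate positions inside the search area and keeping the position with the smallest d_DCM gives the detection result.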
The invention has the advantages that:
the algorithm aims at the forward-looking aerial image of the aircraft in the approach landing stage, and is convenient to apply to an airborne enhanced vision system. The algorithm can pre-judge the shape of the runway template and the position of the runway in the airborne real-time image by using the prior knowledge provided by the airborne sensor, thereby reducing the search area, and only matching the runway in the search area, thereby greatly improving the detection accuracy, reducing the calculation amount and achieving the effect of accurately detecting the runway in real time.
Drawings
Fig. 1 is a schematic block diagram of the present invention, and the algorithm is divided into five steps.
Fig. 2 is a simplified schematic diagram of the imaging principle of the camera in the present invention.
Fig. 3 is a schematic diagram illustrating the definition of a search area according to a runway template in the present invention.
Detailed Description
In an airborne enhanced synthetic vision system, runway information in the forward-looking visible-light video needs to be detected and identified in real time. Taking such a system as an example, the method described in this patent is applied as follows. The airborne enhanced synthetic vision system has one video input channel and one video output channel, and the runway detection steps for each input image frame are as follows:
The first step is as follows: suppose a runway corner point P_W = (x_W, y_W, z_W)^T in the world coordinate system needs to be converted to a point P_C = (x_C, y_C, z_C)^T in the camera coordinate system. The conversion requires one rotation R_W^C plus one translation T_W, and the corner point position in the camera coordinate system is obtained as:

P_C = R_W^C (P_W - T_W)

where T_W can be obtained directly from the GPS data:

T_W = (x_W, y_W, z_W)^T

The calculation of the rotation matrix R_W^C is slightly more involved. It is determined by the three attitude angles of the aircraft (roll φ, pitch θ and yaw ψ); the complete rotation is accomplished after three successive single-axis rotations, so R_W^C is obtained as the product of three rotation matrices:

R_W^C = R(φ) R(θ) R(ψ)
the second step is that: according to the imaging principle of the camera and the internal and external parameters of the shooting camera, the coordinate position of the runway corner point on the final image is calculated, and the specific imaging calculation process is as follows according to the equivalent geometric relationship;
projecting a point P in the target plane to a point P' of the imaging plane, and finally converging the point P with other points at the focus; p calculated from the first stepC=(xC,yC,zC)TFrom the similarity relationship of triangles:
Figure GDA0002478801930000045
wherein x isimgAnd yimgIs the coordinate of the target point on the imaging plane, PsIs the pixel side length of the imaging plane, H0And W0The number of pixel points in the height direction and the width direction of the imaging plane respectively; thereby solving for ximgAnd yimg
The third step: determining the shape and position of a runway template according to the coordinate position of the runway corner point on the image, and defining a search area according to the shape and position of the runway template; the specific calculation process is as follows:
respectively calculating coordinates of four corner points of the airport runway on the final image, and sequentially connecting the coordinates corresponding to the four corner points to obtain an airport runway template in an ideal state; in practical application, data obtained by an airplane attitude sensor and a GPS have certain errors, a search area is defined outside the calculated position of an airport runway template, and two parameters HE and WE are used for controlling the size of the search area, wherein the HE and the WE respectively take 1/2 lengths of the height and the width of the runway template;
the fourth step: in the search area, carrying out straight line extraction on the image to obtain a straight line image with more obvious characteristics; specifically, the LSD algorithm is adopted for straight line extraction, and the steps are as follows: 1) calculating image gradient; 2) pseudo-ordering of gradients; 3) growing or merging gradient neighborhoods; 4) fitting a straight line;
the fifth step: matching the runway in the search area by using the generated line graph and the runway template, and finally calculating the position of the runway; specifically, a directional Chamfer Matching method is adopted to match a runway in a search area, and the calculation formula is as follows:
Figure GDA0002478801930000051
wherein U is { U ═iIs the edge line image of the template, V ═ VjThe directional Chamfer distance d between U and V is the edge straight line image of the image to be detectedDCM(U, V) is defined as the average distance of each point in U to the nearest edge in V, [ phi ] (x) represents the edge direction at the edge point x, and [ lambda ] is the weight parameter between the position term and the direction term; dDCM(U, V) the minimum position is the final search result; and marking the corresponding position of the input image with a markNote that the image is output as an output image.
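To tie the five steps together, the per-frame processing can be summarised by the sketch below. It reuses the illustrative helper functions defined earlier; the 'sensors' and 'cam' containers for the attitude/GPS readings and the camera parameters are hypothetical names introduced only for this example.

```python
def detect_runway_in_frame(frame_bgr, sensors, runway_corners_world, cam):
    """Illustrative per-frame pipeline: sensor prior -> runway template ->
    search area -> line extraction -> (directional Chamfer matching).
    'sensors' (roll, pitch, yaw, gps_position) and 'cam' (f, Ps, W0, H0) are
    hypothetical parameter containers, not part of the original description."""
    # Steps 1-2: project the four known runway corners into the image.
    corners_img = []
    for P_W in runway_corners_world:
        P_C = world_to_camera(P_W, sensors.gps_position,
                              sensors.roll, sensors.pitch, sensors.yaw)
        corners_img.append(project_to_image(P_C, cam.f, cam.Ps, cam.W0, cam.H0))
    # Step 3: ideal template plus an error margin defines the search area.
    roi = search_area(corners_img)
    # Step 4: extract line segments only inside the search area.
    lines = extract_lines(frame_bgr, roi)
    # Step 5 (omitted here): sample edge points from the template and from the
    # extracted lines, then minimise directional_chamfer_distance over the
    # candidate positions inside the search area.
    return corners_img, roi, lines
```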

Claims (1)

1. A real-time runway detection method based on sensor priori knowledge comprises the following steps:
the first step is as follows: suppose a runway corner point P_W = (x_W, y_W, z_W)^T in the world coordinate system needs to be converted to a point P_C = (x_C, y_C, z_C)^T in the camera coordinate system; the conversion requires one rotation R_W^C plus one translation T_W, and the corner point position in the camera coordinate system is obtained as:

P_C = R_W^C (P_W - T_W)

where T_W can be obtained directly from the GPS data:

T_W = (x_W, y_W, z_W)^T

and the rotation matrix R_W^C is determined by the three attitude angles of the aircraft (roll φ, pitch θ and yaw ψ); the complete rotation is accomplished after three successive single-axis rotations, so R_W^C is obtained as the product of three rotation matrices:

R_W^C = R(φ) R(θ) R(ψ)
the second step is that: according to the imaging principle of the camera and the internal and external parameters of the shooting camera, the coordinate position of the runway corner point on the final image is calculated, and the specific imaging calculation process is as follows according to the equivalent geometric relationship;
projecting a point P in the target plane to a point P' of the imaging plane, and finally converging the point P with other points at the focus; p calculated from the first stepC=(xC,yC,zC)TFrom the similarity relationship of triangles:
Figure FDA0002478801920000016
wherein x isimgAnd yimgIs the coordinate of the target point on the imaging plane, PsIs the pixel side length of the imaging plane, H0And W0The number of pixel points in the height direction and the width direction of the imaging plane respectively; thereby solving for ximgAnd yimg
The third step: determining the shape and position of a runway template according to the coordinate position of the runway corner point on the image, and defining a search area according to the shape and position of the runway template; the specific calculation process is as follows:
respectively calculating coordinates of four corner points of the airport runway on the final image, and sequentially connecting the coordinates corresponding to the four corner points to obtain an airport runway template in an ideal state; in practical application, data obtained by an airplane attitude sensor and a GPS have certain errors, a search area is defined outside the calculated position of an airport runway template, and two parameters HE and WE are used for controlling the size of the search area, wherein the HE and the WE respectively take 1/2 lengths of the height and the width of the runway template;
the fourth step: in the search area, carrying out straight line extraction on the image to obtain a straight line image with more obvious characteristics; specifically, the LSD algorithm is adopted for straight line extraction, and the steps are as follows: 1) calculating image gradient; 2) pseudo-ordering of gradients; 3) growing or merging gradient neighborhoods; 4) fitting a straight line;
the fifth step: matching the runway in the search area by using the generated line graph and the runway template, and finally calculating the position of the runway; specifically, a directional Chamfer Matching method is adopted to match a runway in a search area, and the calculation formula is as follows:
Figure FDA0002478801920000021
wherein U is { U ═iIs the edge line image of the template, V ═ VjThe directional Chamfer distance d between U and V is the edge straight line image of the image to be detectedDCM(U, V) is defined as the average distance of each point in U to the nearest edge in V, [ phi ] (x) represents the edge direction at the edge point x, and [ lambda ] is the weight parameter between the position term and the direction term; dDCMAnd the position with the minimum (U, V) is the final search result.
CN201611153162.4A 2016-12-14 2016-12-14 Real-time runway detection method based on sensor priori knowledge Active CN108225273B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611153162.4A CN108225273B (en) 2016-12-14 2016-12-14 Real-time runway detection method based on sensor priori knowledge

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611153162.4A CN108225273B (en) 2016-12-14 2016-12-14 Real-time runway detection method based on sensor priori knowledge

Publications (2)

Publication Number Publication Date
CN108225273A CN108225273A (en) 2018-06-29
CN108225273B true CN108225273B (en) 2020-06-30

Family

ID=62639019

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611153162.4A Active CN108225273B (en) 2016-12-14 2016-12-14 Real-time runway detection method based on sensor priori knowledge

Country Status (1)

Country Link
CN (1) CN108225273B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109238265B (en) * 2018-07-20 2020-08-11 民航中南空管设备工程公司 Airport runway position measuring method
CN109341685B (en) * 2018-12-04 2023-06-30 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
CN109341700B (en) * 2018-12-04 2023-06-30 中国航空工业集团公司西安航空计算技术研究所 Visual auxiliary landing navigation method for fixed-wing aircraft under low visibility
CN110274622B (en) * 2019-06-13 2021-08-27 北京环奥体育发展有限公司 Method for calculating difficulty degree of track during marathon competition by field measurement and calculation
CN113295164B (en) * 2021-04-23 2022-11-04 四川腾盾科技有限公司 Unmanned aerial vehicle visual positioning method and device based on airport runway

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101777181A (en) * 2010-01-15 2010-07-14 西安电子科技大学 Ridgelet bi-frame system-based SAR image airfield runway extraction method
WO2013051967A2 (en) * 2011-08-31 2013-04-11 Kirillov Andrey Porfir Evich Method for visual landing and kirillov device for visualizing takeoff or landing of an aircraft
CN103162669A (en) * 2013-03-01 2013-06-19 西北工业大学 Detection method of airport area through aerial shooting image
CN103577697A (en) * 2013-11-12 2014-02-12 中国民用航空总局第二研究所 FOD detection method based on road surface point cloud data
CN105302146A (en) * 2014-07-25 2016-02-03 空中客车运营简化股份公司 Method and system for automatic autonomous landing of an aircraft

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5858741B2 (en) * 2011-11-15 2016-02-10 キヤノン株式会社 Automatic tracking camera system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101777181A (en) * 2010-01-15 2010-07-14 西安电子科技大学 Ridgelet bi-frame system-based SAR image airfield runway extraction method
WO2013051967A2 (en) * 2011-08-31 2013-04-11 Kirillov Andrey Porfir Evich Method for visual landing and kirillov device for visualizing takeoff or landing of an aircraft
CN103162669A (en) * 2013-03-01 2013-06-19 西北工业大学 Detection method of airport area through aerial shooting image
CN103577697A (en) * 2013-11-12 2014-02-12 中国民用航空总局第二研究所 FOD detection method based on road surface point cloud data
CN105302146A (en) * 2014-07-25 2016-02-03 空中客车运营简化股份公司 Method and system for automatic autonomous landing of an aircraft

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Runway incursion prevention systems: A review of runway incursion avoidance and alerting system approaches; J. Schonefeld et al.; Progress in Aerospace Sciences; 2012-04-07; No. 51; full text *
Vision-Based Road-Following Using a Small Autonomous Aircraft; Eric Frew et al.; 2004 IEEE Aerospace Conference Proceedings; 2004-12-31; Vol. 5; full text *
Real-time airport runway detection by extracting straight-line features; Di Nan et al.; Optics and Precision Engineering; 2009-09-30; Vol. 17, No. 9; full text *

Also Published As

Publication number Publication date
CN108225273A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108225273B (en) Real-time runway detection method based on sensor priori knowledge
CN111326023A (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
CN107390704B (en) IMU attitude compensation-based multi-rotor unmanned aerial vehicle optical flow hovering method
CN103559711A (en) Motion estimation method based on image features and three-dimensional information of three-dimensional visual system
CN109724586B (en) Spacecraft relative pose measurement method integrating depth map and point cloud
CN106500699B (en) A kind of position and orientation estimation method suitable for Autonomous landing in unmanned plane room
Xu et al. Use of land’s cooperative object to estimate UAV’s pose for autonomous landing
CN114415736B (en) Multi-stage visual accurate landing method and device for unmanned aerial vehicle
CN107831776A (en) Unmanned plane based on nine axle inertial sensors independently makes a return voyage method
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
CN104422425A (en) Irregular-outline object space attitude dynamic measuring method
Fan et al. Vision algorithms for fixed-wing unmanned aerial vehicle landing system
CN109978954A (en) The method and apparatus of radar and camera combined calibrating based on cabinet
CN107886541B (en) Real-time monocular moving target pose measuring method based on back projection method
Yan et al. SensorX2car: Sensors-to-car calibration for autonomous driving in road scenarios
Zhuang et al. Method of pose estimation for UAV landing
Zhigui et al. Review on vision-based pose estimation of UAV based on landmark
Zhang et al. Tracking and position of drogue for autonomous aerial refueling
Kim et al. Target detection and position likelihood using an aerial image sensor
CN112577463B (en) Attitude parameter corrected spacecraft monocular vision distance measuring method
CN105389819A (en) Robust semi-calibrating down-looking image epipolar rectification method and system
Zhang et al. Vision-based UAV Positioning Method Assisted by Relative Attitude Classification
CN113295171A (en) Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN109754412B (en) Target tracking method, target tracking apparatus, and computer-readable storage medium
Zhai et al. Target detection of low-altitude UAV based on improved YOLOv3 network

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant