CN110751860A - Systems, methods, and computer-readable media for autonomous airport runway navigation

Info

Publication number
CN110751860A
Authority
CN
China
Prior art keywords: aircraft, sensor, runway, sensor data, computing system
Legal status: Granted
Application number: CN201910652835.8A
Other languages: Chinese (zh)
Other versions: CN110751860B (en)
Inventor
S·丹姆
D·D·马格尼特
N·S·埃文斯
T·C·斯泰丁格
B·K·鲁普尼克
M·A·莫泽
K·S·卡拉汉
B·T·怀特海德
Current Assignee
Boeing Co
Original Assignee
Boeing Co
Application filed by Boeing Co
Publication of CN110751860A
Application granted
Publication of CN110751860B
Legal status: Active

Classifications

    • G08G 5/0021: Traffic control systems for aircraft [ATC]; arrangements for generating, displaying, acquiring or managing traffic information, located in the aircraft
    • G08G 5/065: Traffic control systems for aircraft; control when on the ground; navigation or guidance aids, e.g. for taxiing or rolling
    • G01C 21/20: Navigation; instruments for performing navigational calculations
    • G01C 23/00: Combined instruments indicating more than one navigational value, e.g. for aircraft
    • G05D 1/0038: Control of position, course, altitude or attitude of vehicles, associated with a remote control arrangement providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G05D 1/0083: Control of position, course, altitude or attitude of vehicles, to help an aircraft pilot in the rolling phase
    • G05D 1/0202: Control of position or course in two dimensions, specially adapted to aircraft
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to systems, methods, and computer-readable media for autonomous airport runway navigation. An example system includes first and second sensors coupled to an aircraft at first and second locations, respectively, and a computing system configured to receive sensor data from one or both of the first and second sensors to detect airport markings located near a runway. The computing system is further configured to identify a centerline of the runway based on the airport markings and to receive sensor data from both the first sensor and the second sensor to determine a lateral displacement representing a distance between a reference point of the aircraft and the centerline of the runway. The computing system is further configured to provide control instructions that indicate an adjustment for aligning the reference point of the aircraft with the centerline of the runway during a subsequent navigation of the aircraft.

Description

Systems, methods, and computer-readable media for autonomous airport runway navigation
Background
An airport is a complex that typically includes buildings and a series of ground paths and runways that enable an aircraft to navigate between the buildings in order to accommodate passengers before takeoff or after completing a landing. Various types of aircraft are typically located at an airport, some parked to board passengers and others using runways to initiate flight or to land. Because a large number of aircraft are present at an airport at any given time, and high-speed aircraft travel on runways during landing and takeoff, it is important to control an aircraft safely while it navigates the different ground paths, taxiways, and runways.
During airport ground navigation, aircraft are typically controlled by pilots, who may rely on airport markings for guidance. Airport markings, such as signs located near ground paths and the various surface lines painted on runways, taxiways, and ground paths, provide directions and indications that can help a pilot determine the proper course and speed for navigating the aircraft. For example, in navigating a parked aircraft from a passenger boarding bridge to a runway in preparation for takeoff, a pilot may follow airport markings, including centerlines and boundaries located on the paths and the runway.
Disclosure of Invention
In one example, a system is described. The system includes a first sensor coupled to the aircraft at a first location, a second sensor coupled to the aircraft at a second location, and a computing system. The computing system is configured to receive a first set of sensor data including sensor data from one or both of the first sensor and the second sensor and detect one or more airport signs located near the runway using the first set of sensor data. The aircraft is located on a runway. The computing system is also configured to identify a centerline of the runway based on the one or more airport markings and receive a second set of sensor data including sensor data from both the first sensor and the second sensor. The computing system is also configured to determine a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the second set of sensor data. The lateral displacement is determined based on a comparison of a first position of the reference point relative to the centerline of the runway as represented in the sensor data from the first sensor and a second position of the reference point relative to the centerline of the runway as represented in the sensor data from the second sensor. The computing system is also configured to provide control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft.
In another example, a method is described. The method includes receiving, at a computing system, a first set of sensor data. The first set of sensor data includes sensor data from one or both of a first sensor coupled to the aircraft at a first location and a second sensor coupled to the aircraft at a second location. The method also includes detecting, by the computing system, one or more airport signs located near the runway using the first set of sensor data. The aircraft is located on a runway. The method includes identifying a centerline of the runway based on one or more airport signs and receiving a second set of sensor data at the computing system. The second set of sensor data includes sensor data from both the first sensor and the second sensor. The method also includes determining, by the computing system, a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the second set of sensor data. The lateral displacement is determined based on a comparison of a first position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the first sensor and a second position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the second sensor. The method also includes providing, by the computing system, control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft.
In another example, a non-transitory computer-readable medium is described. The non-transitory computer-readable medium is configured to store instructions that, when executed by a computing system, enable the computing system to perform operations. The operations include receiving a first set of sensor data. The first set of sensor data includes sensor data from one or both of a first sensor coupled to the aircraft at a first location and a second sensor coupled to the aircraft at a second location. The operations also include detecting one or more airport signs located near the runway using the first set of sensor data. The aircraft is located on a runway. The operations include identifying a centerline of the runway based on the one or more airport signs and receiving a second set of sensor data. The second set of sensor data includes sensor data from both the first sensor and the second sensor. The operations include determining, using the second set of sensor data, a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway. The lateral displacement is determined based on a comparison of a first position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the first sensor and a second position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the second sensor. The operations also include providing control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft.
The features, functions, and advantages that have been discussed can be achieved independently in various examples or may be combined in yet other examples, further details of which can be seen with reference to the following description and drawings.
Drawings
The novel features believed characteristic of the illustrative examples are set forth in the appended claims. The illustrative examples, however, as well as a preferred mode of use and further objectives and advantages thereof, will best be understood by reference to the following detailed description of the illustrative examples of the present disclosure when read in conjunction with the accompanying drawings, wherein:
FIG. 1 is a block diagram depicting a system according to an example embodiment.
FIG. 2 illustrates a system coupled to an aircraft, according to an example embodiment.
FIG. 3 shows another view of a system coupled to an aircraft, according to an example embodiment.
FIG. 4 illustrates a first perspective of a reference point of an aircraft corresponding to a first sensor coupled to the aircraft, according to an example embodiment.
FIG. 5 illustrates a second perspective of a reference point of the aircraft corresponding to a second sensor coupled to the aircraft, according to an example embodiment.
FIG. 6 illustrates a reference distance analysis using a first sensor coupled to an aircraft, according to an example embodiment.
FIG. 7 illustrates determining a lateral displacement between a reference point of an aircraft and a centerline of a runway, according to an example embodiment.
FIG. 8 illustrates detection of airport markings located near a runway using a set of sensor data, according to an example embodiment.
FIG. 9 illustrates identification of an intersection of a runway, according to an example embodiment.
FIG. 10 illustrates detection of a gap in the centerline of a runway, according to an example embodiment.
FIG. 11 illustrates identification of runway boundaries based on airport markings, according to an example embodiment.
FIG. 12 illustrates sensor locations on an aircraft, according to an example embodiment.
FIG. 13 shows a flowchart of a method, according to an example embodiment.
FIG. 14 shows a flowchart of another method for use with the method shown in FIG. 13, according to an example embodiment.
FIG. 15 shows a flowchart of another method for use with the method shown in FIG. 13, according to an example embodiment.
FIG. 16 shows a flowchart of another method for use with the method shown in FIG. 13, according to an example embodiment.
FIG. 17 shows a flowchart of another method for use with the method shown in FIG. 13, according to an example embodiment.
Detailed Description
The disclosed examples will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, of the disclosed examples are shown. Indeed, several different examples may be provided, and this disclosure should not be construed as limited to the examples set forth herein. Rather, these examples are described so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those skilled in the art.
Ground navigation of an aircraft at an airport or similar venue typically involves the pilot controlling the aircraft using guidance from airport markings, such as signs and the various surface lines drawn along available ground paths, taxiways, and runways. The pilot may observe and control the aircraft based on signs, path boundaries, and surface markings located on or along the ground path, while also identifying potential obstacles in the aircraft's environment (e.g., other aircraft, vehicles, and people). Various signs and other airport markings may serve as waypoints that help the pilot identify the overall location of the aircraft within the airport. For example, based on information provided by signs and other airport markings, a pilot may determine that the aircraft is navigating on a path toward a building for unloading passengers or is approaching a particular runway. Thus, although a pilot may proficiently and safely navigate an aircraft on the ground between flights, ground navigation of an aircraft relies on the skills and decisions of the pilot observing the surrounding environment.
Example embodiments presented herein describe systems, methods, and computer-readable media for autonomous airport runway navigation, such as autonomous operation of an aircraft on a runway. Some examples relate to developing and providing information to pilots controlling aircraft on the ground in a manner that may assist the pilots in ground navigation in an airport or similar environment. For example, the system may include sensors configured to obtain sensor data representative of a forward environment of the aircraft and a computing system that processes the sensor data to develop and provide the sensor data in a format that may assist the pilot. In some cases, the system may use the sensor data to form control instructions that include adjustments for centering the aircraft along a desired path during subsequent navigation. For example, the control instructions may specify an adjustment that modifies the orientation of the aircraft to align the aircraft with the center of the path (e.g., the centerline of the runway).
Some examples involve using sensor data to develop control instructions that may enable an aircraft to navigate in a semi-autonomous or fully autonomous mode. When the control instructions enable semi-autonomous operation of the aircraft during ground navigation, the system may be configured to enable the pilot to maintain control of certain aspects of navigation while the control instructions cause the systems of the aircraft to perform other control operations. For example, the control instructions may adjust the orientation of the aircraft during ground navigation to keep it centered on the ground path and avoid off-path navigation, while enabling the pilot to control speed, braking, and other functions.
When the control instructions enable a fully autonomous mode during ground navigation, the systems of the aircraft may use the control instructions developed by the system to safely navigate the aircraft without input from the pilot. The system may be configured to detect runways and paths, and further to navigate the aircraft according to the airport markings while operating in the fully autonomous mode. In both the semi-autonomous and fully autonomous modes of ground navigation, the system may be configured to enable the pilot to assume control and override the automatic control of the aircraft.
To detect the desired path and understand the environment, the example system may capture sensor data of airport markings and objects in the navigation path of the aircraft using various sensors, such as cameras (e.g., thermal imaging cameras, RGB cameras), radar, light detection and ranging (LIDAR), or a combination of different sensors. The type, location, and number of sensors used may vary from example to example. One or more sensors may be coupled to the aircraft at locations capable of measuring areas near the aircraft, such as the forward environment that includes the ground path over which the aircraft is traveling. To illustrate an example layout, a system for ground navigation of an aircraft may include a first sensor coupled to a first wing of the aircraft and a second sensor coupled to a second wing of the aircraft. In this layout, both sensors may be oriented to measure the forward path of the aircraft, such that a computing system within the system can perform computer vision processing on the obtained sensor data in order to estimate the positions of objects, people, and airport markings (e.g., signs, surface lines) relative to the aircraft.
The sensors may be arranged in various ways to obtain sensor data for a particular area of the aircraft's environment. For example, one or more sensors may have a forward orientation in order to measure the ground path and the overall environment located in front of the aircraft. Such forward sensors may be configured to capture sensor measurements that include a portion of the aircraft to be used as a reference point. In particular, a computing system processing the sensor measurements may use that portion of the aircraft as a reference point (e.g., the front wheels of the aircraft) when analyzing sensor data to estimate the positions of objects (e.g., aircraft, vehicles), people, and airport markings relative to the aircraft.
The computing system may then use the estimated locations of objects, airport markings, and the like when determining control instructions for subsequent ground navigation of the aircraft. For example, one or more cameras coupled to a wing or another part of the aircraft (e.g., the fuselage) may provide images that include the nose and/or nose wheel of the aircraft and the surrounding area. The computing system processing the images may then use the nose or nose wheel as a reference point in determining subsequent control instructions for navigating the aircraft along the ground path (e.g., instructions to align the nose or nose wheel with the center of the ground path).
When initially formulating control instructions, the computing system may receive and use the sensor data to obtain a general understanding of the aircraft's environment. In particular, the computing system uses sensor data to detect nearby airport markings such as signs, surface lines, changes in the materials that make up the path (e.g., boundaries between concrete and grass), intersections between multiple paths (e.g., an intersection between two runways), objects (e.g., aircraft, vehicles), and people in the aircraft's environment. The computing system may then use the surface markings to determine a navigation path while also performing obstacle avoidance based on the sensor data. For example, the computing system may detect and identify the runway and the corresponding centerline of the runway using the airport markings detected within the sensor data. In some examples, the computing system may use sensor data from one or more sensors to detect and identify path edge geometries (e.g., the edge geometry of a runway) to define the boundaries of a path when no markings are visible (e.g., due to faded markings, blank portions of the path, or challenging lighting conditions).
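For illustration only, the following sketch shows one way the marking-detection step could be realized, assuming the first set of sensor data is a forward-facing camera image and the centerline appears as a bright painted stripe. The disclosure does not prescribe a particular algorithm, and the function name and all thresholds here are assumptions.

```python
# Illustrative centerline-candidate detector (not part of the disclosure):
# threshold bright paint, extract edges, and keep near-vertical Hough
# segments, which in a forward view run toward the horizon.
import cv2
import numpy as np

def detect_centerline_segments(image_bgr):
    """Return candidate centerline segments as (x1, y1, x2, y2) tuples."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Painted runway markings are typically brighter than the pavement.
    _, bright = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    edges = cv2.Canny(bright, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                               minLineLength=60, maxLineGap=20)
    if segments is None:
        return []
    keep = []
    for x1, y1, x2, y2 in segments[:, 0]:
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if 60 <= angle <= 120:  # roughly aligned with the direction of travel
            keep.append((x1, y1, x2, y2))
    return keep
```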
Additionally, measuring sensor data of airport signs may enable the computing system to determine the position of the aircraft within the overall environment of the airport. The computing system may use computer vision or other sensor-data processing techniques to detect, identify, and use signs and other airport markings as waypoints, in a manner similar to a pilot. For example, the computing system may use the identified signs and airport markings to determine that the aircraft is located near a particular building or runway at the airport. The computing system may also detect and define the boundaries of paths that the aircraft may use during subsequent ground navigation.
In some examples, the computing system may compare airport markings detected and identified using sensor data to one or more maps representing the airport layout to determine the location of the aircraft. A map may indicate the locations of runways, taxiways, and other ground paths relative to airport buildings. Thus, the computing system may determine the overall position of the aircraft within the airport layout by comparing airport markings detected in the incoming sensor data with those in the map. The map and the sensor data may supplement each other during processing and analysis by the computing system.
In some examples, the system may use localization techniques (e.g., best-fit localization) to align images received from one or more sensors with a map of the airport. The computing system may further perform object detection and localization while also updating the airport map. Using the map and the incoming sensor data, the computing system may further plan a ground path for the aircraft and develop and execute control instructions for navigating according to the planned ground path.
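The text mentions best-fit localization without detailing it; one conventional reading is a least-squares rigid alignment between marker positions detected in the sensor frame and their known positions on the airport map. The sketch below assumes 2-D coordinates and already-known marker correspondences, which the disclosure does not specify.

```python
# Best-fit localization sketch: estimate the aircraft's pose on the airport
# map by rigidly aligning detected marker positions with their map positions
# via a 2-D Kabsch (least-squares) solve. Illustrative only.
import numpy as np

def best_fit_pose(detected_xy, map_xy):
    """Return (R, t) minimizing ||R @ detected + t - map||^2.

    detected_xy, map_xy: (N, 2) arrays of corresponding marker positions.
    """
    d_mean = detected_xy.mean(axis=0)
    m_mean = map_xy.mean(axis=0)
    H = (detected_xy - d_mean).T @ (map_xy - m_mean)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = m_mean - R @ d_mean       # t places the sensor-frame origin on the map
    return R, t
```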
The system may use sensor measurements to further improve navigation of the travel path. For example, after detecting and identifying the center of the path for navigating the aircraft (e.g., the centerline of the runway), the computing system may use additional sensor data from the sensors to determine navigation instructions that represent how to navigate the aircraft during subsequent ground navigation. In particular, the computing system may use the sensor data to determine control instructions to navigate the aircraft along a center of the target path (such as a centerline of a runway).
The computing system may use the sensor data to determine a lateral displacement representing a distance between a reference point of the aircraft (e.g., the nose or the nose wheel) and the center of the target path (e.g., the centerline of the runway). The lateral displacement represents the offset of the aircraft (or its reference point) relative to the centerline of the runway at the current position and orientation. The computing system may use the lateral displacement to formulate control instructions that align the reference point of the aircraft with the centerline or estimated center of the path over which the aircraft is traveling. In some examples, the computing system may also use sensor data from one or more sensors to determine the heading of the aircraft relative to the centerline of the target path of travel. The computing system may use a combination of the lateral displacement and the heading of the aircraft when determining control instructions for subsequent navigation of the aircraft.
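As a sketch of how the two per-sensor views might be combined, suppose each sensor independently yields a signed offset of the reference point from the detected centerline and a relative heading; averaging the two estimates is one simple fusion rule. The data layout and values below are illustrative, not from the disclosure.

```python
# Illustrative fusion of two single-sensor estimates into one lateral
# displacement and one relative heading.
from dataclasses import dataclass

@dataclass
class SensorEstimate:
    offset_m: float      # signed reference-point offset from centerline (+right)
    heading_deg: float   # aircraft heading relative to the centerline

def lateral_displacement(first: SensorEstimate, second: SensorEstimate) -> float:
    """Combine the two single-sensor offsets into one displacement estimate."""
    return 0.5 * (first.offset_m + second.offset_m)

def relative_heading(first: SensorEstimate, second: SensorEstimate) -> float:
    return 0.5 * (first.heading_deg + second.heading_deg)

# Example: the left camera sees the nose wheel 0.42 m right of the centerline,
# the right camera sees 0.38 m; the fused displacement is 0.40 m.
disp = lateral_displacement(SensorEstimate(0.42, 1.5), SensorEstimate(0.38, 1.9))
```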
In some examples, the computing system may further use additional sensor data from other sensors when assisting ground navigation of the aircraft. For example, the computing system may use the additional sensor data when determining the lateral displacement between the reference point of the aircraft and the target travel path. The computing system may receive trend information from one or both of an Attitude and Heading Reference System (AHRS) and a Global Positioning System (GPS) indicative of one or more previous movements performed by the aircraft. The AHRS and GPS may be located on the aircraft and measure recent movements of the aircraft.
A computing system within the system may receive and use the trend information to estimate a future position and orientation of the aircraft. For example, if the trend information indicates that the aircraft is drifting slightly away from the center of the path (e.g., the centerline of the runway), the computing system may estimate that the aircraft will continue to navigate away from the centerline unless its control is adjusted. Thus, the computing system may use the trend information and the estimated future position and orientation of the aircraft when determining the lateral displacement between the reference point of the aircraft and the desired path (e.g., the centerline of the runway). When determining the lateral displacement of the reference point relative to the target path of ground travel, the computing system may use the orientation, position, speed, and acceleration of the aircraft, among other factors (e.g., wind speed in the environment, slope of the path).
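A minimal sketch of the trend-based estimate follows, assuming a constant-acceleration, constant-yaw-rate motion model over a short horizon; the disclosure states only that trend information from the AHRS and GPS is used to estimate the future position and orientation.

```python
# Dead-reckoning sketch: project the aircraft's ground-frame pose forward
# from recent AHRS/GPS measurements. The motion model is an assumption.
import math

def predict_pose(x, y, heading_rad, speed, accel, yaw_rate, dt):
    """Return the estimated (x, y, heading) after dt seconds.

    x, y: current position (m); heading_rad: current heading (radians);
    speed (m/s), accel (m/s^2), yaw_rate (rad/s): from trend information.
    """
    heading_next = heading_rad + yaw_rate * dt
    dist = speed * dt + 0.5 * accel * dt * dt
    x_next = x + dist * math.cos(heading_rad)
    y_next = y + dist * math.sin(heading_rad)
    return x_next, y_next, heading_next
```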
In some examples, the computing system is configured to detect any noise that may occur within sensor data received from sensors coupled to the aircraft. Noise within the sensor data may signal to the computing system that the sensor data may be unreliable. For example, noise may be caused by environmental conditions (e.g., rain, wind), poor lighting conditions, and/or artifacts arising during sensor capture. Upon detecting noise in the sensor data, the computing system may be configured to perform additional actions to overcome its presence. For example, the computing system may be configured to use the trend information in determining a lateral displacement representing the distance between a reference point of the aircraft and the desired path (e.g., the centerline of a runway). Further, the computing system may be configured to obtain additional sensor data to increase confidence in estimates made using the sensor data.
The computing system may also be configured to detect gaps in the centerline of the runway or in other surface lines used to guide the aircraft. A gap may correspond to a break in a surface line, or the absence of a surface line in some areas, causing a separation between sections or portions of the line. For example, when the computing system relies on images from a camera coupled to the aircraft, the images may show gaps in the centerline of the runway or in other types of surface lines, which can make it difficult to determine the lateral displacement between the reference point of the aircraft and the surface line at the gap. In this case, the computing system may use the trend information to determine the lateral displacement from the desired path when determining navigation instructions. Further, the computing system may be configured to withhold additional adjustments to the orientation of the aircraft until additional sensor data is received that re-establishes the presence of the surface line used to guide the aircraft. In some cases, gaps may occur in surface lines because dust or other materials block their visibility.
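One possible structure for this gap-handling behavior is sketched below: while the line is absent, the controller steers on the trend-based prediction and issues no new correction, and it waits for a few consecutive re-detections before trusting the line again. The class and the frame count are assumptions for illustration.

```python
# Illustrative gap handler (not the disclosed method): hold the trend-based
# offset while the centerline is missing, and require several consecutive
# re-detections before using camera-derived offsets again.
class GapHandler:
    def __init__(self, reacquire_frames=3):
        self.reacquire_frames = reacquire_frames
        self.consecutive_hits = 0

    def update(self, measured_offset, predicted_offset):
        """measured_offset is None when no centerline was found this frame."""
        if measured_offset is None:
            self.consecutive_hits = 0
            return predicted_offset, False   # hold: no new adjustment
        self.consecutive_hits += 1
        trusted = self.consecutive_hits >= self.reacquire_frames
        return (measured_offset if trusted else predicted_offset), trusted
```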
As described above, the computing system may determine control instructions that cause a reference point of the aircraft to align with a target path (e.g., the centerline of a runway) during subsequent navigation. In some cases, execution of the control instructions may cause a gradual adjustment of the position and orientation of the aircraft toward the centerline until the reference point of the aircraft is navigating on the centerline of the runway. A gradual adjustment prevents abrupt maneuvers that could disturb passengers or shift cargo carried by the aircraft. In other cases, execution of the control instructions may quickly align the reference point of the aircraft with the centerline of the runway. Thus, the amount of time required to align the reference point of the aircraft with the target path (e.g., the centerline of the runway) may depend on the magnitude of the lateral displacement, the ground speed of the aircraft, the orientation of the aircraft, and the size of the aircraft, among other factors.
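The disclosure does not give a control law, but the trade-off it describes can be illustrated with a rate-limited proportional command whose authority shrinks with ground speed, so that a large displacement is removed gradually rather than in one abrupt swerve. All gains and limits below are invented for illustration.

```python
# Illustrative rate-limited steering command (not from the disclosure).
def steering_command(lateral_disp_m, heading_err_rad, ground_speed_mps,
                     prev_cmd_rad, k_disp=0.05, k_head=0.8,
                     max_step_rad=0.01):
    """Return the next nose-wheel steering angle (radians, +right)."""
    # Displacement authority shrinks with speed so high-speed rollouts
    # are corrected gently.
    raw = -(k_disp / max(ground_speed_mps, 1.0)) * lateral_disp_m \
          - k_head * heading_err_rad
    # Rate limiting produces the gradual convergence described above.
    step = max(-max_step_rad, min(max_step_rad, raw - prev_cmd_rad))
    return prev_cmd_rad + step
```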
The system may be configured to provide sensor measurements and/or control instructions to a remotely located computing device, such as a control system of the aircraft or a remote user interface. For example, a remote user interface that receives sensor measurements or control instructions from the system may include a head-mountable display configured to display the received information. In other examples, the remote user interface may have another configuration. The control instructions and other information the remote user interface receives from the system may be used as a training platform to simulate ground control of an aircraft or to train pilots. In other examples, a remotely located computing device receiving control instructions and other information from the system may be used to remotely control the aircraft during ground navigation.
In further examples, the computing system may use a primary sensor to determine control instructions and one or more secondary sensors to confirm or modify the measurements captured by the primary sensor.
Referring now to the drawings, FIG. 1 is a block diagram depicting a system 100 according to an example embodiment. System 100 represents an example navigation system configured to assist in ground navigation of an aircraft. For example, the system 100 may enable fully autonomous or semi-autonomous ground navigation of an aircraft by obtaining and processing sensor data representative of the aircraft environment. Some components of the system 100 may be installed or located on the aircraft, while other components are located remotely (e.g., on the ground at an airport). In other examples, all components of system 100 are coupled to or disposed on an aircraft.
As shown in fig. 1, the system 100 includes a computing system 102 that participates in wireless communication 124 with a first sensor 112, a second sensor 114, an AHRS 116, a GPS 118, a remote user interface 120, and a control system 122. The different devices shown in system 100 that communicate with computing system 102 are included for example purposes and may differ in other implementations. Further, the computing system 102 may communicate with various devices at different times. For example, the computing system 102 may be configured to communicate with the remote user interface 120 in some cases, and with the control system 122 in other cases. Further, other embodiments of the system for autonomous airport runway navigation may include more or fewer components in different arrangements. Other configurations are described below.
The system 100 is an example navigation system that may assist in ground navigation of an aircraft. The degree of assistance provided by the system 100 may vary between embodiments. To assist in ground navigation of an aircraft, the system 100 is configured to obtain and use sensor data to develop an understanding of the environment in a manner similar to a pilot. The system 100 may use the sensor data to detect the layout of the environment, including detecting and identifying the desired ground path over which the aircraft travels. Additionally, the system 100 may detect and identify airport markings (e.g., signs, surface lines) to determine the position of the aircraft in the airport environment and to plan routes for ground navigation. The different detected airport markings may thus serve as waypoints that the system 100 uses for path planning and navigation.
The system 100 may further use the sensor data to develop control instructions that may assist in ground navigation of the aircraft. For example, the system 100 may provide control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a center of a given ground path (e.g., a centerline of a runway) during subsequent navigation of the aircraft.
In some examples, the system 100 may be used as a supplement to pilots controlling aircraft during ground navigation. For example, the system 100 may use the sensor data to develop and provide alerts to a pilot controlling the aircraft. The alert may indicate the presence of a potential obstacle or may assist in maintaining a centered travel path on the ground path on which the aircraft is located. The system 100 may also be configured to provide sensor measurements to the pilot for viewing and use during ground navigation. In another example, the system 100 uses sensor data and computer vision to control the aircraft (or send control instructions to the control system 122 of the aircraft) while the pilot remotely monitors the aircraft and the controls provided by the system 100.
Further, the system 100 may additionally determine and provide, to the control system 122 used by the pilot, control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a desired path (e.g., the centerline of a runway). In such a configuration, the system 100 may enable semi-autonomous operation of the aircraft, allowing the pilot to control certain aspects of ground navigation while the system 100 controls other aspects. When the system 100 is used as a semi-autonomous system configured to assist a pilot, the system 100 may be configured to allow the pilot to override any controls or control recommendations.
In further examples, the system 100 is configured to enable autonomous operation of the aircraft during ground navigation at an airport or similar environment. For example, the system 100 may use the sensor data to form control instructions that are provided to a control system of the aircraft for execution to enable autonomous navigation. As such, system 100 may provide control instructions to control system 122 for autonomously controlling the aircraft, as described in further detail below.
The system 100 may also be configured to provide sensor measurements and control instructions to the remote user interface 120. For example, the system 100 may provide control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a desired path (e.g., a centerline of a runway) during subsequent navigation of the aircraft to the remote user interface 120. The remote user interface 120 may use input from the system 100 to train a pilot to learn how to navigate the aircraft on the ground at an airport or may be used to enable the pilot to remotely control the aircraft.
Within system 100, computing system 102 represents a device capable of receiving and processing sensor data obtained by sensors of system 100. For example, the computing system 102 may receive sensor data and execute a robust process that matches sensor fusion capabilities to vehicle dynamics in order to achieve a highly reliable means of controlling an aircraft through computer vision and remote monitoring.
The computing system 102 is also configured to communicate with other computing devices. In some examples, computing system 102 may correspond to multiple computing devices. Further, the computing system 102 may be located on the aircraft or may be physically separate from the aircraft. As shown in FIG. 1, computing system 102 includes a processor 104, a memory 106, and an interface 108, all of which may be coupled via a system bus 110 or similar mechanism. In other examples, computing system 102 may include other components.
The processor 104 may be one or more of any type of computer processing element, such as in the form of a Central Processing Unit (CPU), a coprocessor (e.g., a math, graphics, or cryptographic coprocessor), a Digital Signal Processor (DSP), a network processor, and/or an integrated circuit or controller that performs the processor operations. In some cases, the processor 104 may be one or more single-core processors. In other cases, processor 104 may be one or more multi-core processors having multiple independent processing units. The processor 104 may also include register memory for temporarily storing instructions and associated data being executed, and cache memory for temporarily storing recently used instructions and data.
The memory 106 may be any form of computer usable memory including, but not limited to, Random Access Memory (RAM), Read Only Memory (ROM), and non-volatile memory (e.g., flash memory, hard disk drives, solid state drives, Compact Discs (CDs), Digital Video Discs (DVDs), and/or tape storage). Thus, memory 106 represents both a main memory unit and a long-term storage.
Memory 106 may store program instructions and/or data upon which program instructions may operate. For example, the memory 106 may store these program instructions on a non-transitory computer-readable medium such that these instructions are executable by the processor 104 to implement any of the methods, processes, or operations disclosed in this specification or the figures.
Further, the memory 106 may also receive and store maps. In some examples, computing system 102 may receive a map of an airport or another environment from a database storing maps. The database may be specifically associated with an airport or with the system 100. Further, the computing system 102 may use the memory 106 to compare expected airport signs and pathway layouts to airport signs and pathway layouts detected by the system 100 using sensor data. In further examples, computing system 102 may develop a map using the sensor data and store the map in memory 106.
Interface 108 enables communication between computing system 102 and other computing devices. For example, the interface 108 may establish wired or wireless communication 124 with the first sensor 112, the second sensor 114, the AHRS 116, the GPS 118, the remote user interface 120, and the control system 122.
In some examples, the interface 108 may take the form of one or more wired interfaces, such as Ethernet (e.g., Fast Ethernet, Gigabit Ethernet, etc.). Interface 108 may also support communication over one or more non-Ethernet media, such as coaxial cable or power line, or over a wide-area medium such as Synchronous Optical Network (SONET) or Digital Subscriber Line (DSL) technology. The interface 108 may also take the form of one or more wireless interfaces, such as IEEE 802.11 (Wi-Fi), Bluetooth, or a wide-area wireless interface. However, other forms of physical-layer interfaces and other types of standard or proprietary communication protocols may be used on interface 108. Further, interface 108 may include multiple physical interfaces. For example, some embodiments of the computing system 102 may include Ethernet and Wi-Fi interfaces.
System 100 includes a first sensor 112 and a second sensor 114, which represent sensors configured to capture sensor data for processing by computing system 102. The first sensor 112 and the second sensor 114 may be various types of sensors, such as cameras, radar, LIDAR, electro-optical sensors, or a combination of different sensors. For example, in some cases the first sensor 112 and the second sensor 114 are a pair of high-resolution cameras. Further, although the system 100 is shown with only the first sensor 112 and the second sensor 114, other embodiments may include more or fewer sensors. For example, another example system may include two or more sensors coupled to each wing of an aircraft. In another example, the system 100 may use a single sensor (e.g., the first sensor 112) to determine subsequent control instructions for navigating the aircraft.
The first sensor 112 and the second sensor 114 are shown physically separate from the computing system 102 and in communication with the computing system 102 via wireless communication 124. In other examples, the first sensor 112, the second sensor 114, and other sensors may be coupled to the computing system 102 via wired connections in other examples. Thus, sensor data obtained by the first and second sensors 112, 114 and other sensors operating within the system 100 may be used to detect objects located around the aircraft, including other aircraft, people, buildings, and airport markings, such as signs and various surface lines.
The first sensor 112, the second sensor 114, and other sensors of the system 100 may be coupled to the aircraft in fixed positions. In a fixed position, a sensor has a fixed orientation that prevents or limits movement during ground navigation and flight. As a result, the first sensor 112 and the second sensor 114 may obtain sensor data from a consistent perspective and provide that data to the computing system 102 with limited noise, since a stable mounting avoids the noise that sensor movement would otherwise introduce. The computing system 102, which processes the sensor data obtained by the first sensor 112 and the second sensor 114, may use the fixed positions of the sensors in determining the positions of objects and surface lines in the environment relative to the aircraft. In some examples, the computing system 102 may use sensor data from the first sensor 112 to determine control instructions and further use sensor data from the second sensor 114 to confirm or modify the control instructions.
The one or more particular regions measured by the first sensor 112 and the second sensor 114 may comprise one or more portions of the aircraft. In particular, when comparing the position of the aircraft relative to a target path (e.g., the centerline of a runway) or an object detected near the aircraft, the computing system 102 may use one or more portions of the aircraft as reference points. For example, one or more portions of the aircraft located in the area measured by the sensors may include a nose and/or a nose wheel of the aircraft, either of which may be used as a reference point for determining navigation instructions for the aircraft.
When the system 100 includes multiple sensors, the computing system 102 may receive sensor data from the multiple sensors representing the environment during the same time frame. For example, the computing system 102 may receive images captured simultaneously by both the first sensor 112 and the second sensor 114. The computing system 102 may process the incoming sensor data in various ways. For example, the computing system 102 may use static image pre-processing to crop the input images received from the first sensor 112 and the second sensor 114 and to shift their perspective.
Further, the computing system 102 may also use dynamic image pre-processing, which involves processing the received images for real-time conditions. Dynamic image pre-processing may involve correcting the exposure of one or both sensors and performing image stabilization. The computing system 102 may also use other image processing techniques, such as least-squares fitting, maximum a posteriori (MAP) estimation, robust estimation, expectation-maximization (EM) procedures, and/or the Fast Library for Approximate Nearest Neighbors (FLANN).
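As an illustration of the static and dynamic pre-processing steps named above, the sketch below applies a fixed per-sensor perspective rectification (static) followed by per-frame exposure normalization (dynamic) using OpenCV; the corner coordinates are placeholders that would come from each camera's calibration.

```python
# Illustrative pre-processing pipeline (constants are assumed placeholders).
import cv2
import numpy as np

SRC = np.float32([[420, 700], [860, 700], [1180, 1020], [100, 1020]])  # calib
DST = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])

def preprocess(frame_bgr):
    """Rectify the runway region to a top-down view and normalize exposure."""
    M = cv2.getPerspectiveTransform(SRC, DST)          # static, per sensor
    topdown = cv2.warpPerspective(frame_bgr, M, (400, 600))
    gray = cv2.cvtColor(topdown, cv2.COLOR_BGR2GRAY)
    return cv2.equalizeHist(gray)                      # dynamic, per frame
```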
In some examples, the computing system 102 may use the front wheel or a portion of the front wheel as a reference point when comparing sensor data from the first sensor 112 and the second sensor 114 to determine the orientation and position of the aircraft relative to a desired path of ground navigation (e.g., the centerline of a runway). For example, the first sensor 112 and the second sensor 114 may be cameras configured to capture and provide images including the nose wheel and other forward portions of the aircraft. The images captured and provided by the camera may include portions of the aircraft as well as the environment of the aircraft. The sensors may capture sensor data for various portions of the aircraft when the aircraft is located in various areas. In further examples, the computing system may use multiple reference points on the aircraft when analyzing sensor data representing measurements of the aircraft environment.
The computing system 102 may be configured to use the orientation of each sensor in performing sensor data processing techniques to detect objects, airport signs, people, and other aspects of the environment. For example, the computing system 102 may receive images from the first sensor 112 and the second sensor 114 and use the orientation of the sensors, previously known distances to one or more reference points (e.g., reference points on the aircraft), and pixels within each image to estimate the altitude and position of the object relative to the aircraft. In some examples, a radar or LIDAR may capture sensor measurements that may indicate the position of an object relative to a reference point on the aircraft. In some examples, the computing system 102 is configured to fuse together sensor data from multiple sensors when determining control instructions for the aircraft.
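For a fixed, calibrated camera, one common way to turn pixels into positions relative to the aircraft is a flat-ground projection: given the camera's height above the pavement, its downward pitch, and its focal length, a pixel row maps to a ground distance ahead. This is a sketch of that geometry, not the disclosed method, and all constants are assumed.

```python
# Flat-ground range sketch: map a pixel row to a distance along the runway.
import math

CAM_HEIGHT_M = 2.4      # camera height above the runway (assumed)
CAM_PITCH_RAD = 0.12    # downward pitch of the fixed mount (assumed)
FOCAL_PX = 1400.0       # focal length in pixels (assumed)

def ground_distance(pixel_row, image_height=1080):
    """Distance (m) along the ground to the point imaged at pixel_row."""
    v = pixel_row - image_height / 2.0      # offset below the optical axis
    ray_angle = CAM_PITCH_RAD + math.atan2(v, FOCAL_PX)
    if ray_angle <= 0:
        return float("inf")                 # at or above the horizon
    return CAM_HEIGHT_M / math.tan(ray_angle)
```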
In further examples, the first sensor 112 and the second sensor 114 are coupled to the aircraft via a mechanism that can modify the orientation and/or position of each sensor. In such a configuration, the first sensor 112 and the second sensor 114 may change orientation and obtain sensor data from different areas of the environment relative to the aircraft. For example, the first sensor 112 may rotate to capture sensor data from different perspectives relative to the aircraft.
The computing system 102 may use sensor data received from the first sensor 112, the second sensor 114, and/or other sensors to formulate instructions that may assist in ground navigation of the aircraft. While the aircraft is operating, the computing system 102 may receive one or more sets of sensor data, each including sensor data from one or both of the first sensor 112 and the second sensor 114, and analyze the environment of the aircraft using those sets.
Analyzing the environment of an aircraft may include performing general object detection to prevent collisions, as well as detecting and identifying airport markings that may assist in developing navigation strategies. In particular, airports and similar environments often include various airport markings, such as surface lines, signs, and other markers positioned to assist pilots during ground navigation. The computing system 102 may use detected airport markings (e.g., surface lines, boundaries) in a manner similar to waypoints to determine a ground path for navigating the aircraft. Determining the ground path may involve identifying and locating a desired runway and its corresponding centerline. For example, the desired runway may correspond to the path of travel for an aircraft preparing for takeoff, or to a path of ground navigation toward a building for unloading passengers.
The computing system 102 may also use sensor data from one or both of the first sensor 112 and the second sensor 114 to determine a lateral displacement representing the distance between a reference point of the aircraft and a target path (e.g., the centerline of a runway). For example, the computing system 102 may compare a first position of the reference point of the aircraft relative to the centerline of the runway (or other desired path), as represented in sensor data from the first sensor, with a second position of the same reference point relative to the centerline, as represented in sensor data from the second sensor. When performing the comparison, the computing system 102 may use, among other considerations, information indicating the locations of the sensors and known distances to the reference point of the aircraft. The computing system 102 may also use sensor data from one or both of the first sensor 112 and the second sensor 114 to determine the heading of the aircraft relative to the centerline of the target path.
When determining control instructions for the aircraft, the computing system 102 may further use sensor data from other sensors, such as the AHRS 116 and the GPS 118. The AHRS 116 is a sensor located on the aircraft that can provide trend information to the computing system 102. In particular, the AHRS 116 may use solid-state or micro-electromechanical system (MEMS) gyroscopes, accelerometers, and magnetometers to measure trend information consisting of past movements and orientations of the aircraft. The trend information may reflect recent movements of the aircraft as it navigates runways and other paths in an airport or similar environment. For example, the trend information may indicate the orientation and position of the aircraft over the last several meters, hundreds of meters, or another span of ground navigation. In some examples, the AHRS 116 consists of sensors on three axes that can provide attitude information for the aircraft, including roll, pitch, and yaw.
The GPS 118 is a satellite-based radio navigation system configured to also provide trend information to the computing system 102. Using measurements from GPS satellites, a receiver of the GPS 118 may determine the location of the aircraft and changes in that location as the aircraft navigates. The computing system 102 may receive trend information from the AHRS 116 and/or the GPS 118. The computing system 102 may also receive trend information regarding recent movements and orientations of the aircraft from other types of sensors, such as inertial measurement units (IMUs) or independent accelerometers, gyroscopes, or magnetometers.
During ground navigation of the aircraft, the computing system 102 may receive trend information from one or both of the AHRS 116 and the GPS 118 indicative of one or more previous movements performed by the aircraft. The computing system 102 may further use the trend information to estimate a future position and orientation of the aircraft. The computing system 102 may then use the future position and orientation of the aircraft to determine a lateral displacement representing a distance between a reference point of the aircraft (e.g., the nose of the aircraft) and the centerline of the runway. In some cases, the future position and orientation of the aircraft are based on the current speed and acceleration of the aircraft, as well as other possible factors (e.g., the grade of the ground and the speed of the wind in the environment).
In some examples, the computing system 102 is configured to detect noise in the sensor data received from the first sensor 112 and the second sensor 114. In particular, the computing system 102 may receive multiple sets of sensor data during navigation of the aircraft. The sets of sensor data may include sensor data from one or both of the first sensor 112 and the second sensor 114. Thus, in some cases, the computing system 102 may detect noise in the sensor data that could interfere with determining the lateral displacement of the aircraft relative to the centerline of the runway. Similarly, the computing system 102 may also use other sensors or additional sets of sensor data to detect system failures (e.g., sensor dropouts) in some sets of sensor data.
In response to detecting noise in the sensor data, the computing system 102 may use the future position and orientation of the aircraft estimated using the trend information from one or both of the AHRS 116 and the GPS 118 to determine a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway. For example, the computing system 102 may determine that the aircraft was previously aligned with the target path prior to detecting noise within the sensor data received from one or both of the first sensor 112 and the second sensor 114. Thus, the computing system 102 may use the trend information to determine that no further adjustments are needed in the near future because the aircraft is already aligned with the target path. In some examples, trend information obtained by one or both of the AHRS 116 and the GPS 118 may supplement sensor data received from the first sensor 112 and the second sensor 114.
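A simple way to realize this fallback, sketched below, is to gate on the spread of the most recent camera-derived offsets and substitute the trend-based prediction when the spread is too large; the variance test and threshold are assumptions rather than part of the disclosure.

```python
# Illustrative noise gate: fall back on the AHRS/GPS trend estimate when
# recent camera offsets disagree too much (rain, glare, sensor dropout).
import statistics

def choose_offset(recent_measured, trend_predicted, max_stdev=0.5):
    """Select the offset to steer on.

    recent_measured: last few camera-derived offsets (m), most recent last;
    assumed non-empty. trend_predicted: offset from dead reckoning (m).
    """
    if len(recent_measured) >= 3 and statistics.stdev(recent_measured) > max_stdev:
        return trend_predicted          # camera data judged too noisy
    return recent_measured[-1]
```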
In some examples, the computing system 102 is configured to detect a gap in a surface line measured within sensor data obtained by the first sensor 112, the second sensor 114, and other sensors coupled to the aircraft. For example, when the aircraft is navigating according to a centerline of a runway or another path, the computing system may be configured to detect a gap in the centerline of the runway using sensor data received from the first sensor 112 and the second sensor 114. These gaps may result from dust or other debris blocking the visibility of the surface lines, or may be part of the overall configuration of the surface lines (i.e., the surface lines are intended to include gaps).
When one or more gaps in the target path are detected, the computing system 102 may be configured to use the trend information to assist in maintaining a centered position of the aircraft on the runway or other travel path. Thus, in response to detecting a gap in the centerline of the runway, the computing system 102 may use the future position and orientation of the aircraft estimated using trend information from one or both of the AHRS 116 and the GPS 118 to determine a lateral displacement representing a distance between a reference point of the aircraft and the target path (e.g., the centerline of the runway).
FIG. 1 also shows a computing system 102 in communication with a remote user interface 120. The remote user interface 120 may be any type of device including a smartphone, a remote control system, a desktop computer, or a wearable computing device, among others. As such, the remote user interface 120 is configured to display information received from the computing system 102, such as control instructions developed and provided by the computing system 102. For example, the remote user interface 120 may display one or more adjustments provided by the computing system 102 for modifying the path of the aircraft to align the aircraft with the target path. In further examples, the remote user interface 120 may also enable a user to remotely control the aircraft during ground navigation.
In some examples, the remote user interface 120 includes a head-mountable display. For example, the head-mountable display may present the adjustments, enabling a user to respond with input via an interface associated with the display.
Control system 122 represents a system that may control the navigation components of an aircraft. In some examples, control system 122 may be located inside an aircraft to enable an onboard pilot to use sensor data and/or instructions provided by computing system 102. Control system 122 may be part of a general control system used by pilots to navigate aircraft. For example, control system 122 may cause a steering system, a braking system, a lighting system, and/or other systems of the aircraft to execute control instructions received from system 100.
In some examples, control system 122 may be a system capable of autonomously navigating an aircraft using control instructions provided by computing system 102. For example, control system 122 may cause systems of the aircraft to execute control instructions received from computing system 102. Control system 122 and computing system 102 may be part of the same computing device. In another example, the control system 122 is located remotely from the aircraft. For example, control system 122 may be positioned such that it can remotely control the aircraft.
FIG. 2 illustrates a system coupled to an aircraft, according to an example embodiment. The system 100 is shown with the first sensor 112 and the second sensor 114 coupled to the aircraft 202, while other components of the system 100 (e.g., the computing system 102, the AHRS 116, and the GPS 118) are not shown. In other examples, the system 100 may be coupled to another type of vehicle, such as a four-wheeled vehicle or another type of aircraft.
The aircraft 202 is depicted as a two-winged, propeller-driven aircraft, with the propeller coupled to the nose 208 of the aircraft 202. In other examples, the aircraft 202 may have another configuration, such as a jet or a larger passenger aircraft. The aircraft 202 is shown with the first sensor 112 coupled at a first location on the first wing 204 and the second sensor 114 coupled at a second location on the second wing 206. The sensor positions may differ in other examples; for instance, one or more sensors may be coupled to the body of the aircraft 202.
The nose 208 of the aircraft 202 is shown with a propeller that provides propulsion by converting rotational motion into thrust. When determining the lateral displacement or overall orientation of the aircraft 202 relative to a desired path in the environment (e.g., the centerline of a runway), the computing system 102 may be configured to use the nose 208 of the aircraft 202 as a reference point. For example, the computing system 102 may use the sensor data to determine the lateral displacement of the nose 208 relative to the centerline of the runway over which the aircraft 202 is traveling.
The nose wheel 210 of the aircraft 202 may also be used as a reference point for determining the position and orientation of the aircraft during ground navigation. In addition to the nose wheel 210, the aircraft 202 is shown to include a pair of rear wheels 212 configured to enable ground navigation of the aircraft 202.
The cockpit 214 is an area of the aircraft 202 that may house one or more pilots during operation of the aircraft 202. In some examples, the cockpit 214 may include the computing system 102, the control system 122, and other components of the system 100. In other examples, the computing system 102, the control system 122, and/or other components of the system 100 may be located remotely from the aircraft 202.
FIG. 3 illustrates a front view of a system coupled to an aircraft, according to an example embodiment. The front view shows system 100 coupled to aircraft 202, system 100 including first sensor 112 coupled to first wing 204 and second sensor 114 coupled to second wing 206. In other examples, the system 100 is coupled to the aircraft 202 in other configurations, including more or fewer sensors coupled at other locations on the aircraft 202.
Fig. 4 illustrates a first perspective of a reference point of an aircraft corresponding to a first sensor coupled to the aircraft, according to an example embodiment. The image shown in fig. 4 represents an example image that may be captured by the first sensor 112 coupled to the first wing 204 of the aircraft 202 and provided to the computing system 102. In other examples, the first perspective shown in the image may differ depending on the position and orientation of the first sensor 112 relative to the aircraft 202. Further, other examples may involve another type of sensor capturing sensor data representative of an area of the environment of the aircraft 202.
The image represents a region of the forward environment of the aircraft 202 that includes a portion of the target path 300 and the nose 208 and nose wheel 210 of the aircraft. The target path 300 represents a path on which the system 100 may be configured to assist the aircraft 202 in traveling. For example, the computing system 102 may use sensor data (such as the image shown in fig. 4) to determine control instructions to align the aircraft 202 with the target path 300 during subsequent ground navigation. Thus, as the aircraft 202 navigates, the computing system 102 may use additional images or other sensor data to detect and monitor changes in lateral displacement between the aircraft 202 and the target path 300. In further examples, the system 100 may estimate a center of the path and attempt to navigate the aircraft 202 along the estimated path center.
As described above, the computing system 102 may use the image shown in fig. 4 to determine the lateral displacement 304 between the target path 300 and a reference point of the aircraft (i.e., the bottom of the nose wheel 210). In some examples, the computing system 102 may align the image with a map to further determine a path for navigating the aircraft 202.
Fig. 5 illustrates a second perspective of a reference point of the aircraft corresponding to a second sensor coupled to the aircraft, according to an example embodiment. The image shown in fig. 5 represents an example image that may be captured by the second sensor 114 coupled to the second wing 206 of the aircraft 202 and provided to the computing system 102. In other examples, the second perspective shown in the image may differ depending on the position and orientation of the second sensor 114 relative to the aircraft 202. Further, other examples may involve another type of sensor capturing sensor data representative of an area of the environment of the aircraft 202.
The image shown in fig. 5 represents an area including the target path 300 and the nose 208 and nose wheel 210 of the aircraft 202. The computing system 102 may use the images shown in figs. 4 and 5 to determine the lateral displacement 304 of the bottom portion of the nose wheel 210 relative to the target path 300. In particular, the computing system 102 may use a combination of the images received from the first sensor 112 and the second sensor 114 to estimate the lateral displacement 304 representing the distance between the bottom portion of the nose wheel 210 and the target path 300.
In some examples, the computing system 102 may determine the lateral displacement 304 by initially locating the target path 300 within the image or sensor data received from one or both of the first sensor 112 and the second sensor 114 using a filter (e.g., a Sobel filter or a sliding window). The computing system 102 may then project a horizontal line 302 from a reference point on the aircraft 202 (e.g., the bottom portion of the nose wheel 210) through the target path 300 in order to identify an intersection between the horizontal line 302 and the target path 300. After identifying the intersection, the computing system 102 may solve for one or more pixels in the image that correspond to the intersection. The computing system 102 can then project the intersection pixels along the target path 300 and use a dot product to find the distance between the projection and the intersection, yielding the lateral displacement 304.
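By way of illustration only, the following Python sketch shows one way the scanline step above could be realized. It assumes a calibrated flat-ground scale (meters_per_px) and substitutes a single-row gradient peak for the sliding-window search; all names here are hypothetical rather than taken from the disclosure.

```python
import cv2
import numpy as np

def lateral_displacement(gray, ref_row, ref_col, meters_per_px):
    """Locate the painted path on the image row passing through the
    aircraft reference point, then convert the pixel offset into a
    ground distance (assumes locally flat ground and a known scale)."""
    # Horizontal-gradient (Sobel) response highlights the near-vertical
    # centerline paint against the runway surface.
    grad = np.abs(cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3))
    # "Project a horizontal line" from the reference point: examine the
    # image row containing the reference pixel.
    scanline = grad[ref_row, :]
    path_col = int(np.argmax(scanline))   # column of the intersection pixel
    offset_px = path_col - ref_col        # signed displacement in pixels
    return offset_px * meters_per_px
```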
FIG. 6 illustrates a reference-distance analysis using a first sensor coupled to an aircraft, according to an example embodiment. When using sensor data (i.e., images) received from the first sensor 112, the computing system 102 may perform computer vision processing techniques to estimate the distance of objects in the environment. The computer vision processing may involve using the position of the first sensor 112 on the aircraft 202 and the known distances between portions of the aircraft that appear within the measurements from the first sensor 112. In some examples, the computing system 102 may combine information from images received from both the first sensor 112 and the second sensor 114 to estimate the locations of objects, surface lines, people, and other features in the environment of the aircraft 202.
As shown in fig. 6, the computing system 102 may determine the height and location of a person or object in the environment of the aircraft 202. For example, the computing system 102 may project a first line 310 extending horizontally across the image along the same ground plane as the nose wheel 210. The first line 310 can indicate the height of an object or person located along it. Specifically, the computing system 102 may estimate that the person located on the first line 310, represented by box 314, is approximately 6 feet tall based on the number of pixels the person spans vertically in the image.
Similar to determining the height of a person or object located on the first line 310, the computing system 102 may also determine the height of a person or object located farther from the aircraft 202. The computing system 102 may project a second line 312, which may be used to estimate the height of people and objects located along it. Projecting the second line 312 involves using the fixed position of the first sensor 112 and prior pixel measurements to estimate the height of objects and people located along the second line 312. As shown in fig. 6, the computing system 102 may estimate the height of a person standing along the second line 312 (i.e., an estimated 6 feet) based on the person spanning approximately 100 pixels vertically. The computing system 102 may perform similar projections using the first sensor 112 or the second sensor 114 to determine the distances and heights of objects, people, and airport signs relative to the aircraft 202.
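The per-line height estimate described above reduces to a pixels-per-foot calibration for each projected line. A minimal sketch, assuming the calibration factor is known for the line the person stands on (function and variable names are hypothetical):

```python
def estimate_height_ft(pixel_span, pixels_per_foot):
    """Convert a vertical pixel span into feet using the calibration
    factor for the projected line the subject stands on."""
    return pixel_span / pixels_per_foot

# Matching the figure: a person spanning 100 px while standing on
# line 312, where 6 ft corresponds to 100 px on that line.
height = estimate_height_ft(100, 100 / 6.0)   # = 6.0 ft
```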
Fig. 7 illustrates determining a lateral displacement between a reference point of an aircraft and a centerline of a runway according to an example embodiment. As shown in fig. 7, the computing system 102 may use sensor measurements from one or more sensors (e.g., the first sensor 112 and the second sensor 114) coupled to the aircraft 202 to determine a lateral displacement 322 indicative of a distance between the nose wheel 210 of the aircraft and a centerline 320 of a runway over which the aircraft 202 is navigating.
As described above, the system 100 may perform operations to assist the aircraft 202 in navigating a ground environment in an airport or similar environment. In particular, the computing system 102 within the system 100 may receive and use sensor data to detect airport markers in the nearby environment, such as a left boundary 326 and a right boundary 328 extending along the runway. The computing system 102 may further use the detected airport markings, including the left boundary 326 and the right boundary 328, to detect and identify the centerline 320 of the runway.
After detecting and identifying the centerline 320 of the runway (or another target path), the computing system 102 may use subsequent sensor data to determine the lateral displacement 322 between the nose wheel 210 and the centerline 320 of the runway. The computing system 102 may further determine control instructions indicating adjustments for aligning the nose wheel 210 with the centerline 320 of the runway. For example, the control instructions may cause the aircraft 202 to follow the adjustment path 324 until the nose wheel 210 of the aircraft 202 is aligned with the centerline 320 of the runway.
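The disclosure does not commit to a particular control law for the adjustment path 324; one plausible form is a saturated proportional law on the lateral displacement and heading error, sketched below with purely illustrative gains and limits:

```python
def steering_command(lateral_disp_m, heading_err_rad,
                     k_disp=0.5, k_heading=1.5, max_angle_rad=0.35):
    """Saturated proportional law: steer the nose wheel to null both the
    lateral displacement and the heading error relative to the centerline.
    Gains and the steering limit are illustrative choices only."""
    cmd = -(k_disp * lateral_disp_m + k_heading * heading_err_rad)
    return max(-max_angle_rad, min(max_angle_rad, cmd))
```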
FIG. 8 illustrates the detection of airport signs located near a runway using a set of sensor data, according to an example embodiment. As shown in fig. 8, the system 100 may further use the sensor data to detect airport markers, such as the markers 330, 332, and 334. The computing system 102 may use the detected markers 330, 332, and 334 to determine a navigation strategy for the aircraft. For example, the computing system 102 may use an airport marker (e.g., the marker 330) to identify the runway on which the aircraft 202 is navigating. Similarly, the computing system 102 may use the marker 332 to determine that the aircraft 202 should stop temporarily until additional sensor data confirms that the runway is clear for use by the aircraft 202.
Fig. 9 illustrates identification of intersections of a runway according to an example embodiment. During ground navigation of the aircraft 202, the aircraft 202 may encounter different intersections in an airport or similar environment. An intersection such as intersection 340 shown in fig. 9 may correspond to a point where multiple ground paths overlap. To enable safe navigation of the aircraft 202, the system 100 may be configured to use the sensor data to detect and identify intersections (e.g., intersection 340).
Upon detecting the intersection 340, the computing system 102 may be configured to determine control instructions. For example, the computing system 102 may determine a change in the path of the aircraft 202 in response to detecting the intersection 340. Similarly, the computing system 102 may identify a runway for use by the aircraft 202 during subsequent navigation.
Fig. 10 illustrates detection of a gap in the centerline of a runway according to an example embodiment. As the aircraft 202 navigates, the computing system 102 may monitor the travel path of the aircraft 202 (e.g., the nose wheel 210 of the aircraft 202) relative to one or more surface markers (e.g., the centerline 320 of the runway), boundaries, and other features in the environment. In some cases, the centerline 320 or another surface marker used by the computing system 102 may include one or more gaps (e.g., gap 350).
After detecting the gap 350 in the centerline 320 of the runway, the computing system 102 may be configured to use trend information received from one or both of the AHRS 116 and the GPS 118 to determine the lateral displacement representing the distance between a reference point of the aircraft 202 and the centerline 320 of the runway. In particular, the trend information may be used to estimate future positions and orientations of the aircraft 202 so that the computing system 102 can determine the lateral displacement despite the gap in the centerline 320. Further, the computing system 102 may use the additional surface markings 352, 354 when determining the lateral displacement of the nose wheel 210 relative to the centerline 320.
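A minimal sketch of this fallback logic, assuming the vision pipeline reports whether the centerline was found in the current frame and a dead-reckoned displacement is available from the trend information; the 0.8/0.2 blend is an illustrative choice, not taken from the disclosure:

```python
def displacement_across_gaps(vision_disp_m, predicted_disp_m, line_visible):
    """Camera-derived displacement when the centerline paint is visible,
    blended lightly with the trend-based prediction to damp jitter;
    across a gap, coast on the prediction alone."""
    if not line_visible:               # gap detected: no usable line in frame
        return predicted_disp_m
    return 0.8 * vision_disp_m + 0.2 * predicted_disp_m
```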
Fig. 11 illustrates identification of runway boundaries based on airport markings, according to an example embodiment. The system 100 may further detect airport markers 360, 362 that represent the end of the runway on which the aircraft 202 is currently traveling. As a result of detecting the airport markers 360, 362 within the sensor data, the computing system 102 may determine control instructions for subsequent ground navigation of the aircraft 202.
As an example, the system 100 may detect the airport markers 360, 362 while the aircraft decelerates on the runway after landing. In response to detecting the airport markers 360, 362 indicating the end of the runway, the computing system 102 may determine control instructions intended to navigate the aircraft 202 through a turn in the runway. In particular, execution of the control instructions may cause the aircraft 202 to remain aligned with the centerline 320 of the runway and follow the centerline 320 through the turn.
FIG. 12 illustrates sensor orientations on an aircraft according to an example embodiment. System 400 represents another example system configured to perform operations related to autonomous runway navigation. Similar to the system 100 depicted in fig. 1, the system 400 may be used to assist ground navigation of an aircraft, including providing control instructions capable of enabling semi-autonomous or fully autonomous operation of the aircraft 402.
As shown in fig. 12, the system 400 includes a first sensor 408, a second sensor 410, and a third sensor 418, but may include additional sensors located at other positions on the aircraft 402 in other embodiments. As such, the system 400 may use sensor measurements from various sensors to estimate vehicle states in the overall environment of the airport and further assist in ground navigation.
The aircraft 402 is shown as a single-wing aircraft configured with a first wing 404 and a second wing 406, but may have other configurations in other examples. The first sensor 408 is shown coupled at a first location on the first wing 404, and the second sensor 410 is shown coupled at a second location on the second wing 406. At these locations, both the first sensor 408 and the second sensor 410 are configured to measure the forward environment of the aircraft 402, with the first sensor 408 measuring a first region 412 of the forward environment and the second sensor 410 measuring a second region 414 of the forward environment. Both the first region 412 and the second region 414 include a portion of the aircraft 402 (i.e., the nose of the aircraft 402), as shown by the overlap region 416. The overlap region 416 corresponds to an area that is measurable by both the first sensor 408 and the second sensor 410.
In addition to measuring the forward environment of the aircraft 402, the system 400 includes a third sensor 418 coupled at a third location on the first wing 404. At the third location, the third sensor 418 is positioned and oriented to measure the rearward environment of the aircraft 402, represented by a third area 420. When determining reverse navigation instructions, the computing system of the system 400 may use sensor data from the third sensor 418, and from other similarly positioned sensors that measure the rearward environment of the aircraft 402, to obtain a better understanding of the overall environment of the aircraft 402. In further examples, the aircraft 402 may include additional sensors at other positions, such as a sensor positioned on the second wing 406 in a manner similar to the third sensor 418.
Fig. 13 shows a flow chart of a method according to an example embodiment. The method 500 shown in fig. 13 gives an example of a method that may be used with the system 100 shown in figs. 1-12. In other examples, components of the devices and/or systems may be arranged to be adapted to, capable of, or suited for performing a function, such as when operated in a particular manner.
The method 500 may include one or more operations, functions, or actions as illustrated by one or more of block 502, block 504, block 506, block 508, block 510, and block 512. Although the blocks are shown in sequential order, the blocks may also be performed in parallel, and/or in a different order than described herein. Moreover, various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based on the desired implementation.
At block 502, the method 500 involves receiving, at a computing system, a first set of sensor data, wherein the first set of sensor data includes sensor data from one or both of a first sensor coupled to the aircraft at a first location and a second sensor coupled to the aircraft at a second location. The computing system may receive a first set of sensor data from one or more sensors coupled to the aircraft, including receiving the first set of sensor data from different types of sensors located at various locations on the aircraft. For example, the computing system may receive sensor data from one or more electro-optical sensors and/or one or more cameras (e.g., thermal imaging cameras, RGB cameras).
In an example, the computing system receives an image or another type of sensor data from a camera coupled to a wing of the aircraft. For example, a first camera may be positioned on a first wing of the aircraft such that it captures, from a first perspective, an image of an area that includes a reference point of the aircraft (e.g., the nose or nose wheel of the aircraft) and the portion of the runway below the reference point. Similarly, a second camera may be positioned on a second wing of the aircraft such that it captures, from a second perspective, an image of an area that includes the reference point of the aircraft and the runway or other path below the reference point. In another example, the computing system may receive sensor data from a single sensor coupled to the body or another portion of the aircraft.
At block 504, the method 500 involves detecting, by the computing system, one or more airport signs located near a runway on which the aircraft is located using the first set of sensor data. In particular, the computing system may use the sensor data to detect signs, surface lines, objects, people, aircraft, and other features in the aircraft environment. Similar to pilots, the computing system may use detected airport signs to gain further understanding of the aircraft environment in order to develop navigation instructions. In some examples, the computing system may use the sensor data to detect the boundary of the target path. For example, the computing system may use the sensor data to detect runway edge geometry to define the boundary of the runway.
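One common way to recover such runway edge geometry from a camera image is an edge detector followed by a line transform. The sketch below uses OpenCV's Canny and probabilistic Hough transform with illustrative thresholds; the disclosure does not name a specific detector:

```python
import cv2
import numpy as np

def detect_edge_segments(gray):
    """Canny edges followed by a probabilistic Hough transform, returning
    line segments as (x1, y1, x2, y2) tuples; thresholds are illustrative."""
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80, minLineLength=60, maxLineGap=10)
    return [] if segments is None else [tuple(s[0]) for s in segments]
```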
At block 506, method 500 involves identifying a centerline of the runway based on one or more airport signs. The computing system may use the detected surface lines, landmarks, and other airport markers to determine a runway or another ground path for the aircraft to use for ground navigation.
In some examples, the computing system uses airport markings to identify the centerline of the runway. For example, the computing system may use detected boundaries extending along the sides of the runway to identify the centerline. Similarly, the computing system may use signs, other surface lines, and markers to identify the centerline of the runway.
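Where both boundaries are detected, the centerline can be approximated as their per-row midpoint in the image. A minimal sketch under that assumption (names are hypothetical):

```python
def centerline_from_boundaries(left_cols, right_cols):
    """Per-row midpoint of the detected left and right boundary traces.
    Assumes both traces are sampled at the same image rows."""
    return [(l + r) / 2.0 for l, r in zip(left_cols, right_cols)]
```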
At block 508, the method 500 involves receiving, at the computing system, a second set of sensor data, wherein the second set of sensor data includes sensor data from both the first sensor and the second sensor. The second set of sensor data represents subsequent sensor data obtained by the first sensor, the second sensor, and/or other sensors coupled to the aircraft. In some examples, the computing system may receive subsequent sensor data from a single sensor coupled to the aircraft.
The computing system may receive additional sets of sensor data from various sensors (such as the first sensor and the second sensor) coupled to the aircraft. The computing system may receive sensor data continuously or periodically. In addition, the computing system may receive different types of sensor data, such as images and radar measurements.
At block 510, the method 500 involves determining, by the computing system, a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the second set of sensor data.
In some examples, the lateral displacement is determined based on a comparison of a first position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the first sensor and a second position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the second sensor. The computing system may also determine a heading of the aircraft relative to a centerline of the runway.
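A minimal sketch of one way the two per-camera positions could be combined, assuming symmetric wing mounting; the averaging and the disagreement tolerance are illustrative choices, not taken from the disclosure:

```python
def fuse_displacements(disp_left_m, disp_right_m, max_disagreement_m=0.5):
    """Combine displacement estimates of the same reference point as seen
    from the two wing cameras. With symmetric mounting, a simple average
    cancels much of the per-camera perspective bias; a disagreement check
    flags samples to treat as noisy."""
    if abs(disp_left_m - disp_right_m) > max_disagreement_m:
        return None   # cameras disagree; caller falls back to trend data
    return 0.5 * (disp_left_m + disp_right_m)
```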
At block 512, the method 500 involves providing, by the computing system, control instructions indicating one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft.
Fig. 14 shows a flowchart of an example method for use with the method 500, according to an example embodiment. At block 514, the functions include receiving trend information representative of one or more previous movements performed by the aircraft, wherein the trend information is received from one or both of an Attitude and Heading Reference System (AHRS) and a Global Positioning System (GPS). The computing system may receive trend information from one or more sensors, such as AHRS, GPS, or other types of sensors (e.g., IMU).
At block 516, the functions include estimating a future position and orientation of the aircraft using the trend information. The computing system may use the trend information to determine previous movements performed by the aircraft and further project the movements to estimate one or more future positions and orientations of the aircraft. When estimating the future position, the computing system may use the speed, orientation, acceleration, and position of the aircraft.
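As a sketch, the projection can be as simple as a constant-speed, constant-yaw-rate model driven by the AHRS/GPS trend data; the model choice is illustrative, not prescribed by the disclosure:

```python
import math

def project_pose(x_m, y_m, heading_rad, speed_mps, yaw_rate_rps, dt_s):
    """Constant-speed, constant-yaw-rate projection of the aircraft pose
    over a short horizon dt_s, driven by AHRS/GPS trend data."""
    x_next = x_m + speed_mps * math.cos(heading_rad) * dt_s
    y_next = y_m + speed_mps * math.sin(heading_rad) * dt_s
    return x_next, y_next, heading_rad + yaw_rate_rps * dt_s
```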
At block 518, the functions include wherein determining a lateral displacement representing a distance between the reference point of the aircraft and the centerline of the runway using the second set of sensor data further comprises using the future position and orientation of the aircraft to determine a lateral displacement representing a distance between the reference point of the aircraft and the centerline of the runway.
Fig. 15 shows a flowchart of an example method for use with the method 500, according to an example embodiment. At block 520, the functions include detecting noise in the second set of sensor data received from both the first sensor and the second sensor. The computing system may detect noise based on a lack of accurate measurements within a set of received sensor data. For example, the computing system may detect noise when a set of sensor data contains measurements that are inconsistent with other sensor data obtained during a similar time frame. In some examples, the computing system may detect a system failure (e.g., a single sensor temporarily dropping out) in other sets of sensor data.
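A minimal sketch of such a consistency check, flagging a displacement sample that strays from the median of recent samples; the history length and tolerance are illustrative choices:

```python
import statistics

def is_noisy(sample_m, recent_m, tolerance_m=0.3):
    """Flag a displacement sample that strays too far from the median of
    recent samples taken over a similar time frame."""
    if len(recent_m) < 3:
        return False                  # too little history to judge
    return abs(sample_m - statistics.median(recent_m)) > tolerance_m
```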
At block 522, the functions include determining a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the future position and orientation of the aircraft based on detecting noise in the second set of sensor data. In particular, the trend information may provide the computing system with additional measurements that may aid in determining the lateral displacement.
Fig. 16 shows a flowchart of an example method for use with the method 500, according to an example embodiment. At block 524, the functions include detecting a gap in the centerline of the runway using a second set of sensor data received from both the first sensor and the second sensor. The gap may represent one or more discontinuities in a surface line marking the centerline of the runway. In some examples, apparent gaps may occur due to poor lighting conditions or worn or erased markings.
At block 526, the functions include determining, using the estimated future position and orientation of the aircraft, a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway based on detecting the gap in the centerline of the runway.
Fig. 17 shows a flowchart of an example method for use with the method 500 according to an example embodiment. At block 528, the functions include identifying boundaries of the runway based on one or more airport signs.
At block 530, the function includes identifying a centerline of the runway based in part on the boundary of the runway. In some examples, the computing system may project the centerline ahead of the aircraft on the runway based on previous detections of the centerline.
As used herein, the term "substantially" or "about" means that the recited feature, parameter, or value need not be achieved exactly, but that deviations or variations (including, for example, tolerances, measurement error, measurement accuracy limitations, and other factors known to those skilled in the art) may occur in amounts that do not preclude the effect that the feature is intended to provide.
The description of the different advantageous arrangements has been presented for purposes of illustration and description and is not intended to be exhaustive or to limit the examples to the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Moreover, different advantageous examples may provide different advantages as compared with other advantageous examples. The example or examples selected are chosen and described in order to best explain the principles of the examples and their practical application, and to enable others of ordinary skill in the art to understand the disclosure with various modifications as are suited to the particular use contemplated.
According to one aspect of the present disclosure, a system is provided that includes a first sensor coupled to an aircraft at a first location; a second sensor coupled to the aircraft at a second location; a computing system configured to: receiving a first set of sensor data comprising sensor data from one or both of the first sensor and the second sensor; detecting one or more airport signs located near a runway using a first set of sensor data, wherein an aircraft is located on the runway; identifying a centerline of the runway from the one or more airport signs; receiving a second set of sensor data comprising sensor data from both the first sensor and the second sensor; determining a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the second set of sensor data, wherein the lateral displacement is determined based on a comparison of a first position of the reference point relative to the centerline of the runway as represented in the sensor data from the first sensor and a second position of the reference point relative to the centerline of the runway as represented in the sensor data from the second sensor; and providing control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft.
The system is further disclosed wherein the computing system is further configured to receive trend information representative of one or more previous movements performed by the aircraft, wherein the trend information is received from one or both of an Attitude and Heading Reference System (AHRS) and a Global Positioning System (GPS); estimating a future position and orientation of the aircraft using the trend information; and wherein the computing system is further configured to determine a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the future position and orientation of the aircraft.
The system is further disclosed wherein the computing system is also configured to detect noise in a second set of sensor data received from both the first sensor and the second sensor; and in response to detecting noise in the second set of sensor data, using the future position and orientation of the aircraft to determine a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway.
The system is further disclosed wherein the computing system is also configured to detect a gap in a centerline of the runway using a second set of sensor data received from both the first sensor and the second sensor; and in response to detecting a gap in the centerline of the runway, using the future position and orientation of the aircraft to determine a lateral displacement representing a distance between a reference point of the aircraft and the centerline of the runway.
The system is further disclosed wherein the first location is on a first wing of the aircraft such that the first sensor receives, from a first perspective, sensor data including the reference point of the aircraft and an area of the respective runway below the reference point of the aircraft, and wherein the second location is on a second wing of the aircraft such that the second sensor receives, from a second perspective, sensor data including the reference point of the aircraft and an area of the respective runway below the reference point of the aircraft.
The system is further disclosed wherein the first sensor and the second sensor comprise cameras.
The system is further disclosed wherein the reference point is a nose of the aircraft.
The system is further disclosed wherein the computing system is configured to provide control instructions indicative of one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft to the remote user interface, wherein the remote user interface is configured to display the one or more adjustments and enable remote control of the aircraft.
The system is further disclosed wherein the remote user interface includes a head mountable display.
The system is further disclosed wherein the computing system is configured to provide control instructions indicative of one or more adjustments for aligning the reference point of the aircraft with the centerline of the runway during subsequent navigation of the aircraft to the control system of the aircraft for autonomous operation.
The system is further disclosed wherein the computing system is further configured to identify an intersection of the runway based on the one or more airport markers, wherein the airport markers correspond to surface lines of the runway or markers located near the runway; and wherein the computing system is configured to identify the centerline of the runway based in part on the intersection.
The system is further disclosed wherein the computing system is further configured to identify a boundary of the runway based on the one or more airport signs; and wherein the computing system is configured to identify a centerline of the runway based in part on the boundary of the runway.
In accordance with another aspect of the present disclosure, a method is provided that includes receiving, at a computing system, a first set of sensor data, wherein the first set of sensor data includes sensor data from one or both of a first sensor coupled to an aircraft at a first location and a second sensor coupled to the aircraft at a second location; detecting, by the computing system, one or more airport signs located near a runway using the first set of sensor data, wherein the aircraft is located on the runway; identifying a centerline of a runway based on one or more airport signs; receiving, at the computing system, a second set of sensor data, wherein the second set of sensor data includes sensor data from both the first sensor and the second sensor; determining, by the computing system, a lateral displacement representing a distance between the reference point of the aircraft and the centerline of the runway using the second set of sensor data, wherein the lateral displacement is determined based on a comparison of a first position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the first sensor and a second position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the second sensor; and providing, by the computing system, control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft.
The method is further disclosed as including receiving trend information representative of one or more previous movements performed by the aircraft, wherein the trend information is received from one or both of an Attitude and Heading Reference System (AHRS) and a Global Positioning System (GPS); estimating a future position and orientation of the aircraft using the trend information; and wherein determining a lateral displacement representing a distance between the reference point of the aircraft and the centerline of the runway using the second set of sensor data further comprises using the future position and orientation of the aircraft to determine a lateral displacement representing a distance between the reference point of the aircraft and the centerline of the runway.
The method is further disclosed as including detecting noise in a second set of sensor data received from both the first sensor and the second sensor; and determining, based on detecting noise in the second set of sensor data, a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the future position and orientation of the aircraft.
The method is further disclosed as including detecting a gap in a centerline of the runway using a second set of sensor data received from both the first sensor and the second sensor; and determining, using the estimated future position and orientation of the aircraft, a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway based on detecting a gap in the centerline of the runway.
The method is further disclosed wherein providing control instructions indicative of one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft comprises providing the control instructions to a remote user interface, wherein the remote user interface is configured to display the one or more adjustments and enable remote control of the aircraft.
The method is further disclosed wherein providing control instructions indicative of one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft comprises providing the control instructions to a control system of the aircraft for autonomous operation.
The method is further disclosed as including identifying a boundary of a runway based on one or more airport signs; and identifying a centerline of the runway based in part on the boundary of the runway.
According to another aspect of the present disclosure, a non-transitory computer-readable medium is provided that is configured to store instructions that, when executed by a computing system, cause the computing system to perform operations comprising receiving a first set of sensor data, wherein the first set of sensor data comprises sensor data from one or both of a first sensor coupled to an aircraft at a first location and a second sensor coupled to the aircraft at a second location; detecting one or more airport signs located near a runway using a first set of sensor data, wherein an aircraft is located on the runway; identifying a centerline of a runway based on one or more airport signs; receiving a second set of sensor data, wherein the second set of sensor data includes sensor data from both the first sensor and the second sensor; determining a lateral displacement representing a distance between a reference point of the aircraft and a centerline of the runway using the second set of sensor data, wherein the lateral displacement is determined based on a comparison of a first position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the first sensor and a second position of the reference point of the aircraft relative to the centerline of the runway as represented in the sensor data from the second sensor; and providing control instructions that indicate one or more adjustments for aligning a reference point of the aircraft with a centerline of the runway during a subsequent navigation of the aircraft.

Claims (10)

1. A system, comprising:
a first sensor coupled to an aircraft at a first location;
a second sensor coupled to the aircraft at a second location;
a computing system configured to:
receiving a first set of sensor data comprising sensor data from one or both of the first sensor and the second sensor;
detecting one or more airport signs located near a runway using the first set of sensor data, wherein the aircraft is located on the runway;
identifying a centerline of the runway from the one or more airport signs;
receiving a second set of sensor data comprising sensor data from both the first sensor and the second sensor;
determining a lateral displacement representing a distance between a reference point of the aircraft and the centerline of the runway using the second set of sensor data, wherein the lateral displacement is determined based on a comparison of a first position of the reference point relative to the centerline of the runway as represented in sensor data from the first sensor and a second position of the reference point relative to the centerline of the runway as represented in sensor data from the second sensor; and
providing control instructions indicative of one or more adjustments for aligning the reference point of the aircraft with the centerline of the runway during a subsequent navigation of the aircraft.
2. The system of claim 1, wherein the computing system is further configured to:
receiving trend information representative of one or more previous movements performed by the aircraft, wherein the trend information is received from one or both of an Attitude and Heading Reference System (AHRS) and a Global Positioning System (GPS);
estimating a future position and orientation of the aircraft using the trend information; and
wherein the computing system is further configured to determine the lateral displacement using the future position and orientation of the aircraft, the lateral displacement representing the distance between the reference point of the aircraft and the centerline of the runway.
3. The system of claim 2, wherein the computing system is further configured to:
detecting noise in the second set of sensor data received from both the first sensor and the second sensor; and
in response to detecting the noise in the second set of sensor data, determining the lateral displacement using the future position and orientation of the aircraft, the lateral displacement representing the distance between the reference point of the aircraft and the centerline of the runway.
4. The system of claim 2, wherein the computing system is further configured to:
detecting a gap in the centerline of the runway using the second set of sensor data received from both the first sensor and the second sensor; and
in response to detecting the gap in the centerline of the runway, determining the lateral displacement using the future position and orientation of the aircraft, the lateral displacement representing the distance between the reference point of the aircraft and the centerline of the runway.
5. The system of claim 1, wherein the first location is on a first wing of the aircraft such that the first sensor receives sensor data including the reference point of the aircraft and an area of a respective runway that is below the reference point of the aircraft at a first perspective, and wherein the second location is on a second wing of the aircraft such that the second sensor receives sensor data including the reference point of the aircraft and an area of a respective runway that is below the reference point of the aircraft at a second perspective.
6. The system of claim 1, wherein the first sensor and the second sensor comprise cameras.
7. The system of claim 1, wherein the reference point is a nose of the aircraft.
8. The system of claim 1, wherein the computing system is configured to provide the control instructions indicative of the one or more adjustments for aligning the reference point of the aircraft with the centerline of the runway during a subsequent navigation of the aircraft to a remote user interface, wherein the remote user interface is configured to display the one or more adjustments and enable remote control of the aircraft.
9. The system of claim 8, wherein the remote user interface comprises a head-mountable display.
10. The system of claim 1, wherein the computing system is configured to provide the control instructions indicative of the one or more adjustments for aligning the reference point of the aircraft with the centerline of the runway during subsequent navigation of the aircraft to a control system of the aircraft for autonomous operation.
CN201910652835.8A 2018-07-19 2019-07-19 Systems, methods, and computer-readable media for autonomous airport runway navigation Active CN110751860B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/039,490 2018-07-19
US16/039,490 US10878709B2 (en) 2018-07-19 2018-07-19 System, method, and computer readable medium for autonomous airport runway navigation

Publications (2)

Publication Number Publication Date
CN110751860A true CN110751860A (en) 2020-02-04
CN110751860B CN110751860B (en) 2022-11-25

Family

ID=67226116

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910652835.8A Active CN110751860B (en) 2018-07-19 2019-07-19 Systems, methods, and computer-readable media for autonomous airport runway navigation

Country Status (3)

Country Link
US (1) US10878709B2 (en)
EP (1) EP3598418A1 (en)
CN (1) CN110751860B (en)


Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10235892B1 (en) * 2017-05-05 2019-03-19 Architecture Technology Corporation Aircraft surface state event track system and method
US11094210B2 (en) * 2018-05-09 2021-08-17 Simmonds Precision Products, Inc. Airport surface navigation aid
US11074820B2 (en) * 2018-12-19 2021-07-27 Embraer S.A. Low/No visibility takeoff system
US11094211B2 (en) * 2019-03-18 2021-08-17 Rockwell Collins, Inc. Judgmental oversteering taxi aid system and method
US11037455B1 (en) * 2019-03-18 2021-06-15 Rockwell Collins, Inc. Autonomous judgmental oversteering determination system for aircraft taxiing
US11639234B2 (en) 2019-04-24 2023-05-02 The Boeing Company Method, system and apparatus for aligning a removable sensor on a vehicle
US20220058962A1 (en) * 2020-08-20 2022-02-24 Aurora Flight Sciences Corporation, a subsidiary of The Boeing Company Navigation systems and methods for operation
US11753181B2 (en) 2021-03-30 2023-09-12 Honeywell International Inc. System and method for visual aided landing
CN113636095B (en) * 2021-10-18 2022-04-08 中国民用航空总局第二研究所 Equipment data processing method, equipment and medium for guiding aircraft
EP4386719A1 (en) * 2022-12-16 2024-06-19 Airbus Operations SAS Method and system for assisting a pilot of an aircraft during taxiing the aircraft on a taxiway of an airfield


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6405975B1 (en) 1995-12-19 2002-06-18 The Boeing Company Airplane ground maneuvering camera system
JP2004109937A (en) * 2002-09-20 2004-04-08 Fuji Xerox Co Ltd Image write-in device and record indication medium
US9371179B2 (en) * 2013-01-17 2016-06-21 Buckhorn, Inc. Collapsible nestable container

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CH449279A (en) * 1966-07-01 1967-12-31 Swissair Photo Ag Method and device for the automatic control of an aircraft equipped with an autopilot
US3521227A (en) * 1966-10-10 1970-07-21 Kaiser Aerospace & Electronics Display system for providing integrated display of aircraft information
NO972864D0 (en) * 1996-06-26 1997-06-20 Raytheon Anschuetz Gmbh Procedure for course change of a vessel over ground
US6813370B1 (en) * 1999-09-22 2004-11-02 Fuji Jukogyo Kabushiki Kaisha Lane marker recognizing apparatus
US20050062615A1 (en) * 2001-10-05 2005-03-24 Goetz Braeuchle Object sensing apparatus
US20040056952A1 (en) * 2002-09-20 2004-03-25 Kazuhide Konya Autotiller control system for aircraft utilizing camera sensing
US7382284B1 (en) * 2004-09-29 2008-06-03 Rockwell Collins, Inc. Aircraft surface operations guidance on head up display
JP2007326534A (en) * 2006-06-09 2007-12-20 Toyota Motor Corp Vehicular lane keeping support device
US20080177427A1 (en) * 2007-01-19 2008-07-24 Thales Device and method for measuring dynamic parameters of an aircraft progressing over an airport zone
US20080238718A1 (en) * 2007-03-30 2008-10-02 Hyundai Motor Company Method for preventing lane departure for use with vehicle
US20130138273A1 (en) * 2011-11-30 2013-05-30 Honeywell International Inc. System and method for aligning aircraft and runway headings during takeoff roll
US20140297168A1 (en) * 2013-03-26 2014-10-02 Ge Aviation Systems Llc Method of optically locating and guiding a vehicle relative to an airport
US20150332098A1 (en) * 2014-05-16 2015-11-19 GM Global Technology Operations LLC System and method for estimating vehicle dynamics using feature points in images from multiple cameras
WO2017000876A1 (en) * 2015-06-29 2017-01-05 Yuneec Technology Co., Limited Geo-location or navigation camera, and aircraft and navigation method therefor
RU2015129183A (en) * 2015-07-16 2017-01-18 Polzunov Altai State Technical University (AltSTU) The method of determining the position of a mobile machine on a plane
CN105374038A (en) * 2015-11-11 2016-03-02 Shijiazhuang Tiedao University Car-mounted double-camera locomotive wheel-track displacement image detection system and detection method
US9594372B1 (en) * 2016-01-21 2017-03-14 X Development Llc Methods and systems for providing feedback based on information received from an aerial vehicle
CN106256606A (en) * 2016-08-09 2016-12-28 Zhejiang Leapmotor Technology Co., Ltd. Lane departure warning method based on a vehicle-mounted binocular camera
CN106295560A (en) * 2016-08-09 2017-01-04 Zhejiang Leapmotor Technology Co., Ltd. Lane keeping method based on a vehicle-mounted binocular camera and staged PID control
CN107817018A (en) * 2016-09-12 2018-03-20 Volvo Car Corporation Test system and test method for a lane departure warning system
US20180091797A1 (en) * 2016-09-27 2018-03-29 The Boeing Company Apparatus and method of compensating for relative motion of at least two aircraft-mounted cameras
CN108052107A (en) * 2018-01-19 2018-05-18 Zhejiang Ketai Robot Co., Ltd. AGV indoor/outdoor combined navigation system and method fusing magnetic strips, magnetic nails, and inertial navigation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIE Chunsheng et al., "Optimization of one-engine-inoperative go-around emergency procedures based on PEP," Journal of Civil Aviation University of China *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112130946A (en) * 2020-09-22 2020-12-25 西安宇视信息科技有限公司 Aircraft information display method and device, electronic equipment and storage medium
CN112130946B (en) * 2020-09-22 2024-03-26 西安宇视信息科技有限公司 Airplane information display method and device, electronic equipment and storage medium
CN113932804A (en) * 2021-09-17 2022-01-14 四川腾盾科技有限公司 Positioning method combining airport runway vision and GNSS/inertial navigation
CN116382351A (en) * 2023-06-05 2023-07-04 四川腾盾科技有限公司 Autonomous obstacle avoidance method for large fixed-wing unmanned aerial vehicle
CN116382351B (en) * 2023-06-05 2023-08-18 四川腾盾科技有限公司 Autonomous obstacle avoidance method for large fixed-wing unmanned aerial vehicle

Also Published As

Publication number Publication date
CN110751860B (en) 2022-11-25
EP3598418A1 (en) 2020-01-22
US20200027362A1 (en) 2020-01-23
US10878709B2 (en) 2020-12-29

Similar Documents

Publication Publication Date Title
CN110751860B (en) Systems, methods, and computer-readable media for autonomous airport runway navigation
US11585951B1 (en) Heading or pitch determination systems and methods with high confidence error bounds
US11959771B2 (en) Creation and use of enhanced maps
US9260180B2 (en) Autonomous and automatic landing method and system
KR101454153B1 (en) Navigation system for unmanned ground vehicle by sensor fusion with virtual lane
US9939818B2 (en) Method and system for automatic autonomous landing of an aircraft
US10118614B2 (en) Detailed map format for autonomous driving
EP1709611B1 (en) Automatic taxi manager
EP2118713B1 (en) Precision approach control
EP2589538B1 (en) Display device, piloting assistance system, and display method
US8073584B2 (en) Method for measuring dynamic parameters of an aircraft progressing over an airport zone
KR102483714B1 (en) Image sensor-based autonomous landing
US11725940B2 (en) Unmanned aerial vehicle control point selection system
US10643481B2 (en) Method and a device for avoiding an object by detecting its approach to an aircraft
US20210365038A1 (en) Local sensing based autonomous navigation, and associated systems and methods
US20090024311A1 (en) Method and apparatus for displaying terrain elevation information
RU2703412C2 (en) Automatic aircraft landing method
US11915603B2 (en) Docking guidance display methods and systems
US11754415B2 (en) Sensor localization from external source data
TWI725611B (en) Vehicle navigation switching device for golf course self-driving cars
Xia et al. Integrated emergency self-landing method for autonomous uas in urban aerial mobility
JP7488063B2 (en) Navigation performance of urban air vehicles.
Durrie et al. Vision-aided inertial navigation on an uncertain map using a particle filter
EP3926608A2 (en) Docking guidance display methods and systems
US20230023069A1 (en) Vision-based landing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant