EP1661117A2 - Display systems for a device - Google Patents

Display systems for a device

Info

Publication number
EP1661117A2
Authority
EP
European Patent Office
Prior art keywords
display
processor
windows
information
aircraft
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP04817726A
Other languages
German (de)
English (en)
Other versions
EP1661117A4 (fr)
Inventor
Barry Berson
Bialecki Larry
Buck Peter
Morgenstern John
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Supersonic Aerospace International LLC
Original Assignee
Supersonic Aerospace International LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/615,634 external-priority patent/US7486291B2/en
Priority claimed from US10/616,145 external-priority patent/US7312725B2/en
Priority claimed from US10/619,848 external-priority patent/US6905091B2/en
Priority claimed from US10/706,672 external-priority patent/US7982767B2/en
Application filed by Supersonic Aerospace International LLC filed Critical Supersonic Aerospace International LLC
Publication of EP1661117A2 publication Critical patent/EP1661117A2/fr
Publication of EP1661117A4 publication Critical patent/EP1661117A4/fr


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C23/00 Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • G01C23/005 Flight directors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D43/00 Arrangements or adaptations of instruments
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras

Definitions

  • Many devices, such as aircraft, are typically designed to provide a view of the out-the-window scene for at least one operator to operate the device.
  • A view of the scenery outside the device was traditionally provided through passive means, such as a cockpit windshield, or through artificial means such as sensors and displays.
  • Synthetic Vision Systems (SVS) present a completely artificial computer-generated view of the external environment to the crewmember(s).
  • SVS displays are typically based on static geographical and cultural data supplemented by dynamic traffic information.
  • Some implementations of SVS use Global Positioning Satellite (GPS) data to register the database information dynamically to the aircraft's position and altitude. Supplemental sensors may be used to confirm the GPS position data or provide additional data (e.g., other aircraft, weather events, ground equipment).
  • SVS can use both head-up and head-down displays. Displays typically include an artificial out-of-the-window view out the front and to the sides of the aircraft, and/or any number of symbolic and map presentations.
  • Enhanced Vision Systems (EVS) include sensors that can detect and display images of objects that pilots would not normally be able to see when looking through the cockpit window of an aircraft.
  • EVS can present data from sensors that can penetrate low-visibility weather conditions and darkness, such as RADAR or forward-looking infrared (FLIR).
  • The data presented from the sensors is derived from the current environment, not from a computer database.
  • EVS can be used on both head-down and head-up displays.
  • Other features such as navigation enhancements and proactive systems to avoid controlled flight into terrain and runway incursions can also be integrated in EVS.
  • The FAA requires aircraft to provide out-the-window viewing capability with specified horizontal and vertical fields of view.
  • The configuration of aircraft designed for optimum performance at conditions such as supersonic flight can include a long, pointed nose for drag reduction.
  • Most contemporary supersonic aircraft designs feature a modified delta wing optimized for high-speed flight, which results in high angles of attack at lower speeds. The long nose and high angle of attack at low airspeeds impair the FAA-required forward visibility of the flight crew during some phases of operation.
  • One solution to reduced cockpit out-the-window visibility includes a movable nose cone, such as the droop-nose design of the Concorde aircraft.
  • A mechanical system with actuators allows the crew to move the aircraft nose from a cruise position to a "drooped" position for takeoff, landing, and ground operation.
  • The droop-nose configuration, however, requires additional weight and space for the actuator system, and increases the complexity of the aircraft.
  • Still another solution for enabling the pilot to see outside the airplane during approach and landing is to include cockpit windows at the lower front fuselage of the aircraft, instead of, or in addition to, the traditional location on the upper front fuselage.
  • Such a configuration provides a window for each crewmember to view a limited portion of the runway during landing, as disclosed in U.S. Patent No. 5,351,898 issued to Michael S. Koehn.
  • Drawbacks associated with the configuration include increased drag due to the opening(s) in the bottom of the nose of the aircraft, and the loss of space in the nose for other aircraft components.
  • The windows provide very narrow horizontal and vertical fields of view, which can impair the pilot's depth perception through lack of spatial references.
  • Shock waves, and thus sonic booms, are fundamental to supersonic flight and can be minimized, but not eliminated, on aircraft that generate lift forces during flight.
  • A significant finding from past sonic boom studies is that startle, rattle, and building vibrations (which can cause damage) are key elements in determining the response of the public to sonic booms. Pressure disturbances of less than 1.0 lb/ft² produce less startle, rattle, and building vibration.
  • NASA's High Speed Research Program identified three key requirements for overland supersonic flight: (1) establishing the criteria for an acceptable "shaped" sonic boom signature, (2) designing a viable aircraft to produce that shaped signature, and (3) quantifying the influence of the atmosphere on such signatures.
  • A display system for operating a device receives images from a first sensor and a second sensor that represent scenery outside the device.
  • The display system is configured to detect moving objects in the images, as well as to fuse the images to a single viewpoint.
  • The fused image is transformed to a first viewpoint image for a first operator station in the device, and a second viewpoint image for a second operator station in the device.
  • The combined sensor image and symbols are output to a display device that is positioned to provide a portion of the out-the-window field of view to the operator.
  • The entire desired field of view for the operator is provided by the display device in combination with the out-the-window scene available through windows of the device.
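The fuse-then-transform pipeline described in the bullets above can be sketched as follows. Everything here is an illustrative stand-in, not the patent's implementation: the blending weight, the column-shift "viewpoint" transform (a real system would register the sensors and warp with a full perspective model), and the frame sizes are all assumptions.

```python
def fuse(img_a, img_b, w=0.5):
    """Blend two registered sensor frames (e.g., FLIR and camera) pixel by
    pixel into a single composite image (lists of rows of intensities)."""
    return [[w * a + (1 - w) * b for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

def to_viewpoint(img, shift):
    """Crude stand-in for a perspective re-projection: rotate columns toward
    an operator station. A real system would warp with a homography."""
    return [row[-shift:] + row[:-shift] if shift else list(row) for row in img]

def overlay(img, symbols):
    """Add symbology intensities onto the fused image, clipped to [0, 1]."""
    return [[min(1.0, max(0.0, p + s)) for p, s in zip(rp, rs)]
            for rp, rs in zip(img, symbols)]

# Two small synthetic 'sensor' frames, fused and re-projected per station
flir = [[0.2] * 6 for _ in range(4)]
cam = [[0.6] * 6 for _ in range(4)]
fused = fuse(flir, cam)
pilot_view = to_viewpoint(fused, 1)      # first operator station offset
copilot_view = to_viewpoint(fused, -1)   # second operator station offset
```

In practice each view would then be passed through `overlay` with the symbology layer before being sent to the display device.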
  • The display system generates a plurality of mutually exclusive windows on a display device.
  • One or more of the windows includes a common user interface and a common display area for a subset of at least two of the windows.
  • The system receives information regarding current flight conditions of the device, such as an aircraft, and determines the acoustic level of the sonic boom and/or other noise generated by the device during operation. The current acoustic level is compared to a desired level, and cues are displayed to operators regarding corrective actions that can be taken to reduce the acoustic level or maintain it at the desired level.
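The compare-and-cue step can be sketched as below. The 1.0 lb/ft² target echoes the sonic boom studies cited earlier, but the tolerance band and the cue wording are hypothetical, not taken from the patent:

```python
def boom_cue(current_psf, target_psf=1.0, tolerance=0.05):
    """Compare the predicted ground-level overpressure (lb/ft^2) with the
    desired level and return a corrective cue string for the crew.
    Thresholds and wording are illustrative only."""
    if current_psf > target_psf * (1 + tolerance):
        return "REDUCE: climb or slow to lower boom overpressure"
    if current_psf < target_psf * (1 - tolerance):
        return "MARGIN: boom below target level"
    return "MAINTAIN: boom at desired level"

print(boom_cue(1.2))   # overpressure exceeds the 1.0 lb/ft^2 target
```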
  • A protective housing encloses the sensors. The housing includes a transparent aperture through which the sensor captures images. A cleaning mechanism removes obstructions from the transparent aperture in order to provide continuous images, representing scenery outside the device, on an operator display.
  • FIG. 1A is a diagram of an embodiment of components that can be included in a display system for a variety of types of devices, with an option to display acoustic pressure level information overlaid on a map of the area along the aircraft's planned flight route;
  • FIG. 1B is a front-view diagram of a particular embodiment of the display system of FIG. 1A;
  • FIG. 1C is a side-view diagram of the particular embodiment of the display system of FIG. 1A;
  • FIG. 1D is a diagram of an embodiment of a selectable window-resizing feature that allows a user to change the size of display windows;
  • FIG. 1E is a diagram of the display of FIG. 1D with windows that have been resized using the selectable window-resizing feature;
  • FIG. 1F is a flow diagram of an embodiment of a process for configuring a user's display;
  • FIG. 1G is a diagram of an embodiment of components that can be included in the display system of FIG. 1A;
  • FIG. 2 is a diagram of an embodiment of an aircraft with a structural outer mold line that obscures a portion of a crewmember's required field of view;
  • FIG. 3 is a side view diagram of the aircraft of FIG. 2;
  • FIG. 4 is a diagram of an embodiment of a display device positioned to provide a required field of view for a crewmember in combination with out-the-window fields of view from cockpit windows;
  • FIG. 5 is a diagram of an embodiment of components and processes that can be included in a display system for a variety of types of devices;
  • FIG. 6 is a diagram of an embodiment of an avionics display generated by the display system of FIG. 1A;
  • FIG. 7A is a diagram of an embodiment of another avionics display generated by the display system of FIG. 1A;
  • FIG. 7B is a diagram of an embodiment of a display option tree that allows each crewmember to access increasingly detailed levels of information in common windows of the displays of FIG. 1A;
  • FIG. 8A is a diagram depicting a perspective view of the footprint of a sonic "carpet" boom generated by an aircraft traveling at supersonic speed;
  • FIG. 8B is a front view of the carpet boom depicted in FIG. 8A and a graph indicating the relative strength of the sonic boom at various distances from the aircraft;
  • FIG. 8C is a side view of a focused carpet boom;
  • FIG. 9 is a flow diagram of an embodiment of acoustic level cueing logic;
  • FIG. 10A is a diagram of an embodiment of an aircraft display that can be utilized to provide information to crewmembers regarding acoustic pressure levels at ground level caused by their aircraft during supersonic flight;
  • FIG. 10B is a diagram of another embodiment of an aircraft display that can be utilized to provide information to crewmembers regarding acoustic pressure levels at ground level caused by their aircraft during supersonic flight;
  • FIG. 11 is a diagram of another embodiment of an aircraft display that can be utilized to provide information to crewmembers regarding acoustic pressure levels at ground level caused by their aircraft during supersonic flight;
  • FIG. 12A provides an isometric view of the underside of the nose of an aircraft with sensors mounted thereon;
  • FIG. 12B provides a side view of the nose of an aircraft with sensors mounted thereon;
  • FIG. 12C depicts a cross section of the nose of the aircraft depicted in FIG. 1; and
  • FIG. 12D provides a partial cross section of one embodiment of the sensor system.
  • FIG. 1A shows a diagram of an embodiment of a display system 100 with a configuration of displays 102, 104, 106, in which one or more of displays 102, 104 can be partitioned into several mutually exclusive display areas, referred to as display windows 1A through 3B.
  • Various types of information can be displayed in windows 1A through 3B based on the types of functions being performed by components in display system 100, devices being monitored via display system 100, and functions performed by devices that communicate with display system 100.
  • FIGS. 1B and 1C show front and side view diagrams of a particular geometric configuration of displays 102, 104, 106 that can be included in an embodiment of display system 100 of FIG. 1A.
  • The design eye represents the position of each crewmember's eye.
  • The design eye refers to the 100th percentile (eye height) pilot's eye position, with the seat at its neutral seat reference point, when the pilot is seated at the location typically used to fly the aircraft.
  • Current design practices use a design eye ellipse or line that enables 5th through 95th percentile operators to have the same over-the-nose vision capability. This allows larger pilots to sit higher up and further away from the main instrument panel, etc.
  • Displays 102, 104, 106 can be flat panel display devices, with active display areas of 18 by 11 inches, 15 by 12 inches, and 8 by 8 inches, respectively. Other display types and sizes can be utilized.
  • The active areas of displays 102, 104, 106 provide a -20 to +20 degree horizontal field of view, and a -10 to -30 degree vertical field of view, twenty-four (24) inches from each crewmember's design eye.
  • The outline of display 102 from the pilot's design eye is shown in FIG. 1B.
  • Display system 100 (FIG. 1A) can be utilized with other displays 102, 104, 106, and crewmember position configurations.
  • Display management functions can cause certain types of information to be automatically assigned to a particular window of display 102 based on the purpose for which display 102 is being used. For example, the Federal Aviation Administration requires certain primary flight information to be available to the crewmembers at all times during flight. Accordingly, the primary flight information can be automatically placed in fixed locations on display 102, depending on the aircraft's flight phase and the role being performed by a particular crewmember. Under certain circumstances, each crewmember can then configure the remaining windows of display 102 based on their own preferences and/or current operational requirements.
  • Display processor 108 presents a default window configuration for each crewmember based on the crewmember's role and the operational state of the aircraft.
  • Display 102 can include options to select between different roles that a crewmember can assume. For example, a crewmember can choose between roles such as primary pilot, co-pilot, flight engineer, maintenance personnel, or flight attendant. When a crewmember switches from one role to another, the default configuration for the new role can automatically replace the information displayed on the crewmember's display 102.
  • The default window configurations can be preset by the crewmember to display the information and options that the crewmember prefers when performing each particular role. A default set of options and information can be presented that is most likely to be useful to the crewmember assuming the role.
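A minimal sketch of role-based defaults with crewmember overrides. The role names follow the patent's examples, but the window names and content assignments below are hypothetical, invented for illustration:

```python
# Hypothetical role-to-window defaults; window labels follow the patent's
# 1A..3B layout, but the content assignments are illustrative only.
DEFAULTS = {
    "primary pilot": {"1A": "primary flight info", "2A": "HSID", "2B": "HUD"},
    "flight engineer": {"1A": "engine status", "2A": "fuel system", "3A": "hydraulics"},
}

def configure(role, overrides=None):
    """Start from the role's default window configuration, then apply any
    crewmember-preferred overrides for windows that are not fixed."""
    config = dict(DEFAULTS.get(role, {}))
    config.update(overrides or {})
    return config

cfg = configure("primary pilot", {"3B": "weather"})
```

Switching roles would simply re-run `configure` with the new role, replacing the displayed configuration as the text describes.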
  • The embodiment of display system 100 in FIG. 1A includes display processor 108, which receives inputs from the subsystems and sensors 110 and the crewstation(s), including positions of switches and knobs (not shown), control sticks 112, 114, throttle levers 116, 118, and rudder pedals (not shown).
  • Displays 102, 104, 106, control sticks 112, 114, and throttle levers 116, 118 can include a variety of switches for controlling the operational modes of displays 102, 104, 106, and subsystems and sensors 110.
  • Display system 100 communicates with displays 102, 104, 106 and includes logic instructions to generate the displays.
  • Display system 100 can be configured to communicate with external systems via network 116.
  • Various input/output devices, such as a keyboard, mouse, touchscreen, light pen pointing device, printer, speakers, light indicators, and a voice recognition and response system, can also be included to allow a user to interact with displays 102, 104, 106.
  • Display processor 108 can be any suitable computer processing device that includes memory for storing and executing logic instructions and is capable of interfacing with display devices 102, 104, 106.
  • Display system 100 can be embodied in any suitable computing device, including personal data assistants (PDAs), telephones with display areas, network appliances, desktops, laptops, X-window terminals, or other suitable computing devices.
  • Display system 100 and corresponding logic instructions can be implemented using any suitable combination of hardware, software, and/or firmware, such as microprocessors, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), or other suitable devices.
  • Other devices, such as an audio system, a data storage drive, and other input/output devices, can also be connected to display system 100.
  • Application software with logic instructions that are executed by display system 100 can be stored on a computer-readable medium, or accessed in the form of electronic signals modulated in accordance with the application and data communication technology over a wide or local area network.
  • Display system 100 can be configured to connect to a network via a suitable communication link such as a T1, ISDN, or cable line; a wireless connection through a cellular or satellite network; or a local data transport system such as Ethernet or token ring over a local area network.
  • Display processor 108 can include logic to determine whether the modes requested by the crewmembers are permitted based on the current mode of the components. Display processor 108 also receives data from subsystems and sensors 110 and other aircraft components, as well as the operational mode of subsystems and sensors 110, and generates displays and any other indicators, such as lights and sounds. Mode control and option selections are also output from display processor 108 to control the operational modes of the various subsystems and sensors 110.
  • One or more of windows 1A through 3B can be designated as common windows associated with a subset of two or more of the remaining windows 1A, 1B, 2A, 2B, 3A.
  • Window 1C can be a common window associated with windows 1A and 1B;
  • window 2C can be a common window associated with windows 2A and 2B; and
  • window 3B can be independent of the other windows 1A through 3A.
  • Other arrangements and combinations of windows and common windows can be utilized based on the types of information a user will be viewing, and the utility of having a common window 1C, 2C associated with two or more of the other windows 1A, 1B, 2A, 2B, 3A.
  • An option area 120, 122, 124 can be associated with each common window 1C, 2C, and window 3B, respectively, to allow the user to customize the information in windows 1C, 2C, and 3B.
  • The capability to customize the information in one or more of windows 1C, 2C, and 3B provides user-configurable workspace on display 102 while retaining the information in the other windows.
  • Options area 120 can include several options for displaying more detailed information in common window 1C that is related to the information displayed in either of windows 1A or 1B.
  • Common window 1C can include a scroll feature to allow the user to scroll through the detailed information, while the less detailed information continues to be displayed in windows 1A and/or 1B.
  • Display processor 108 can be configured to determine when functions being performed in display system 100 are in a predefined state. Display processor 108 can also monitor the operational status of various components in display system 100. When display processor 108 detects one of the predefined states, relevant information regarding the state can be presented in one of windows 1A, 1B, 2A, 2B, 3A, or 3B. In some circumstances, the crewmember cannot change the information regarding the state while the aircraft remains in the state. The crewmember can, however, choose options associated with the respective window 1C, 2C, or 3B to retrieve more information about the detected state, as well as information related to the information in the other associated windows 1A, 1B, 2A, 2B, 3A.
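The pinned-state behavior, where detected state information cannot be replaced by the crewmember while the state persists, might look like the following sketch. The class and method names are invented for illustration:

```python
class StateMonitor:
    """Sketch: while a predefined state (e.g., a caution) is active, the
    associated window's content is pinned and crew edits are rejected."""

    def __init__(self):
        self.pinned = {}            # window label -> active state info

    def detect(self, window, info):
        """A predefined state was detected: pin its info to a window."""
        self.pinned[window] = info

    def clear(self, window):
        """The state has ended: the window becomes configurable again."""
        self.pinned.pop(window, None)

    def set_content(self, window, content, config):
        """Crew request to change a window; refused while state info is pinned."""
        if window in self.pinned:
            return False            # state info cannot be replaced while active
        config[window] = content
        return True
```

A crew request to repurpose a pinned window simply fails until `clear` is called for that window.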
  • Display 102 covers the entire visible display area of the display device. Additionally, windows 1A through 3B do not overlap each other, thereby providing the user with an unobstructed view of all the information on display 102.
  • A window-resizing feature can be included in some embodiments to allow the user to change the size of display windows 1A through 3C.
  • The resizing feature can be implemented in any suitable manner and on any one window or group of windows. For example, FIGS. 1D and 1E show nodes 126 around the edges and/or at the corners of windows 1C, 2C, and 3C.
  • The user selects node 126 with cursor 128 and uses a mouse to drag node 126 across display 102 until window 3C is the desired size.
  • The sizing feature can also include logic to automatically resize the width and height of surrounding windows 2A, 2B, 2C, and 3B to prevent overlap among windows 1A through 3C as the user changes the size of the selected window.
  • A selectable option 130 can be displayed to allow the user to quickly restore windows 2A, 2B, 2C, and 3B to their previous size, or to a default size.
  • Upper and lower size limits can be imposed to prevent the user from resizing windows 1A through 3C outside a predetermined range of sizes.
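The neighbor-aware resizing with size limits can be sketched in one dimension. The limits and window widths below are made-up values; a full implementation would apply the same clamping to both width and height across all affected windows:

```python
MIN_W, MAX_W = 4.0, 12.0   # illustrative per-window size limits, inches

def resize(selected_w, neighbor_w, new_w):
    """Resize the selected window's width, clamped to [MIN_W, MAX_W], and
    shrink or grow its horizontal neighbor so the two never overlap and
    always fill the same total width (no gaps, per the non-overlap rule)."""
    total = selected_w + neighbor_w
    new_w = max(MIN_W, min(MAX_W, new_w))             # enforce size limits
    new_w = max(MIN_W, min(total - MIN_W, new_w))     # leave room for neighbor
    return new_w, total - new_w
```

Dragging a node past the limit simply pins the window at `MAX_W` (or `MIN_W`), with the neighbor absorbing the difference.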
  • The selectable features in option areas 120, 122, 124 that allow the crewmember to customize windows 1C, 2C, and 3B can be implemented in any suitable manner, such as computer-generated graphic features that are selectable via a touch screen overlaid on display 102, a movable cursor on display 102, and/or hardware features such as pushbutton switches mounted adjacent to display 102.
  • The selectable options to customize common windows 1C, 2C, and 3B can be located on other components of display system 100, or other suitable areas, that are accessible by a user.
  • A voice recognition system can be included to interact with displayed information.
  • Display system 100 can be used by one or more operators to monitor a system or device. While an embodiment of an aircraft avionics system is shown in FIG. 1A, display system 100 can be configured for use with other types of systems and devices besides aircraft.
  • The embodiment of display system 100 shown can include a variety of sensors and subsystems 110 such as a Digital Map Set (DMS) 132, communications systems (COMM) 134, a cautions and warnings system (ICAW) 136, terrain clearance and avoidance (TCAS) subsystems 138, an air data computer (ADC) 140, a flight incident recorder and monitoring set (FIRAMS) 142, and an automatic flight control system (AFCS) 144.
  • Central computer 146 receives inputs from the avionics components and the crew station(s) including positions of switches, control sticks 112, 114, throttle levers 116, 118, and rudder pedals (not shown). Displays 102, 104, 106, control sticks 112, 114, and throttle levers 116, 118 can include a variety of switches for controlling the operational modes of displays 102, 104, 106, and aircraft subsystems. Central computer 146 includes logic to determine whether the modes requested by the crewmembers are permitted based on the current mode of the components.
  • Central computer 146 also provides data from sensor systems 110 and other avionics components, as well as the operational mode of the avionics components, to display processor 108, which generates output to displays 102, 104, 106 and any other indicators, such as lights and sounds. Mode control and option selections are also output from central computer 146 to various avionics components to control their operational modes.
  • Avionics system 100 can be equipped with additional or fewer subsystems, based on the use of the aircraft or device. Further, redundant subsystems and processing systems can be included to provide backup capability.
  • FIG. 1F is a flow diagram of an embodiment of a process 150 for initially configuring display 106.
  • The user is requested to log in at process 152.
  • Display processor 108 determines whether any window configurations were previously stored for the user.
  • Process 154 presents option 156 to allow the user to select from among existing configurations, and option 158 to allow the user to enter a new window configuration. If the user selects an existing configuration, process 168 displays the selected configuration on display 106.
  • If the user elects to enter a new configuration, process 160 presents options in display 106 to configure a layout for the windows.
  • Process 160 can utilize any suitable user interface to allow the user to enter the number, size, and location of each window in display 106.
  • Process 162 then presents options in display 106 to select the information to be displayed in each window. Combinations of information from various subsystems can be displayed within the same window or in separate windows. For example, video images from a video camera can be overlaid with textual information from other sensors.
  • A user can also designate a window as being "common" to two or more of the other windows, such as common window 1C associated with windows 1A and 1B.
  • Window 2B in display 102 can be configured automatically with a head-up display (HUD), and window 2A can be configured with a horizontal situation indicator display (HSID). Each crewmember can then configure the remaining windows of display 102 based on their own preferences.
  • Process 166 allows the user to assign a name to the configuration and save the configuration.
  • The capability to customize the information in one or more windows in display 102 allows operators to configure a display that is most useful to them.
  • Login information can be shared across display systems 100 via a network to allow the user to access saved configurations from any user station connected to the network.
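The FIG. 1F login flow (log in, offer saved configurations if any exist, otherwise accept and save a new layout) can be sketched as below. The storage dictionary, user names, and configuration names are hypothetical:

```python
# Hypothetical per-user store of named window configurations
SAVED = {"alice": {"approach": {"1A": "HUD", "2A": "HSID"}}}

def login_flow(user, choice=None, new_config=None, name=None):
    """Mirror of the FIG. 1F flow: on login, return a selected saved
    configuration if one exists; otherwise accept, name, and save a new
    layout; failing both, fall back to any previously saved layout."""
    existing = SAVED.get(user, {})
    if choice and choice in existing:
        return existing[choice]               # display selected configuration
    if new_config is not None:
        existing[name or "default"] = new_config
        SAVED[user] = existing                # persist for future logins
        return new_config
    return next(iter(existing.values()), {})  # any saved layout, or empty
```

Because `SAVED` stands in for a networked store, the same dictionary shared across stations models the "access saved configurations from any user station" behavior.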
  • Display 102 is sized to accommodate the amount of information that must be available to the operators responsible for monitoring and controlling a system. Further, the information is scaled so that it can be easily read by the operator. For example, in an aircraft, each crewmember can be provided with at least one 18 by 11 inch display area 104 on a flat panel display device 102. In other systems, display devices 102 with more or less display area 104 can be utilized.
  • A default window configuration can be displayed for each operator based on the operator's role.
  • Display 102 can include options to select between different roles that an operator can assume. For example, an operator can choose between roles such as primary pilot, co-pilot, flight engineer, maintenance personnel, or flight attendant.
  • The default window configuration for the new role can automatically replace the information displayed on the operator's display 102.
  • The default window configurations can be preset by the operator to display the information and options that the operator prefers when performing each particular role. A default set of options and information can be presented that is most likely to be useful to the operator assuming the role. Further, some roles may have access to control and monitor subsystems that other roles do not.
  • Login information can be used to associate a particular operator with the role(s) they are allowed to assume, thereby helping to prevent unauthorized access to various subsystems.
  • A time-out feature can also be included to automatically log the operator off the system after a pre-specified period of non-use.
  • An example of a group of sensor systems 110 that can be included in avionics system 100 includes flight reference sensors; electrical system sensors; control surface position sensors; image sensors including RADAR, FLIR, and video camera(s); navigation sensors including TACAN, AHRS, INS, ILS, and GPS sensors; and sensors that provide information regarding the engine, throttle position, fuel system, landing gear, and hydraulics system.
  • Flight reference sensors provide information regarding aircraft angle of attack; ambient air temperature outside the aircraft; a pitot-static system to measure aircraft velocity from the pitot and static pressures surrounding the aircraft; an indicated airspeed (IAS) value based on the pitot and static pressures; a backup pressure altimeter value based on static pressure; and a vertical speed (climb or dive rate) indicator value based on changes in static air pressure.
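The IAS relationship mentioned above can be illustrated with the simple incompressible approximation; real air data computers use the compressible calibrated-airspeed formula, so this is a low-Mach sketch only:

```python
import math

RHO0 = 1.225  # sea-level standard air density, kg/m^3

def indicated_airspeed(pitot_pa, static_pa):
    """Incompressible approximation: airspeed (m/s) from pitot (total)
    and static pressure in Pa, via v = sqrt(2 * q / rho0), where q is
    the impact (dynamic) pressure. Valid only at low Mach numbers."""
    q = max(pitot_pa - static_pa, 0.0)   # impact pressure, floored at zero
    return math.sqrt(2.0 * q / RHO0)
```

For example, an impact pressure of 612.5 Pa gives about 31.6 m/s. The same static-pressure inputs feed the backup altimeter and vertical-speed values the bullet describes.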
  • Electrical system sensors supply information regarding voltages, frequency, power supply status, AC-DC transformer/rectifier status, engine start indicators, and backup batteries.
  • the Radio Detection and Ranging (RADAR) sensor provides Air-to-air (A/A) and Air-to-ground (A/G) modes for object detection, designation, tracking and navigation.
  • the RADAR sensor also provides terrain avoidance for low level navigation, a detection and tracking capability for moving and stationary surface objects, precision velocity measurement for navigation update, and weather information.
  • the Forward Looking Infrared (FLIR) sensor provides thermal imagery of the surrounding environment in television format.
  • the video camera(s) provide a visual out-the-window scene.
  • the image data from the sensors can be fused to form a composite image of the out-the-window scene.
  • the fused image can be combined with display symbology to provide an enhanced vision display for the crewmembers.
  • Navigation subsystems typically include Tactical Air Navigation (TACAN), which is used to determine the relative bearing and slant range distance to a TACAN ground station.
  • TACAN Tactical Air Navigation
  • the TACAN is also used as a source to keep aircraft present position and update the aircraft present position being kept by another source, such as the inertial navigation system (INS) or air data computer (ADC) 140.
  • the INS subsystem is a self-contained, fully automatic dead reckoning navigation system.
  • the INS can be closely coupled to the Global Positioning System (GPS) to provide highly accurate aircraft present position and velocity data.
  • GPS Global Positioning System
  • the INS detects aircraft motion (acceleration and attitude) and provides acceleration, velocity, present position, pitch, roll, and true heading to related systems.
  • the GPS is a space satellite based radio navigation system that provides continuous, all weather, passive operation to an unlimited number of users anywhere on the earth.
  • the Attitude Heading Reference system (AHRS) is a self-contained attitude reference system, which provides backup pitch, heading, and roll attitude for use by other subsystems.
  • the Instrument Landing System (ILS) is an all weather runway landing approach guidance system. The ILS decodes transmitted azimuth and elevation signals during an aircraft approach and provides steering information to be displayed on the Head-Up Display (HUD), the Horizontal Situation Indicator Display (HSID), and/or other appropriate displays.
  • HUD Head-Up Display
  • HSID Horizontal Situation Indicator Display
  • Aircraft communication subsystems (COMM) 134 typically include Very-High Frequency/Ultra-High Frequency (VHF/UHF) communication systems to provide air-to-air and air-to-ground communications.
  • VHF/UHF Very-High Frequency/Ultra-High Frequency
  • IAS Intercommunication and Audio System
  • CNI communication, radio navigation, and identification
  • Other communication systems, such as a satellite communication system and high frequency radio systems, among others, can also be included.
  • ICAW system 136 filters extraneous messages to inform crewmembers of specific problems. For example, when an engine fails, the generator and hydraulic cautions normally associated with an engine being shutdown are suppressed, and the crewmembers are provided the specific problem in the form of an engine shutdown message.
  • the Traffic Alert and Collision Avoidance System, or TCAS 138 is an instrument integrated into other systems in an aircraft cockpit. TCAS 138 includes a display showing the relative positions and velocities of aircraft and issues an alarm when another aircraft is on a path to pass within a predetermined range of the subject aircraft.
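The proximity-alarm behavior described for TCAS 138 can be illustrated with a short sketch. This is not the certified TCAS logic; it is a simplified closest-point-of-approach check in two dimensions, and all names, units, and thresholds are illustrative assumptions:

```python
def tcas_alert(rel_pos, rel_vel, alert_range, horizon):
    """Simplified proximity check: given an intruder's position (m) and
    velocity (m/s) relative to own aircraft, find the time of closest
    approach within the look-ahead horizon (s) and alarm if the miss
    distance falls inside the predetermined alert range (m)."""
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    # Time of closest approach, clamped to [0, horizon].
    t = 0.0 if v2 == 0 else max(0.0, min(horizon, -(px * vx + py * vy) / v2))
    dx, dy = px + vx * t, py + vy * t
    miss = (dx * dx + dy * dy) ** 0.5
    return miss <= alert_range, t, miss

# Intruder 10 km ahead, closing head-on at 200 m/s: zero miss distance in 50 s.
print(tcas_alert((10_000.0, 0.0), (-200.0, 0.0), 1_000.0, 60.0))
```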
  • Air Data Computer (ADC) 140 receives inputs from various aircraft sensors. Any errors in these inputs are corrected in the ADC and the corrected signals are used to compute accurate air data and magnetic heading information. ADC outputs are used for primary flight data displays, navigation, altitude reporting, environment control, and unsafe landing warning.
  • the Flight Incident Recorder and Monitoring System (FIRAMS) 142 monitors engine and airframe operational status for component failures and caution/advisory conditions when the central computer 146 is operating. If the central computer 146 detects a component failure, central computer 146 commands the FIRAMS 142 to store the applicable maintenance code. When the central computer 146 detects specific unit failures, it commands the FIRAMS 142 to store significant maintenance data and selected tactical information in a memory device.
  • FIRAMS Flight Incident Recorder and Monitoring System
  • the automatic flight control system (AFCS) 144 provides autopilot and automatic throttle control (ATC) mode commands to actuators connected to the control surfaces and throttle levers.
  • the autopilot mode maintains a constant heading, altitude, speed, and/or attitude.
  • the ATC mode positions the engine throttle levers and power lever control to maintain a constant angle of attack during landing, with approach power compensation, or constant airspeed during flight with a velocity control system.
  • a flight management system (FMS) can be integrated with the AFCS 144 to allow the crew to select options to fly the most economical flight profile or choose the fastest route to a destination. As the flight proceeds, the FMS can track fuel-burn and winds, update estimated flight time, and automatically change navigation and communication radio frequencies.
  • the FMS can control the flight from takeoff through landing, and perform navigation functions including determining waypoints, course intercepts, estimated time of arrival, holding patterns, altitude crossing restrictions, and optimum holding speed.
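The waypoint and estimated-time-of-arrival functions attributed to the FMS above can be sketched as follows. This is an illustrative great-circle (haversine) calculation only, not the disclosed FMS; the function name, Earth-radius constant, and constant-groundspeed assumption are all assumptions for the example:

```python
import math

def eta_seconds(waypoints, groundspeed):
    """Estimate time to fly a flight plan: sum great-circle leg distances
    between waypoints (lat, lon in degrees) and divide by a constant
    groundspeed (m/s)."""
    R = 6_371_000.0  # mean Earth radius, m
    total = 0.0
    for (lat1, lon1), (lat2, lon2) in zip(waypoints, waypoints[1:]):
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        # Haversine formula for the leg's central angle.
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        total += 2 * R * math.asin(math.sqrt(a))
    return total / groundspeed

# One degree of latitude is ~111 km; at 250 m/s that is roughly 445 s.
plan = [(34.0, -118.0), (35.0, -118.0)]
print(round(eta_seconds(plan, 250.0)))
```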
  • Digital Map Set (DMS) 132 can also be included to provide an image of the terrain and obstacles that is overlaid by textual information.
  • the map image from DMS 132 can be overlaid by text and symbols, and the map image can be continuously updated during flight to provide a bird's eye view of the position and heading of the aircraft relative to the terrain and various landmark features to the crewmembers. Current flight path, and deviation from a pre-specified flight path, can also be shown.
  • Displays 102, 104, 106 can replace traditional aircraft instrument panels to provide crewmembers with an interactive display of the primary flight information, control operation of various subsystems on the aircraft, as well as allowing crewmembers to view information from the various subsystems at any particular time.
  • Information regarding the out-the-window scenery and/or objects outside the device can also be provided from sensors that are not on-board the device.
  • more than one of the same type of sensor can be included in display system 100. In such embodiments, if one sensor fails, the image from another of the same type of sensor can be transformed to each operator's viewpoint, thereby improving the reliability of display system 100.
  • Display processor 108 and central computer 146 can include one or more data processing devices configured to perform a variety of functions, such as detect traffic and obstacles; fuse video images; fuse enhanced images and terrain map data with video images; transform fused images to one or more operator viewpoints; and generate display symbology and combine with transformed images.
  • the output of display processor 108 is presented to operator(s) of the device on one or more displays 102, 104, 106, which can include, for example, Head Up Display (HUD), Head Down Display (HDD), Primary Display (PD) and Multifunction Display (MFD).
  • Display processor 108 can perform other functions in addition to, or instead of, the preceding functions.
  • other displays 102, 104, 106 can be utilized in addition to, or instead of, the displays 102, 104, 106 shown.
  • Other techniques for controlling the appearance of displays 102, 104, 106 can also be provided, such as automatic and manual declutter display modes, voice recognition and response systems, color-coding, and display scaling. Further, other combinations of information and number/size of windows can be implemented for display 102, 104, 106.
  • a lock out feature can also be included to help crewmembers coordinate their efforts by preventing them from attempting to control operation of the same subsystem simultaneously. Alternatively, control can be given to the last crewmember who makes an input.
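The lock-out behavior described above can be sketched as follows. This is an illustrative Python sketch; the class and subsystem names are assumptions, and it shows only the first alternative (control held until released, rather than given to the last crewmember who makes an input):

```python
class SubsystemLock:
    """Illustrative sketch of the lock-out feature: only one crewmember
    may control a given subsystem at a time; other crewmembers are
    refused until the holder releases control."""

    def __init__(self):
        self._holder = {}  # subsystem name -> crewmember holding control

    def acquire(self, subsystem, crewmember):
        holder = self._holder.get(subsystem)
        if holder is None or holder == crewmember:
            self._holder[subsystem] = crewmember
            return True
        return False  # locked out: another crewmember already has control

    def release(self, subsystem, crewmember):
        if self._holder.get(subsystem) == crewmember:
            del self._holder[subsystem]

locks = SubsystemLock()
print(locks.acquire("radio", "pilot"))     # True: pilot takes control
print(locks.acquire("radio", "copilot"))   # False: pilot still holds it
locks.release("radio", "pilot")
print(locks.acquire("radio", "copilot"))   # True: control released
```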
  • Displays 102, 104, 106 minimize the number of dedicated control panels and displays that are typically used to monitor and operate an aircraft and its subsystems. A reduced number of displays 102, 104, 106 results in decreased weight, increased system reliability, and reduced maintenance. Further, displays 102, 104, 106 provide enhanced situational awareness of the aircraft and the subsystems, and reduce crew workload from typical pilot- vehicle interfaces.
  • Display system 100 is discussed herein as an example of a type of system in which various embodiments of a display with customizable windows can be used to monitor and control a large number of subsystems. It is anticipated that such customizable displays 102, 104, 106 will be useful in monitoring and controlling a wide variety of systems, and even groups of various systems.
  • Such variety of systems can include, for example, one or more mobile vehicles such as automobiles, trains, and boats; one or more machines such as robotic or manually operated manufacturing equipment and excavators; biological organisms such as a patient undergoing surgery, a group of patients in a hospital ward, viruses, and wildlife; and various aspects of one or more building facilities including processes and machinery operating within the buildings.
  • the amount and type of information presented on displays 102, 104, 106 is limited only by the ability to sense the desired parameters, and to communicate the sensed information to display systems 100. Further, the number and type of subsystems that can be controlled using one or more customizable displays 102, 104, 106 is limited only by the ability of the operator to enter control selections, and the ability of display system 100 to transmit the control selections to the subsystems being controlled.
  • FIG. 2 is a diagram of components and processes that can be included in an embodiment of a display system 200 for providing out-the-window displays for devices such as aircraft, trains, boats, and other types of devices where it is useful to have visual images of scenery, traffic, obstacles, and other objects surrounding the device.
  • a variety of subsystems and sensors supply images and data to display processor 108.
  • sensors and subsystems such as video camera(s) 202, Forward-Looking Infrared (FLIR) sensor(s) 204, RADAR sensor(s) 206, communication and navigation systems 208, terrain map database 210, hazard information 212, motion dynamics information 214, and operator and display geometry information 216 provide information to display processor 108.
  • FLIR Forward-Looking Infrared
  • Information regarding the out-the-window scenery and/or objects outside the device can also be provided from sensors that are not on-board the device.
  • more than one of the same type of sensor is included in display system 200. In such embodiments, if one sensor fails, the image from another of the same type of sensor can be transformed to each operator's viewpoint, thereby improving the reliability of display system 200.
  • Display processor 108 includes one or more data processing devices configured to perform a variety of functions, such as detect traffic and obstacles 218; fuse video images 220; fuse enhanced images and terrain map data with video images 222; transform fused images to one or more operator viewpoints 224; and generate display symbology and combine with transformed images 226.
  • the output of display processor 108 is presented to operator(s) of the device on one or more displays devices 228, which can include, for example, Head Up Display (HUD), Head Down Display (HDD), Primary Display (PD) and Multifunction Display (MFD).
  • Display processor 108 can perform other functions in addition to, or instead of, the functions shown in FIG. 2.
  • other display devices 228 can be utilized in addition to, or instead of, the display devices 228 shown.
  • One or more of functions 218 through 226 can be configured for parallel processing to reduce the latency of display system 200.
  • the data from the sensors can be processed concurrently, such as several instances of detect traffic and obstacles function 218 running concurrently to analyze images from left and right cameras 202, FLIR images 204, RADAR images 206, and hazard information 212.
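The concurrent processing of sensor data described above can be sketched with Python's standard thread pool. The detection function is a stand-in placeholder (a simple threshold, not the disclosed detect traffic and obstacles function 218), and the frame data are fabricated for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def detect_traffic(frame):
    """Stand-in for an instance of a detection function: report which
    samples in a frame exceed a threshold. In the system described,
    several such instances would run concurrently on camera, FLIR,
    and RADAR imagery."""
    return [i for i, v in enumerate(frame) if v > 0.5]

# Frames from several sensors, processed concurrently (illustrative data).
frames = {
    "left_camera": [0.1, 0.9, 0.2],
    "right_camera": [0.7, 0.0, 0.0],
    "flir": [0.0, 0.0, 0.8],
}
with ThreadPoolExecutor() as pool:
    # map() preserves input order, so results pair up with the frame names.
    results = dict(zip(frames, pool.map(detect_traffic, frames.values())))
print(results)
```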
  • FIG. 3 shows an embodiment of an out-the-window view from an operator's station combined with display device 300 that provides a portion of a desired field of view (FOV) 302, as indicated by a dashed line.
  • Display system 200 can be utilized to provide desired FOV 302.
  • Desired FOV 302 can be derived from governmental or other regulations, such as set forth in the United States Federal Aviation Administration (FAA) Advisory Circular (AC) 25.773-1 entitled "Pilot Compartment View Design Considerations", dated January 8, 1993.
  • the out-the-window field of view areas for the operator in an exemplary aircraft are indicated by pilot left side window view 304, pilot front window view 306, co-pilot front window view 308, and co-pilot side window view 310.
  • desired FOV 302 will be based on the tasks to be performed, and structural characteristics of the device that may block a portion of desired FOV 302.
  • display 300 can replace traditional instrument panels, and provide a portion of the out-the-window scenery in all weather and time of day conditions, even when the scenery is obscured by a portion of the structure of the device.
  • Display images on display 300 can also provide interactive presentations of operation and subsystem mode information, and allow operators to control and monitor various subsystems and sensors at any particular time.
  • FIGS. 4 and 5 show an embodiment of an aircraft forward fuselage 400 with an outer mold line (OML) 402 and cockpit windows 404 that result in cockpit window views 304, 306, 308, 310 shown in FIG. 3.
  • OML outer mold line
  • the nose of aircraft 400 is long and tapered, and designed to meet low sonic boom and high-speed requirements, such as for a supersonic aircraft.
  • a tradeoff is required, however, between the length and shape of OML 402 to achieve reduced sonic boom, and an OML 402 that allows cockpit windows 404 to be sized to provide desired FOV 302.
  • Embodiments of display system 300 can be implemented to provide portions of the out-the-window visual scene to fill in the area of desired FOV 302 that is not visible from cockpit window views 306, 308.
  • Runway 312 (FIG. 3) is shown in perspective to display 300, window views 304, 306, 308, 310, and desired FOV 302 with respect to the pilot of aircraft 400 during the flare portion of a landing sequence.
  • the shape and size of window views 304, 306, 308, 310 and display 300 can vary from the configuration shown in FIGS. 3, 4, and 5.
  • Video cameras 202 can provide a video image of a field of view in front of aircraft 400. Images from video camera 202 are useful to provide the crew with images of surrounding scenery obscured by the aircraft's OML 402 in daylight and good visibility conditions. Scenery images from camera sensor 202 can also be presented directly to the crewmembers on display 300 to assist the crew in operating aircraft 400 in manual and autopilot modes. Images from video camera 202 can be analyzed by functions, such as detect traffic and obstacles function 218, in display processor 108.
  • FLIR sensor 204 provides an infrared spectrum video image of a field of view in front of aircraft 400. FLIR images provide the ability to view surrounding scenery in day, night, and all-weather conditions. Additionally, display processor 108 can analyze FLIR images to monitor the integrity of data being used in display system 200, and detect objects around aircraft 400. Scenery images from FLIR sensor 204 can be transformed to crewmembers' viewpoints and displayed directly to the crewmembers on display 300 to assist the crew in operating aircraft 400 in manual and autopilot modes.
  • RADAR sensor(s) 206 can include one or more different types of RADAR sensors to provide information regarding weather, air and surface traffic, precision velocity measurement for navigation system updates, altitude above ground information, scene imagery to the pilot in low visibility conditions, object detection (either directly through pilot scene interpretation, or automatically), and to monitor the integrity of data being used in display system 200.
  • Raw data can be provided in the form of scanned RADAR returns, azimuth versus range, at incrementally selected elevations.
  • Scenery images from RADAR sensor 206 can be transformed to crewmembers' viewpoints, sized and oriented to conform to the view available from windows 304 through 310, and displayed directly to the crewmembers on display 300 to assist the crew in operating aircraft 400 in manual and autopilot modes.
  • Navigation components in communication and navigation subsystems 208 can include a variety of subsystems to determine the relative bearing and slant range distance to a ground station, to keep the device's present position and update the present position being kept by another source, such as an inertial navigation system (INS).
  • the INS subsystem is a self-contained, fully automatic dead reckoning navigation system.
  • the INS can be coupled to a Global Positioning System (GPS) to provide highly accurate present position and velocity data.
  • the INS detects motion (acceleration and attitude) and provides acceleration, velocity, present position, pitch, roll, and true heading to related systems.
  • the GPS is a space satellite based radio navigation system that provides continuous, all weather, passive operation to an unlimited number of users anywhere on the earth.
  • AHRS Attitude Heading Reference System
  • ILS Instrument Landing System
  • V/HSID Vertical and/or Horizontal Situation Indicator Display
  • Other suitable components can be utilized in communication and navigation subsystems 208, in addition to, or instead of, the components mentioned herein.
  • Other communication systems such as a satellite communication system, data link, and high frequency radio systems, among others, can also be included.
  • Terrain map database 210 provides latitude, longitude, and elevation data for terrain and man-made structures of potential significance to hazard avoidance.
  • the database may include nested components in hardware and/or software, with varying resolution and accuracy, appropriate to the phase of flight anticipated in the represented region.
  • Terrain map database 210 can be used to provide scene imagery to the pilot in low visibility conditions, to detect objects in the surrounding area (either directly through pilot scene interpretation, or automatically), and to monitor the integrity of data being used in display system 200. Frequent updates to terrain map database 210 can be provided to include changes that may affect operation of aircraft 400. For example, the database can be updated to include recently constructed buildings and roads.
  • Hazard information 212 can be provided by sensors and subsystems such as a Traffic Alert and Collision Avoidance System, or TCAS, to provide information regarding the relative positions and velocities of other aircraft or moving vehicles in the vicinity of the subject aircraft 400. Position, heading, and speed information can be included on display 300, and a visual, audio, or other type of alarm can issue when another aircraft or vehicle is on a path to pass within a predetermined range of aircraft 400. Hazards information 212 can also include other components to provide information relevant to operation of aircraft 400 of which the crewmembers should be aware, such as terrain awareness/avoidance.
  • TCAS Traffic Alert and Collision Avoidance System
  • Motion dynamics sensors 214 provide information regarding attitude, position, speed, and acceleration in three-dimensional space for aircraft 400. Other information, such as angle of attack; ambient air temperature outside the aircraft; a pitot static system to measure aircraft velocity with pitot and static pressures surrounding the aircraft; an indicated airspeed (IAS) value based on the pitot and static pressure; a backup pressure altimeter value based on static pressure; and a vertical speed (climb or dive rate) indicator value based on changes in static air pressure, can also be used.
  • Other on-board and off- board sensors such as an INS can provide information to motion dynamics sensors 214.
  • detect traffic and obstacles function 218 analyzes video images from video cameras 202 and determines whether there are any moving or stationary objects in the vicinity of aircraft 400. If so, symbols can be included on displays presented on display 300 to alert operators to the presence of the objects. In some situations, symbols may not be presented on the displays if the objects can be easily seen in the video image on the display. Audible warnings can be presented in addition to visually presented symbols.
  • Fuse video images function 220 combines, also referred to as fuses, the images from video cameras 202.
  • video images from cameras 202 are analyzed in parallel processing paths for moving and stationary objects. Under relatively high visibility conditions, the video images may be considered sufficient to provide desired FOV 302 on display 300.
  • the function fuse enhanced images and terrain map data with fused video image 222 can generate a composite image for display 300 using the best information available from various sources, such as FLIR 204, RADAR 206, video cameras 202, and terrain map 210.
  • data from the terrain map database can be compared to measured terrain height variations from a RADAR altimeter, INS, and GPS along the aircraft flight path to estimate the position of aircraft 400.
  • Images from video cameras 202, RADAR sensors 206, and FLIR sensors 204 can be fused to form a composite out-the-window scenery image using any suitable image sensor fusion algorithm, such as described by Z. Rahman, D. J. Jobson, G. A. Woodell, and G. D. Hines, in a publication entitled "Multi- Sensor Fusion And Enhancement Using The Retinex Image Enhancement Algorithm," Visual Information Processing XI, Proc. SPIE 4736, (2002).
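The general idea of combining co-registered sensor frames into one composite image can be sketched as follows. Note this is NOT the Retinex-based algorithm cited above; it is only a pixel-wise weighted average, with fabricated one-row "images" and weights chosen purely for illustration:

```python
def fuse_images(images, weights):
    """Pixel-wise weighted-average fusion of co-registered images
    (lists of rows of equal size). Illustrative only: real sensor
    fusion would use a more sophisticated algorithm."""
    total = sum(weights)
    rows, cols = len(images[0]), len(images[0][0])
    return [
        [
            sum(w * img[r][c] for img, w in zip(images, weights)) / total
            for c in range(cols)
        ]
        for r in range(rows)
    ]

# Tiny 1x2 frames standing in for video, FLIR, and RADAR imagery.
video = [[0.2, 0.8]]
flir = [[0.6, 0.4]]
radar = [[1.0, 0.0]]
fused = fuse_images([video, flir, radar], [0.5, 0.3, 0.2])
print([[round(v, 2) for v in row] for row in fused])
```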
  • Data from a 3-dimensional terrain map database can be used to fill in portions of data that are not available from video cameras 202, RADAR sensors 206, or FLIR sensors 204 to provide an enhanced image for display 300.
  • Transform fused images to operator viewpoints function 224 performs viewpoint transformation functions to align the fields of view of images from cameras 202, RADAR sensors 206, or FLIR sensors 204, and translate the images to the viewpoint of each crewmember.
  • the fused image can be combined with display symbology to provide a further enhanced image for display 300.
  • Transform fused images to operator viewpoints function 224 uses dimensional information from the crewstation, the configuration of display 300, as well as motion dynamics information 214, to crop, rotate, and translate images to the viewpoint of each crewmember.
  • the following processes occur in viewpoint transformation function 224: rotate the image about the viewpoint x, y, and z axes; translate the image in the x, y, and z directions by an amount equal to the (negative) displacement of the viewpoint from the origin in the x, y, and z directions, respectively; and scale and crop all images to the same field of view to conform to the out-the-window view.
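The translate-then-rotate steps above can be sketched for a single scene point. This is an illustrative simplification: it rotates about the z (yaw) axis only, omits the x- and y-axis rotations and the scale/crop step, and all names are assumptions:

```python
import math

def transform_point(p, viewpoint, yaw):
    """Illustrative viewpoint transformation of one scene point:
    translate by the negative of the viewpoint displacement, then
    rotate about the z axis by the viewpoint yaw angle (radians)."""
    x, y, z = (a - b for a, b in zip(p, viewpoint))   # translate
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x + s * y, -s * x + c * y, z)          # rotate about z

# A point 100 m ahead, seen from a viewpoint offset 10 m laterally,
# with the viewpoint yawed 90 degrees.
x, y, z = transform_point((100.0, 0.0, 0.0), (0.0, 10.0, 0.0), math.pi / 2)
print(round(x, 6), round(y, 6), round(z, 6))
```

A full implementation would apply the same pattern with 3×3 rotation matrices about all three axes, then scale and crop the result to the display field of view.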
  • Images used to provide a portion of desired FOV 302 can be scaled and oriented to conform to the real world scenery for each crewmember's viewpoint.
  • Appropriate coordinate system transformation matrices for different reference systems can be used, based on the original coordinate system of the image, and the coordinate system used for the design eye(s).
  • FIG. 6 is a diagram of an embodiment of an avionics display 600 that can be generated by the display system 100 of FIG. 1 A.
  • display processor 108 generates display 600 below a pre-specified altitude, such as 18,000 feet, for example, and presents display 600 to the crewmember designated as pilot-in-command of the aircraft.
  • Display 600 includes an image of out-the-window (OTW) scenery 602 that provides a field of view to the crewmember that meets or exceeds the portion of desired FOV 302 (FIG. 3) that is not available from window views 304, 306, 308, 310 (FIG. 3).
  • OTW scenery 602 is overlaid with head-up-display symbology that provides pitch, roll, yaw, speed, and altitude information, as well as navigation and caution and warning information.
  • FIG. 7 A shows an embodiment of another avionics display 700 that can be generated by display system 100 (FIG. 1A).
  • Display 700 includes communication system window 702, navigation window 704, common window 706 (currently displaying navigation waypoint information), primary information window 708, Head Up Display (HUD) window 710, Horizontal Situation Indicator display (HSID) window 712, common window 714 (currently displaying caution and warning information), engine status window 716, and common window 718 (currently displaying cautions and warnings information).
  • HUD window 710 provides flight attitude, altitude, speed, and navigation steering information.
  • HSID window 712 provides aircraft attitude, steering, and navigation information superimposed on a moving map of the geographic area around the aircraft that is generated by DMS 132 (FIG. 1A).
  • the embodiment of avionics display 700 shown in FIG. 7A also includes communication subsystems (COM) option 720, navigation subsystems (NAV) option 722, flight planning subsystem (FP) option 724, traffic alert and collision avoidance subsystem (TCAS) option 726, acknowledge (ACK) option 728, checklist (CKLST) option 730, integrated cautions and warnings (ICAW) subsystem option 732, subsystem history (HIST) option 734, subsystem (SYS) option 736, and subsystem diagnostics (FAULT) option 738.
  • Crewmembers can choose options 720 through 738 to view more detailed information about the aircraft's operation and subsystems in common windows 706, 714, and 718.
  • the options shown for common window 706 include COM option 720 to view more detailed information regarding the aircraft's communication system 704; NAV option 722 to view information about various aspects of navigating the aircraft; FP option 724 to review and modify the aircraft's flight plan; and TCAS option 726 to view more information regarding other aircraft or obstacles in the vicinity of the aircraft.
  • an indicator of the option selected such as selected box 740 or lighted pushbuttons, can be utilized. For example, a green light can indicate a selected option, and white light can indicate the option is available for selection.
  • FIG. 7B shows an embodiment of a display option tree 750 that can be implemented to allow each crewmember to access increasingly detailed levels of information in common windows 706, 714, 718 independently from one another. While a first crewmember is monitoring engine performance, for example, the other crewmember can view and change the flight plan. Additionally, when COM option 720 is selected by one crewmember, options 720 through 726 on display 102, 104, 106 change to another set of options to access another level of information that is available for the selected COM option 720.
  • the sublevels include a feature, such as BACK option 752, to return to the previous level.
  • the information on the other crewmember's display 102, 104, 106 is unaffected, unless the option selected by the first crewmember changes the operating mode or other information that is common to both displays 102, 104, 106.
  • Acknowledge (ACK) option 728 and checklist (CKLST) option 730 are associated with the Integrated Caution Advisory and Warning subsystem (ICAW) 706.
  • ICAW Integrated Caution Advisory and Warning subsystem
  • messages generated by ICAW system 706 appear in window 714.
  • a limited number of individual ICAW messages can appear at one time in window 714, and additional information about the messages can appear in window 718 when ICAW option 732 is selected.
  • the ICAW messages in window 714 can be cleared by selecting ACK option 728. When additional messages are available, they replace the caution and warning messages that are cleared when ACK option 728 is selected.
  • ICAW subsystem 136 includes an electronic checklist feature that is accessed via CKLST option 730.
  • When an ICAW message is displayed in window 714, the crewmember can depress CKLST option 730 to view the associated checklist in window 714.
  • the crewmember can move an indicator over the desired ICAW and select ICAW option 732 to view a checklist for the problem indicated by the message.
  • Associated checklists can be automatically linked together so that if an engine failure occurs, the pilot will not only get the checklist for the engine failure procedure in-flight but also the single engine landing checklist. Crewmembers can also manually page through the checklists at any time by selecting CKLST option 730.
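The automatic linking of associated checklists described above can be sketched as follows. The checklist contents and link table here are fabricated placeholders for illustration only, not the disclosed procedures:

```python
# Illustrative checklist data: an engine failure links to the
# single engine landing checklist, as in the example above.
CHECKLISTS = {
    "ENGINE FAILURE": ["throttle idle", "attempt restart"],
    "SINGLE ENGINE LANDING": ["plan straight-in approach", "flaps as required"],
}
LINKS = {"ENGINE FAILURE": ["SINGLE ENGINE LANDING"]}

def linked_checklists(icaw_message):
    """Return the checklist for an ICAW message plus any automatically
    linked follow-on checklists, in presentation order."""
    queue, seen, ordered = [icaw_message], set(), []
    while queue:
        name = queue.pop(0)
        if name in seen:
            continue        # guard against cyclic links
        seen.add(name)
        ordered.append(name)
        queue.extend(LINKS.get(name, []))
    return [(name, CHECKLISTS[name]) for name in ordered]

for name, steps in linked_checklists("ENGINE FAILURE"):
    print(name, "->", steps)
```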
  • Subsystem history (HIST) option 734 can be selected to display operational history for the subsystem selected with subsystem (SYS) option 736.
  • FAULT option 738 can be selected to initiate diagnostic procedures, commonly referred to as Built-in-Tests (BIT), on the selected subsystem. The results of the BIT are displayed in window 718.
  • As shown in FIG. 7B for options 720 through 726, various sublevels of options can be implemented for options 728 through 738, including a display navigation feature, such as BACK option 752, to return to the previous level.
  • central computer 146 determines whether the aircraft is in a predefined state, and instructs display processor 108 to display predefined information in at least one of windows 702 through 716 while the aircraft remains in the predefined state. Additionally, options 720 through 738 can be changed or enabled/disabled depending on the aircraft state. For example, when the aircraft is on final approach to land, one or both of the crewmembers' displays 102, 104, 106 can be forced to display primary flight information or other information considered necessary to conduct that portion of the flight in windows 140 and 142, as well as other windows on display 102, 104, 106.
  • a "drag and drop" feature can be provided as another method of displaying more detailed information about one or more of the subsystems.
  • the drag and drop feature allows a user to select a word or other item in one of windows 702, 704, 708, 710, 712, or 716, and drag the selection to one of common windows 706, 714, or 718.
  • Information regarding the selected item is then displayed in the common window in which the item was dropped. For example, selecting a particular caution in window 714 and dropping it in window 718 would cause information regarding the relevant subsystem to be displayed in window 718.
  • Aircraft display system 100 (FIG. 1A) is discussed herein as an example of a type of system in which various embodiments of displays 102, 104, 106 can be used to provide a portion of desired FOV 302 (FIG. 3) as well as to monitor and control a large number of subsystems and sensors 110 (FIG. 1A). It is anticipated that embodiments of displays 102, 104, 106, or groups of displays 102, 104, 106, will be useful in providing an expanded field of view for operators of a wide variety of systems that have limited out-the-window visibility.
  • shock wave 804 spreads forward from the nose of aircraft 802.
  • the sonic boom heard at ground level 806 is only one portion of shock wave 804, and is referred to as a "carpet boom."
  • Shock wave 804 spreads broadly beneath aircraft 802, as depicted in FIG. 8B.
  • shock wave 804 at ground level 806 will be approximately one mile wide for every thousand feet of altitude of aircraft 802. Therefore, a shock wave 804 generated at 50,000 feet above ground level will be approximately fifty miles wide at ground level 806.
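The rule of thumb above (roughly one mile of carpet width per thousand feet of altitude, giving a fifty-mile-wide carpet at 50,000 feet) reduces to a one-line calculation. The helper name is ours; the linear approximation is only as good as the "approximately" in the text.

```python
# Rule of thumb from the text: the sonic boom carpet at ground level is
# about one mile wide per thousand feet of aircraft altitude.

def carpet_width_miles(altitude_ft: float) -> float:
    return altitude_ft / 1000.0
```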
  • Shock wave 804 typically strikes ground level 806 forward of the point at which shock wave 804 is created, and continues along the route of aircraft 802 until aircraft 802 is moving slower than the speed of sound.
  • shock wave 804 is affected by various factors including the size, weight, speed, altitude, and angle of attack of aircraft 802, as well as roll, pitch, and yaw angle during flight. Atmospheric and terrain variations can also affect the intensity of shock wave 804, but variables which are under the pilot's control, such as speed, acceleration, and attitude angles, are typically more important. Increasing temperatures in the troposphere tend to diffuse shock wave 804. The strength of shock wave 804 is typically the highest directly ahead of aircraft 802, and reduces in strength with increasing distance from aircraft 802. Shock wave 804 typically diffracts off ground 806.
  • Changing the acceleration, angle of attack, pitch, roll, or yaw attitude of aircraft 802 can either focus or diffuse shock wave 804.
  • a focused shock wave 804 occurs when two or more wavefronts 808, originating at different times from aircraft 802, coincide exactly, as shown in FIG. 8C.
  • deceleration and/or lifting the nose of aircraft 802 will diffuse shock wave 804; acceleration and/or dropping the nose will focus shock wave 804.
  • a change in horizontal direction will focus shock wave 804 along the inside of the sonic boom carpet's turn, which, however, often lies along a track to the outside of the flight path.
  • display system 100 can include a variety of subsystems and sensors 110 that provide information about the operational state of aircraft 802 and systems on board aircraft 802 to processor 108.
  • speed, altitude, pitch angle, roll angle, yaw angle, bank angle, climb rate, linear and rotational accelerations, and aircraft latitude and longitude can be provided by navigation sensors such as Tactical Air Navigation (TACAN), attitude heading reference set (AHRS), inertial navigation system (INS), and global positioning system (GPS) sensors.
  • Imaging sensors such as RADAR, FLIR, and video cameras provide information regarding the environment outside aircraft 802.
  • Other subsystems can include a Digital Map Set (DMS), a terrain awareness warning system (TAWS), and an integrated cautions and warnings (ICAW) subsystem.
  • a flight management system can be integrated with the AFCS 144 to allow the crew to select options such as flying the most economical flight profile, or the optimum flight profile that maintains the acoustic pressure level from shock wave 804 below a certain level.
  • Parameters for determining the location, size, shape, color, and other display characteristics of acoustic level information and cues for displays 102, 104, 106 include translational and rotational velocities and accelerations about the axes of aircraft 802, flight path angle (FPA), Mach number, angle of attack (AOA), thrust, altitude, atmospheric temperature profiles, terrain data, and aircraft length and weight.
  • supersonic aircraft typically fly under varying conditions of Mach, altitude, and g's pulled in a turn.
  • formulas and/or multi-dimensional look-up tables can be used in display processor 108 to determine the expected acoustic level of shock wave 804.
  • Look-up tables can include data specifically for the aircraft in which display system 100 is being used, although data for any number of aircraft can be included. The data can be based on actual flight test data, or can be generated using analytical computation models.
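A multi-dimensional look-up of the kind described can be sketched as a grid indexed by Mach number and altitude with bilinear interpolation between grid points. The grid values, axis choices, and function names below are illustrative assumptions; the patent leaves the table contents to flight test data or analytical models.

```python
import bisect

# Illustrative look-up table of expected acoustic level (values invented,
# arbitrary loudness units) indexed by Mach number and altitude, with
# bilinear interpolation between grid points.

MACHS = [1.1, 1.4, 1.8]
ALTS_FT = [40000, 50000, 60000]
# rows: Mach, cols: altitude
LEVELS = [
    [92.0, 88.0, 84.0],
    [96.0, 92.0, 88.0],
    [99.0, 95.0, 91.0],
]

def _bracket(grid, x):
    """Find the grid interval containing x and the fractional position in it."""
    i = bisect.bisect_right(grid, x) - 1
    i = max(0, min(i, len(grid) - 2))
    t = (x - grid[i]) / (grid[i + 1] - grid[i])
    return i, min(max(t, 0.0), 1.0)

def expected_acoustic_level(mach, alt_ft):
    """Bilinearly interpolate the table at (mach, alt_ft)."""
    i, u = _bracket(MACHS, mach)
    j, v = _bracket(ALTS_FT, alt_ft)
    top = LEVELS[i][j] * (1 - v) + LEVELS[i][j + 1] * v
    bot = LEVELS[i + 1][j] * (1 - v) + LEVELS[i + 1][j + 1] * v
    return top * (1 - u) + bot * u
```

In a real system the same interpolation would be extended over the other dimensions the text lists (angle of attack, weight, temperature profile, and so on).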
  • FIG. 9 is a flow diagram of an embodiment of acoustic level (AL) cueing logic 900 that can be included in display processor 108 or other suitable processing device. The aircraft's current flight characteristics are input to process 902, which determines the acoustic level of the aircraft using multi-dimensional look-up tables and/or formulas based on the current flight characteristics.
  • process 904 determines whether the current acoustic level is above or within a predetermined range of the maximum desired acoustic level. If not, process 906 generates normal cues, without cautions or alerts, for output to one or more of displays 102 through 106, as well as other aural, tactile, and/or visual cues being utilized.
  • process 908 determines whether the maximum desired acoustic level has been exceeded. If not, process 910 determines the rate of increase or decrease in the acoustic level. If the acoustic level is increasing, process 912 issues caution cues based on the rate of increase. Similarly, if the acoustic level is decreasing, the urgency of the cues can be reduced proportionally.
  • an option can be included in the crewstation to enable or disable maneuver limit logic, as indicated by process 914, to further prevent control inputs that would cause the acoustic level to increase further.
  • process 916 can limit the amount of control stick 112, 114 (FIG. 1A) input that is sent to the control surfaces to prevent the bank angle of the aircraft from exceeding an amount that would cause the acoustic level to exceed the desired level.
  • Limits on other inputs or changes to the flight condition of the aircraft that are likely to cause the desired acoustic level to be exceeded can also be implemented in processes 914 and 916.
  • process 920 determines the amount of time the aircraft has been exceeding the desired acoustic level.
  • Process 922 can raise the urgency level of the cues based on the amount of time the desired acoustic level was exceeded.
  • an option can be included in the crewstation to enable or disable auto-correct logic to automatically make changes to the flight condition of the aircraft to reduce the acoustic level of the engine and airframe noise or shock wave to the desired level.
  • process 928 displays options and cues to indicate changes that the crewmembers can make to reduce the acoustic level of the aircraft. For example, options to reduce speed, acceleration, pitch angle, and/or bank angles can be provided via voice, tactile, or visual cues.
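The branch structure of cueing logic 900 (processes 902 through 928) can be condensed into a small decision function. This is a sketch only: the margin, the rate handling, the urgency timeout, and the cue labels are invented assumptions, since the patent leaves thresholds and cue formats to the implementation.

```python
# Condensed sketch of AL cueing logic 900: compare the current acoustic
# level to the maximum desired level and choose a cue severity.
# Thresholds and labels are illustrative, not from the patent.

def select_cue(level, max_level, rate, time_exceeded_s, margin=2.0):
    """Return a cue label following the branch structure of FIG. 9.

    level            -- current acoustic level (process 902 output)
    max_level        -- maximum desired acoustic level
    rate             -- rate of change of the acoustic level (process 910)
    time_exceeded_s  -- seconds spent above max_level (process 920)
    margin           -- how close to the limit counts as "within range"
    """
    if level < max_level - margin:          # process 904: well below limit
        return "normal"                     # process 906: normal cues
    if level <= max_level:                  # process 908: near but not over
        if rate > 0:                        # processes 910/912: rising level
            return "caution-increasing"
        return "caution"                    # decreasing: urgency reduced
    # limit exceeded: processes 920/922 raise urgency with time over limit
    return "warning-urgent" if time_exceeded_s > 10 else "warning"
```

Processes 914/916 (maneuver limiting) and 924 onward (auto-correct and corrective-option display) would act on the same inputs after the cue is selected.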
  • display 1000 that can be utilized to provide information to crewmembers regarding previous, current, and predicted acoustic pressure levels at ground level caused by their aircraft during supersonic flight is shown.
  • display 1000 includes aircraft symbol 1002, acoustic signature (AS) symbols 1004, as well as symbols representing the aircraft's navigation route 1006, navigation waypoints 1008, and one or more alternate navigation routes 1010 that may be taken to avoid flight over certain areas, such as densely populated areas or noise-restricted areas.
  • Acoustic signature symbols 1004 can represent the footprint of previous, current, and predicted shock waves that have been, are, or are likely to be, generated during supersonic flight as the air in front of the aircraft is compressed.
  • one acoustic signature symbol 1004 can be positioned near aircraft symbol 1002 to indicate the probable location at ground level, size and/or strength of the acoustic signature currently being generated by the aircraft.
  • One or more acoustic signature symbols 1004 can be displayed behind aircraft symbol 1002 when the crewmember selects Acoustic Level History (ALHIS) option 1014 to display the history of acoustic signature symbols 1004.
  • acoustic signature symbols 1004 can be displayed when the aircraft is within a predefined range of an expected maneuver point to show the expected footprint of shock wave 804 (FIG. 8A) or other acoustic signature during the maneuver.
  • Display 1000 can therefore present acoustic signature symbols 1004 at waypoints indicating the expected strength of the shock wave or engine/airframe noise at a standard bank angle, such as 90 degrees bank angle at the current speed.
  • Display 1000 can also show predicted acoustic signature values at various points along the entire flight profile.
  • When a particular waypoint 1008 is selected on display 1000 using one of various selection devices, such as pilot-controlled cursor 1016, information regarding the acoustic level expected to result during the turn to the next waypoint 1008 can be presented.
  • display 1000 can present information window 1018 that includes the predicted pressure disturbance level at the current speed and expected bank angle, as well as the desired pressure disturbance level.
  • Display 1000 can also present alternate route 1010 that minimizes the amount of bank angle required to turn to stay on course. Such information can be presented to the crewmembers on the ground during flight planning as well as enroute.
  • selectable maneuver limit options 1020, 1022 can be presented on display 1000 to indicate flight condition parameters that can be varied to stay below a desired pressure disturbance level during the upcoming maneuver.
  • option 1020 indicates the maximum bank angle that can be used at the current speed to stay within the desired acoustic level.
  • option 1022 indicates the maximum speed for the aircraft during the turn to remain within the desired acoustic level.
  • Other suitable parameters and indicators can be presented to indicate options to the crewmembers for staying at or below the desired acoustic level during an upcoming maneuver or portion of flight.
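One way a maximum bank angle limit like option 1020 could be derived is from the standard level-turn relation n = 1/cos(φ), where n is load factor and φ is bank angle: a maximum allowable load factor (obtained, for example, from an acoustic-level table such as that used by cueing logic 900) bounds the usable bank angle. The formula itself is standard aerodynamics; its use here as the source of option 1020 is our assumption, not stated in the patent.

```python
import math

# In a level coordinated turn the load factor is n = 1 / cos(bank), so a
# maximum allowable load factor n_max bounds the bank angle:
#     bank_max = arccos(1 / n_max)
# n_max would come from an acoustic-level limit table (assumption).

def max_bank_angle_deg(n_max: float) -> float:
    return math.degrees(math.acos(1.0 / n_max))
```

For example, an allowable load factor of 2.0 g corresponds to a maximum bank angle of 60 degrees in a level turn.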
  • Window 1018 can be presented whether or not cursor 1016 is positioned over another symbol on display 1000 to indicate the current and desired acoustic level. Additionally, when cursor 1016 is positioned over another symbol on display 1000, window 1018 can be presented adjacent to the selected symbol. Information in window 1018 and adjacent maneuver limit options 1020, 1022 can be updated as various parameters that affect the engine/airframe noise or shock wave change during flight.
  • Display 1000 can also include auto-correct option 1024 that can be selected to enable AL cueing logic 900 (FIG. 9) to automatically adjust one or more flight parameters to reduce the acoustic level of the aircraft to be at or below the desired acoustic level.
  • Auto-correct option 1024 can be displayed and enabled at all times during the flight, or alternatively, only when the acoustic level of the shock wave is approaching or exceeds the maximum desired acoustic level.
  • display 1000 includes one or more acoustic loudness contours 1030, 1032 as well as aircraft symbol 1002, acoustic signature (AS) symbols 1004, aircraft's navigation route 1006, navigation waypoints 1008, and alternate navigation routes 1010.
  • Acoustic loudness contours 1030, 1032 can represent the acoustic levels of the footprints of previous, current, and predicted shock waves that have been, are, or are likely to be, generated by takeoff/landing engine and airframe noise or during supersonic flight as the air in front of the aircraft is compressed.
  • Acoustic signature symbols 1004 represent boom intercepts at ground level that were calculated at times T, T+t1, T+t2, ..., T+tn. Locations where booms occur twice, are spread out, or are focused, can be determined from the intercepts. Acoustic signature symbols 1004 may or may not be desired by the pilot, but can be used to determine the ground shock strength, loudness, time of intercept, or other parameters associated with the acoustic level near the ground. Acoustic loudness contours 1030, 1032 represent a constant pressure level, and can be presented in different colors on display 1000 to represent varying pressure levels.
  • Focused boom contours 1034, 1036 represent the focused boom that occurs when the aircraft accelerates from subsonic to supersonic flight. Preplanned routing could be used to place the focused booms in a location where annoyance would be reduced, such as over the ocean.
  • Increased strength contours 1038, 1040 can be presented along route 1006 to represent a maneuver that is likely to cause an increase in the strength of the sonic boom, which can be avoided with a change in routing, as represented by alternate route 1010.
  • Areas between contours 1030, 1032, 1034, 1036, 1038, 1040 can be color coded to indicate the acoustic level.
  • display 1000 can also include an acceleration (g) warning display 1042 that is calibrated against the maximum allowable sonic boom strength. Since the current heading, Mach number, altitude, and weight of the aircraft are generally known, the parameters of dominant importance for boom strength are the acceleration forces on the aircraft.
  • Indicator line 1044 can increase in length along g-scale 1045 to show the current g-value of the aircraft, and can change color, for example from green to yellow to red, as flight conditions change toward increasing boom strength.
  • the following table shows combinations of acceleration and cutoff Mach number above which indicator line 1044 would be red in region 1048 to indicate an acoustic level being generated above the desired acoustic level:
  • regions 1046, 1048 can change primarily with Mach number.
  • Regions 1046, 1048 can be color coded to indicate severity of acoustic level, for example, region 1046 can be yellow, and region 1048 can be red. Additional or fewer warning regions can be utilized.
  • the numerical values of g-scale 1045 can also change, such as when turning at speeds near the cutoff Mach number to prevent a focused shock wave during a maneuver, for example.
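The color coding of indicator line 1044 against regions 1046 (yellow) and 1048 (red) amounts to a threshold classification. The boundary g-values below are invented placeholders; per the text, the real boundaries would shift with Mach number (and with g-scale 1045 itself during maneuvers near the cutoff Mach number).

```python
# Sketch of the color coding of indicator line 1044 on g-scale 1045:
# green below region 1046, yellow in region 1046, red in region 1048.
# Boundary values are invented; the text ties them to Mach number.

def g_color(g, yellow_at=1.3, red_at=1.6):
    """Classify a g-value into the warning color regions of display 1042."""
    if g >= red_at:
        return "red"      # region 1048: acoustic level above desired level
    if g >= yellow_at:
        return "yellow"   # region 1046: approaching the limit
    return "green"        # below warning regions
```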
  • G-warning display 1042 informs the pilot of actions that should not be taken during flight.
  • G-warning display 1042 can also be linked to other cues, such as stick shakers, to cue the pilot when the aircraft's flight conditions are entering the warning zone.
  • g-warning display 1042 can be reconfigured to indicate a thrust level warning.
  • g-scale 1045 can be reconfigured to indicate throttle lever percentage, while indicator line 1044 indicates current throttle level, and regions 1046, 1048 indicate warning regions.
  • FIG. 11 shows another embodiment of a display 1100 with ALMAP option 1102 selected to display acoustic pressure level cues overlaid on map 1104.
  • Map 1104 represents the geographic area along the aircraft's planned flight route and provides a visual image of the position of the aircraft with respect to the ground below.
  • Map orientation option 1106 can allow the user to select different orientations for map 1104, such as north-up or route-up.
  • Map type option 1108 can allow the user to select different types of maps, such as various types of aeronautical, topographical, and road maps.
  • Legend features declutter option 1110 can include options to add or remove various symbols and features associated with map 1104 to allow a clearer view of other images on display 1100, such as acoustic signature symbols 1004.
  • Map scale option 1112 allows the user to change the scale of map 1104 as presented on display 1100.
  • When map scale option 1112 is selected, the scale and position of route symbols 1006, waypoint symbols 1008, and acoustic signature symbols 1004, as well as the scale of map 1104, change accordingly.
  • Other suitable options can be implemented with display 1100, such as map option 1114 that allows the user to display only map 1104 without any other symbols and information.
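Rescaling route, waypoint, and acoustic signature symbols consistently with map 1104 is a uniform scaling of symbol positions about a fixed point on the display. A minimal sketch (the function and its coordinate convention are ours, not the patent's):

```python
# Sketch of symbol rescaling when map scale option 1112 changes: scale
# each symbol position about a fixed center (e.g. the display center or
# the aircraft symbol) by the same factor applied to the map.

def rescale(symbols, factor, center=(0.0, 0.0)):
    """Scale (x, y) symbol positions about `center` by `factor`."""
    cx, cy = center
    return [(cx + (x - cx) * factor, cy + (y - cy) * factor)
            for x, y in symbols]
```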
  • population density information can be included in a terrain database and accessed by AL cueing logic 900 (FIG. 9) to generate cues based on the population density of the area subject to the acoustic signature.
  • Display system 100 and AL cueing logic 900 are discussed herein as examples of types of systems, logic, and display formats that can be used to provide information regarding the acoustic levels of the engine/airframe noise and shock wave generated by the aircraft.
  • Embodiments of AL cueing logic 900 alert crewmembers to the level of acoustic disturbance that has been caused, and is likely to be caused, under current flight conditions.
  • AL cueing logic 900 also provides cues to the crewmembers indicating modifications to the flight condition that could lessen the severity of the disturbance.
  • options can be selected to enable AL cueing logic 900 to limit a pilot's ability to execute maneuvers that would cause sonic boom disturbances above a predetermined level, and to automatically adjust one or more flight parameters to reduce the acoustic level to a level at or below the desired level.
  • Symbols generated by AL cueing logic 900 can be presented on any suitable display 102, 104, 106 available, in addition to, or instead of, displays 1000, 1100 that are designed to be dedicated to AL cueing.
  • FIG. 12A provides an isometric view of the underside of an aircraft 1210.
  • While FIGS. 12A-D depict a protective sensor mount used on an aircraft 1210, such a sensor mount may be used to provide out-the-window displays for devices such as aircraft, trains, boats, and other types of devices where it is useful to have visual images of scenery, traffic, obstacles, and other objects surrounding the device.
  • two protective sensor housings 1212 protect sensors 1214, such as video cameras 202 (FIG. 2).
  • Sensor housings 1212 are mounted on lower surface 1216 of the nose of aircraft 1210.
  • Each sensor 1214 provides a field of view 1218 through transparent aperture 1220.
  • Fairings 1222 provide a smooth continuous transition between the sensor housings 1212 and the fuselage of aircraft 1210.
  • FIG. 12B provides a left side view of aircraft 1210 that further depicts sensor housing 1212.
  • sensor housings 1212 may be mounted on lower surface 1216 of the nose of aircraft 1210 or other suitable location.
  • the sensor contained within the housing is provided field of view 1218 through transparent aperture 1220.
  • This aperture 1220 may rotate to avoid the accumulation of dirt, debris, and moisture on the aperture 1220.
  • a mechanical cleaning system can further ensure that no debris collects on the aperture.
  • Fairing 1222B provides a smooth continuous transition between the sensor housings 1212 and the fuselage of aircraft 1210.
  • FIG. 12C provides a cross section of the nose of aircraft 1210 through sensor housings 1212.
  • sensors 1214 are cameras, which capture video images.
  • FLIR Forward- Looking Infrared
  • FIG. 12C also illustrates cleaning mechanism 1224.
  • cleaning mechanisms 1224 wipe the transparent outer surface of conical surfaces 1226 with brushes.
  • any debris removal system known to those skilled in the art may be employed.
  • cleaning mechanisms 1224 are located above sensors 1214. As such, their placement does not interfere with the field of view of these sensors 1214.
  • Conical surfaces 1226 rotate about respective axes 1228. As shown in FIG. 12D, this rotation is driven by drive shaft 1230, which is coupled to motor 1232 by gear box 1234. As conical surfaces 1226 rotate, a brush or wiper 1236 can be included to remove debris from the inner and/or outer surface of conical surfaces 1226. This brush or wiper may oscillate in response to the motion of crank 1238 and connecting rod 1240, and may be controlled by an operator, or automatically controlled when debris is detected by an image processing system. A fluid injection system 1242 may further enhance the ability of cleaning mechanism 1224 to remove debris by applying cleaning solutions to conical surface 1226 as it rotates. An environmental seal 1244 serves to isolate sensors 1214 within the protective housing from the external environment.

Abstract

Display systems for operating a device that receives images from a first sensor and a second sensor, the images representing scenery outside the device. The display system is configured to detect moving objects in the images and to fuse the images into a single viewpoint. The fused image is transformed into a first-viewpoint image from a first operator station in the device and a second-viewpoint image from a second operator station in the device. The combined sensor image and symbols are sent to a display positioned to provide the operator with a portion of the external field of view. The entire desired field of view is provided to the operator by the display in combination with the external scenery visible through the device's windows. The display system generates a plurality of mutually exclusive windows on the display. One or more of the windows include a common user interface and a common display area for a subset of at least two of the windows. The system receives information regarding the current flight conditions of the device, e.g. an aircraft, and determines the acoustic level of the sonic boom and/or other noise produced by the device in operation. The current acoustic level is compared to a desired level, and various cues are displayed to the operators regarding corrective measures that can be taken to reduce, or maintain, the acoustic level at a desired level. A protective housing surrounds the sensors. The protective housing includes a transparent aperture through which the sensor takes images.
A cleaning mechanism removes debris from the transparent aperture so that the sensors produce continuous images representing the scenery outside the device via an operator display.
EP04817726A 2003-07-08 2004-07-01 Systemes d'affichage pour un appareil Withdrawn EP1661117A4 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US10/615,634 US7486291B2 (en) 2003-07-08 2003-07-08 Systems and methods using enhanced vision to provide out-the-window displays for a device
US10/616,145 US7312725B2 (en) 2003-07-08 2003-07-08 Display system for operating a device with reduced out-the-window visibility
US10/619,848 US6905091B2 (en) 2003-07-14 2003-07-14 System and method for controlling the acoustic signature of a device
US10/706,672 US7982767B2 (en) 2003-11-11 2003-11-11 System and method for mounting sensors and cleaning sensor apertures for out-the-window displays
PCT/US2004/023167 WO2005050601A2 (fr) 2003-07-08 2004-07-01 Systemes d'affichage pour un appareil

Publications (2)

Publication Number Publication Date
EP1661117A2 true EP1661117A2 (fr) 2006-05-31
EP1661117A4 EP1661117A4 (fr) 2009-01-21

Family

ID=34623988

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04817726A Withdrawn EP1661117A4 (fr) 2003-07-08 2004-07-01 Systemes d'affichage pour un appareil

Country Status (2)

Country Link
EP (1) EP1661117A4 (fr)
WO (1) WO2005050601A2 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2903787B1 (fr) * 2006-07-11 2008-11-14 Thales Sa Dispositif de generation de la fonction de secours dans un viseur tete haute
US8019489B2 (en) 2006-12-20 2011-09-13 The Boeing Company Methods and systems for displaying messages from a plurality of sources
DE102010042956A1 (de) 2010-10-26 2012-04-26 Airbus Operations Gmbh Verfahren und Anordnung zur Bestimmung einer Belastung einer Flugzeugstruktur
US9376983B2 (en) * 2012-11-30 2016-06-28 Honeywell International Inc. Operations support systems and methods with acoustics evaluation and control
FR3028498B1 (fr) * 2014-11-14 2018-06-01 Airbus Operations Dispositif pour la commande d'un regime de poussee d'au moins un moteur d'aeronef.
US10793266B2 (en) * 2016-11-14 2020-10-06 Boom Technology, Inc. Commercial supersonic aircraft and associated systems and methods
WO2018183994A1 (fr) * 2017-03-31 2018-10-04 Area 2601, LLC Systèmes informatiques et procédés permettant de faciliter une approche d'aéronef
US10699584B2 (en) 2018-03-02 2020-06-30 Honeywell International Inc. Systems and methods for sonic boom aware flight planning

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5189929A (en) * 1992-03-09 1993-03-02 United Technologies Corporation System and method for transmission gearbox noise control utilizing localized oil cooling/heating
US5742336A (en) * 1996-12-16 1998-04-21 Lee; Frederick A. Aircraft surveillance and recording system
WO1998016421A1 (fr) * 1996-10-17 1998-04-23 Bullock, Roddy, M. Dispositif de lecture, detection, choix d'objectifs, communication et reponse base sur un aeronef
US20030067542A1 (en) * 2000-10-13 2003-04-10 Monroe David A. Apparatus for and method of collecting and distributing event data to strategic security personnel and response vehicles

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4887298A (en) * 1988-06-15 1989-12-12 Renkus-Heinz Electronic circuit for sensing disconnect or failure of a power output sense line in an audio power system
US5551649A (en) 1989-10-20 1996-09-03 Fokker Aircraft B.V. Propeller blade position controller
US6405975B1 (en) * 1995-12-19 2002-06-18 The Boeing Company Airplane ground maneuvering camera system
US6466235B1 (en) * 1999-09-08 2002-10-15 Rockwell Collins, Inc. Method and apparatus for interactively and automatically selecting, controlling and displaying parameters for an avionics electronic flight display system
JP2001344597A (ja) * 2000-05-30 2001-12-14 Fuji Heavy Ind Ltd 融合視界装置


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GUELL J: "FLILO (Flying Infrared for Low-level Operations) an Enhanced Vision System" IEEE AEROSPACE AND ELECTRONIC SYSTEMS MAGAZINE, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, 1 September 2000 (2000-09-01), pages 31-35, XP002988590 ISSN: 0885-8985 *
See also references of WO2005050601A2 *

Also Published As

Publication number Publication date
WO2005050601A3 (fr) 2006-04-06
EP1661117A4 (fr) 2009-01-21
WO2005050601A2 (fr) 2005-06-02

Similar Documents

Publication Publication Date Title
US7312725B2 (en) Display system for operating a device with reduced out-the-window visibility
US7486291B2 (en) Systems and methods using enhanced vision to provide out-the-window displays for a device
US7982767B2 (en) System and method for mounting sensors and cleaning sensor apertures for out-the-window displays
US10318057B2 (en) Touch screen instrument panel
US10540902B2 (en) Flight planning and communication
US8484576B2 (en) System and method for customizing multiple windows of information on a display
US7212216B2 (en) Perspective view primary flight display with terrain-tracing lines and method
US9685090B2 (en) Navigational aids
EP1835369B1 (fr) Système et affichage pour évitement d'incursion de piste
US20180232097A1 (en) Touch Screen Instrument Panel
US9709420B2 (en) Reconfiguration of the display of a flight plan for the piloting of an aircraft
EP2685442B1 (fr) Système pour afficher des informations de piste
EP3125213B1 (fr) Systèmes à bord d'avion et procédés d'identification de plates-formes d'atterrissage en mouvement
EP2830032B1 (fr) Systèmes, procédés et affichage de poste de pilotage d'aéronef pour afficher des informations intégrées d'altitude minimale de sécurité et d'altitude de vectorisation minimale sur un dispositif d'affichage dans un aéronef
US20170186203A1 (en) Display of meteorological data in aircraft
EP3309519B1 (fr) Système pour aéronef et procédé associé permettant d'afficher le cisaillement du vent
US20120072105A1 (en) Ground navigational display, system and method displaying buildings in three-dimensions
CN106052690B (zh) 显示移动着陆平台的飞行器系统和方法
EP1661117A2 (fr) Systemes d'affichage pour un appareil
EP3926607A1 (fr) Procédés, systèmes et appareils permettant d'identifier et d'indiquer les procédures d'approche au point de visée de piste secondaire (srap)
US10565882B1 (en) Vertical situation display past and projected path depiction
EP2360656B1 (fr) Procédé et système pour afficher une altitude de sûreté minimale d'après des données

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060201

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL HR LT LV MK

DAX Request for extension of the european patent (deleted)
RBV Designated contracting states (corrected)

Designated state(s): FR GB

REG Reference to a national code

Ref country code: DE

Ref legal event code: 8566

A4 Supplementary search report drawn up and despatched

Effective date: 20081218

RIC1 Information provided on ipc code assigned before grant

Ipc: B64D 47/08 20060101ALI20081212BHEP

Ipc: G01C 23/00 20060101ALI20081212BHEP

Ipc: G09G 5/00 20060101AFI20060413BHEP

17Q First examination report despatched

Effective date: 20090515

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SUPERSONIC AEROSPACE INTERNATIONAL, LLC

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20121114