USRE49067E1 - Flight deck multifunction control display unit


Info

Publication number
USRE49067E1
Authority
US
United States
Prior art keywords
mcdu
command
voice
computing device
tablet computing
Prior art date
Legal status
Active, expires
Application number
US16/425,179
Inventor
Sue McCullough
Mark Pearson
Phillip Toews
Charles Dairman
Shane Lai
Hariharan Saptharishi
Venkat Ramana Mummadi
Current Assignee
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date
Filing date
Publication date
Application filed by Honeywell International Inc
Priority to US16/425,179
Application granted
Publication of USRE49067E1
Legal status: Active


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B 19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B 19/0423 Input/output
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 43/00 Arrangements or adaptations of instruments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/26 Speech to text systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/20 Pc systems
    • G05B 2219/25 Pc structure of the system
    • G05B 2219/25338 Microprocessor


Abstract

Systems and methods for controlling a flight deck multifunction control display unit are disclosed. In various embodiments, the systems may comprise a flight management system or other MCDU driven device and a command database that stores a plurality of voice commands and a plurality of multifunction control display unit commands. In various embodiments, each voice command is associated with one of the plurality of multifunction control display unit commands. The systems may further comprise a pilot voice interface configured to receive a voice command from a pilot and transmit the voice command to the multifunction control display unit. The multifunction control display unit can receive the voice command from the pilot voice interface and, in response, access the command database to identify a multifunction control display unit command in the command database that is associated with the voice command.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is a non-provisional of and claims priority to and the benefit of U.S. Provisional Application No. 62/030,489, filed Jul. 29, 2014, and entitled, FLIGHT DECK CONTROL AND DISPLAY, which is hereby incorporated by reference in its entirety.
TECHNICAL FIELD
The present invention generally relates to flight management systems, and more particularly, to a flight deck multifunction control display unit.
BACKGROUND
A flight management system (or “FMS”) is a computer system that operates aboard an aircraft. Flight management systems may receive a variety of avionics data and may, based upon the received data, automate a wide variety of in-flight tasks (e.g., navigation tasks).
Often times, flight management (and other) systems are accessed and controlled through one or more multifunction control display units (or “MCDUs”). Conventional MCDUs may include a low resolution display (e.g., a low resolution LCD display) and a keypad. Conventional MCDUs often comprise a large hardware bundle packaged within an MCDU cabinet. Thus, conventional MCDUs can be quite heavy, and any changes to MCDU functionality frequently require hardware redesign and/or hardware reconfiguration.
During operation, a pilot or copilot may manipulate the physical keys comprising the MCDU keypad to navigate to a particular FMS feature or function, such as, for example, an airport identification code, a radio frequency, a flight plan, and the like. Thus, to reach an FMS feature or function, it may be necessary that a pilot utilize the keypad to navigate through a set of menus and/or submenus using the keypad. Therefore, to reach a desired feature or function, a pilot may be required to recall and execute a number of button presses and/or other manual operations using the keypad. This process may consume time and focus, particularly where the pilot is unfamiliar with the feature and/or function being sought. The difficulty associated with locating the desired feature or function may be magnified during circumstances (e.g., landing) that may give rise to a high or fluctuating workload. Moreover, low resolution MCDU displays may fail to support a variety of non-English characters, rendering many existing MCDU displays unusable to pilots who are not trained to use English MCDU displays.
Hence, there is a need for a more pilot-friendly MCDU, such as, for example, a tablet (e.g., a tablet computing device) MCDU. There is further a need for an MCDU having a touchscreen interface, an MCDU that can be reconfigured on the fly without hardware redesign, and an MCDU that supports a variety of non-standard (e.g., non-ASCII) characters. A need further exists for an MCDU that reduces pilot workload, such as an MCDU that accepts voice commands rather than, and/or in addition to, the manual command entries described above. A need further exists for an MCDU that includes wireless communication capability, one or more proximity sensors, one or more accelerometers, and the like.
BRIEF SUMMARY
This summary is provided to describe select concepts in a simplified form that are further described in the Detailed Description. This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Systems and methods for controlling a flight deck multifunction control display unit are disclosed. In various embodiments, the systems may comprise a flight management system and a command database that stores a plurality of voice commands and a plurality of multifunction control display unit commands. In various embodiments, each voice command is associated with one of the plurality of multifunction control display unit commands. The systems may further comprise a pilot voice interface configured to receive a voice command from a pilot and transmit the voice command to the multifunction control display unit. The multifunction control display unit can receive the voice command from the pilot voice interface and, in response, access the command database to identify a multifunction control display unit command in the command database that is associated with the voice command, and transmit the multifunction control display unit command to the flight management system or other MCDU driven avionics system (or “FMS/AS”) for execution thereby.
Likewise, in various embodiments, the methods may comprise accessing, by a multifunction control display unit processor, a command database in response to receiving a voice command, identifying, by the processor, a multifunction control display unit command in the command database that is associated with the voice command, and transmitting, by the processor, the multifunction control display unit command to a flight management system or other MCDU driven avionics system that is communicatively coupled to the processor for execution by the flight management system or other MCDU driven avionics system.
Furthermore, other desirable features and characteristics of the systems and methods disclosed herein will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the preceding background.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
FIG. 1 illustrates an interior perspective view of an aircraft cockpit having two conventional, prior art, keypad controlled MCDUs;
FIG. 2 illustrates, in accordance with various embodiments, a block diagram of a system having a voice responsive, tablet, MCDU;
FIG. 3 illustrates, in accordance with various embodiments, a perspective view of a system having a voice responsive, tablet, MCDU;
FIG. 4 illustrates, in accordance with various embodiments, a cross-sectional view of a voice responsive, tablet, MCDU extending from a conventional MCDU cabinet;
FIG. 5 illustrates, in accordance with various embodiments, a perspective view of a voice responsive, tablet, MCDU extending from a conventional MCDU cabinet;
FIG. 6 illustrates, in accordance with various embodiments, an interior perspective view of an aircraft cockpit having two voice responsive, tablet, MCDUs;
FIG. 7 illustrates, in accordance with various embodiments, a process for selecting an MCDU command based upon a voice command;
FIG. 8A illustrates, in accordance with various embodiments, a table associating a plurality of voice commands with a plurality of MCDU commands;
FIG. 8B illustrates, in accordance with various embodiments, a table associating a plurality of voice commands with a plurality of MCDU commands; and
FIG. 8C illustrates, in accordance with various embodiments, a table associating a plurality of voice commands with a plurality of MCDU commands.
DETAILED DESCRIPTION
The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Thus, any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. All of the embodiments described herein are exemplary embodiments provided to enable persons skilled in the art to make or use the invention and not to limit the scope of the invention which is defined by the claims. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary, or the following detailed description.
As described above, and with reference to FIG. 1, conventional flight management systems as well as other avionics driven systems 100 can be accessed by way of and controlled by one or more conventional MCDUs 102 and/or 104. Traditionally, a pilot or copilot may manipulate the physical keys comprising an MCDU keypad to navigate to a particular FMS/AS feature or function, such as, for example, an airport identification code, a radio frequency, a flight plan, and the like.
Thus, to reach an FMS/AS feature or function, it may be necessary that a pilot utilize the keypad to navigate through a set of menus and/or submenus using the keypad. In other words, to reach a desired feature or function, a pilot may be required to recall and execute a number of button presses and/or other manual operations using the keypad. This multi-step process may consume time and focus, particularly where the pilot is unfamiliar with the feature and/or function being sought. The difficulty associated with locating the desired feature or function may be magnified during circumstances (e.g., landing) that may give rise to a high or fluctuating workload. Moreover, low resolution MCDU displays may fail to support a variety of non-ASCII characters, rendering many existing MCDU displays unusable to pilots who require non-ASCII characters.
In addition, conventional MCDUs 102 and/or 104 are typically housed within a cabinet or MCDU box. The circuitry comprising these MCDUs 102 and/or 104 is frequently compact and heavy, and any changes to the functionality of a conventional MCDU 102 and/or 104 most often require that the circuitry comprising the MCDU 102 and/or 104 be reconfigured.
A system comprising a touchscreen, voice responsive, MCDU is disclosed. The disclosed MCDU can be mechanically coupled to an outer surface of an existing MCDU cabinet. In addition, the disclosed MCDU can be communicatively coupled to and/or comprise a command database that includes a plurality of MCDU commands. The system can further comprise a pilot voice interface.
A pilot can provide a voice command to the MCDU via the voice interface, and the MCDU can identify, within the command database, an MCDU command that corresponds to the voice command. Thus, in various embodiments, a pilot can be relieved of the necessity of navigating a hierarchy of menus and/or submenus by way of an MCDU keypad. Rather, the MCDU can identify the requested MCDU command in the command database by matching the voice command (with the aid of voice recognition software) with a particular MCDU command. The MCDU can, in addition, transmit this command to the FMS/AS for execution.
However, the disclosed voice responsive MCDU (which can comprise a tablet computing device, such as a touchscreen tablet computing device) can also display a replica of a conventional MCDU keypad, which can employ a menu logic and structure that is identical and/or substantially similar to that of a conventional MCDU—that is, an MCDU with which a pilot has prior experience. Accordingly, although a pilot can provide a voice command to bypass a manual MCDU command selection process, in various embodiments, a pilot can also manipulate a plurality of virtual keys displayed by the MCDU touchscreen display in a manner similar to and/or identical to the manner in which the pilot formerly manipulated the physical keys of a conventional MCDU keypad. Thus, a pilot can choose a command selection option with which the pilot is most comfortable.
The system can further comprise a conversion device capable of being installed within an empty (or emptied out) MCDU cabinet. The conversion device can be, in comparison to existing MCDU circuitry, much lighter in weight and can consume less space within the MCDU cabinet as well.
The conversion device can, in various embodiments, convert an MCDU command (or other MCDU transmitted or received communication) between a universal serial bus (“USB”) data standard and an avionics data bus data standard, such as an Aeronautical Radio, Incorporated (“ARINC”) 429 data standard. Moreover, in various embodiments, the conversion device can convert an MCDU command (or other MCDU transmitted or received communication) between an avionics data bus standard, such as an ARINC 429 standard and a wireless communication standard (e.g., a WIFI standard, a BLUETOOTH standard, and the like). Thus, the conversion device can format data received from the disclosed MCDU, so that the data can be transmitted on an existing avionics data bus without adaptation of or reconfiguration of the existing data bus. In other words, a conventional MCDU can be easily swapped for the disclosed MCDU and conversion device, and the existing avionics data bus, FMS, and other components communicatively coupled to the MCDU can remain unchanged.
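The patent does not specify how the conversion device packs data into bus words, but the general ARINC 429 word layout (8-bit label, 2-bit SDI, 19-bit data field, 2-bit SSM, odd parity in bit 32) can be sketched as follows. This is an illustrative encoding sketch only, not part of the disclosure; the label and data values are arbitrary.

```python
# Illustrative sketch of packing fields into a 32-bit ARINC 429 word.
# Field layout (assumed, standard): bits 1-8 label, 9-10 SDI,
# 11-29 data, 30-31 SSM, 32 odd parity.

def encode_arinc429_word(label: int, data: int, sdi: int = 0, ssm: int = 0) -> int:
    """Pack fields into a 32-bit ARINC 429 word with odd parity."""
    if not (0 <= label < 256 and 0 <= data < 2**19 and 0 <= sdi < 4 and 0 <= ssm < 4):
        raise ValueError("field out of range")
    word = label | (sdi << 8) | (data << 10) | (ssm << 29)
    # Odd parity: set bit 32 so the total number of set bits is odd.
    if bin(word).count("1") % 2 == 0:
        word |= 1 << 31
    return word

# Hypothetical label/data values for illustration.
word = encode_arinc429_word(label=0o205, data=12925)
```

A real conversion device would also handle the label bit-reversal used on the wire and the USB or wireless framing on the MCDU side, which are omitted here.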
With reference now to FIG. 2, an aircraft control system 200 is disclosed. The system 200 can include an FMS/AS 204, a voice responsive MCDU 206 (such as a touchscreen and/or tablet computing device), a command database 208, a pilot voice interface 210, and/or a data conversion device 216.
In various embodiments, the FMS/AS 204 can comprise any computer-based system, computer processor, and the like that is capable of and/or configured to receive or transmit an MCDU command (or other data) from or to an MCDU 206 and/or command data from or to the command database 208 and/or converted MCDU command data from or to the data conversion device 216, and, based upon and/or in response to the transmitted or received command(s) and/or command data, manage any of a variety of in-flight and/or ground-based aircraft functionality.
The MCDU 206 can comprise any device capable of and/or configured to communicate with the FMS/AS 204 (e.g., any device configured to transmit or receive data, such as, for example, an MCDU command). More particularly, the MCDU 206 can comprise any device that can receive a voice command or voice command data from a pilot voice interface 210 and/or identify an MCDU command in a command database 208 based upon the received voice command. The MCDU 206 can further comprise any device capable of and/or configured to display a virtual keypad that corresponds, substantially similarly and/or identically, to a conventional MCDU physical keypad. Thus, an MCDU 206 can comprise a tablet computing device, a touchscreen computing device, and/or any combination of these and other similar devices. The MCDU can further include a variety of wireless communications functionality, one or more proximity sensors, one or more accelerometers, and the like.
Thus, in various embodiments, the MCDU 206 can comprise a voice responsive device, such as, for example, a tablet computing device (which may, again, comprise a touchscreen). The MCDU 206 can comprise a processor (or processors) 212 and/or a voice recognition module 214. The voice recognition module 214 can comprise any hardware, software, and/or tangible, non-transitory computer-readable media capable of execution by the processor 212 to process a voice command received from the pilot voice interface 210. The MCDU 206 is communicatively coupled to the FMS/AS 204 and/or the command database 208 and/or the pilot voice interface 210.
As described briefly above, conventional MCDU 102 and/or 104 displays may lack resolution sufficient to render non-ASCII characters. The MCDU 206 display, however, incorporates a display, such as a tablet display, that is capable of displaying non-ASCII characters. Thus, the MCDU 206 is configured to display a variety of more complex characters and symbols than a conventional MCDU 102 and/or 104 display, thereby enabling the use of the MCDU 206 in a variety of aircraft.
In various embodiments, as described herein, the MCDU 206 is communicatively coupled to the FMS/AS 204 through a data conversion device 216. The data conversion device 216 can comprise any device that enables communication between the MCDU 206 and the FMS/AS 204. For instance, the data conversion device 216 can comprise any device that can convert MCDU commands (as well as other data) from a USB data standard or format to an avionics data bus data standard or format, such as an ARINC 429 data standard or format. The data conversion device can also convert MCDU commands (and other data) from a wireless communications standard to an avionics data bus standard (e.g., again, an ARINC 429 data standard).
The command database 208 can comprise any tangible, non-transitory, memory or computer-readable media capable of and/or configured to store command data, such as, for example, one or more MCDU commands and/or one or more voice and/or textual commands. The command database 208 preferably stores an association or link between one or more MCDU commands and/or one or more voice and/or textual commands. The command database 208 can reside physically apart from the MCDU 206 and/or can comprise a part or component of the MCDU 206.
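The association between voice/text commands and MCDU commands that the command database 208 stores can be sketched as a simple lookup structure. This is a minimal illustration, not the patent's implementation; the command strings are taken from the example rows discussed later, and the normalization step is an assumption.

```python
# Minimal sketch of a command database mapping recognized voice/text
# commands to MCDU commands (illustrative only).

class CommandDatabase:
    def __init__(self):
        self._by_voice = {}

    @staticmethod
    def _normalize(voice_text: str) -> str:
        # Normalize case and whitespace so recognizer output matches reliably.
        return " ".join(voice_text.upper().split())

    def associate(self, voice_text: str, mcdu_cmd: str) -> None:
        self._by_voice[self._normalize(voice_text)] = mcdu_cmd

    def lookup(self, voice_text: str):
        return self._by_voice.get(self._normalize(voice_text))

db = CommandDatabase()
db.associate("frequency one twenty nine point two five", "#1129.25")
```

Whether the database lives on the MCDU itself or apart from it, as the text notes, only changes where this structure is hosted, not its shape.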
The pilot voice interface 210 can comprise any device capable of and/or configured to receive and/or transmit pilot voice commands and/or data. For example, the pilot voice interface 210 can comprise a microphone. The pilot voice interface 210 can further comprise a pilot headset that includes a microphone into which the pilot may express a spoken voice command.
With respect to FIG. 3, a perspective view of a portion of the aircraft control system 200 is shown. As described herein, the system 200 includes the MCDU 206, which can be communicatively coupled to the data conversion device 216. The MCDU 206 can be communicatively coupled to the data conversion device 216 in any suitable manner. For example, the MCDU 206 can be communicatively coupled to the data conversion device 216 via a wired connection 304, via a wireless connection 306, and the like. The MCDU 206 can be similarly communicatively coupled to the pilot voice interface 210. For instance, the MCDU 206 can be communicatively coupled to the pilot voice interface 210 via a wired connection 308, a wireless connection, and the like.
As described herein, the data conversion device 216 is, in various embodiments, configured to convert MCDU command data that it receives in a USB or wireless data format to an avionics data bus format, such as an ARINC 429 format. Accordingly, as shown, the data conversion device 216 can receive USB or wireless formatted MCDU command data at an input 310, convert the MCDU command data to an avionics data bus data format, and transmit the MCDU command data at its output 312 to one or more existing avionics data bus inputs 314a and/or 314b (each having a variety of pinouts). The MCDU command data may, from here, be communicated over the aircraft avionics communications network to the FMS/AS 204. The data conversion device 216 may further, and in various embodiments, receive data (e.g., from the FMS/AS 204) in an avionics data bus format, convert the data to a USB or wireless format, and transmit the converted data to the MCDU 206.
In addition, as shown, the data conversion device 216 can be retrofitted within a conventional MCDU cabinet 302. More particularly, the hardware and circuitry bundled within a conventional MCDU cabinet 302 can be removed, and the data conversion device 216 inserted within the cabinet. The weight and complexity of the circuitry within the conventional MCDU cabinet 302 can be, in this way, significantly reduced. In addition, as described herein, the introduction of the data conversion device 216 and MCDU 206 to the system 200 may not require modification of the existing avionics data bus and/or any other systems within the aircraft. Rather, the MCDU 206 can be configured, as described, to seamlessly interoperate with existing avionics systems and subsystems.
With attention now to FIG. 4, a cross-sectional view of an MCDU 206 mounted to a conventional MCDU cabinet 302 is shown. The MCDU 206 can be coupled to an outer surface of the conventional MCDU cabinet 302 by any MCDU coupling structure 402 that is configured to, capable of, operable to, or otherwise suitable for the purpose of permitting articulation or motion of the MCDU 206. For example, the MCDU coupling structure 402 can permit the MCDU 206 to move in any direction and/or directions, such as, for example, in the x, y, and/or z direction or directions. Thus, the angle and position of the MCDU 206 can be manipulated by a pilot to achieve an ideal viewing angle, to alter an angle of reflection (e.g., of sunlight) of the MCDU 206 display, and the like.
Turning to FIG. 5, a perspective view of an MCDU 206 removably coupled or mounted to a conventional MCDU cabinet 302 is shown. Here, the MCDU 206 is removably mounted within a holster or case 502, which is mechanically coupled, in turn, to the conventional MCDU cabinet 302. Thus, in various embodiments, a pilot can remove or decouple the MCDU 206 from its holster 502. This permits the pilot to place the MCDU 206 in any location that is convenient to the pilot, such as, for example, in the pilot's lap and/or on the pilot's thigh or knee.
In addition, as shown, a gesture sensitive device 504 can be communicatively coupled to the MCDU 206. The gesture sensitive device 504 can, in various embodiments, communicatively couple to the MCDU 206 through an aperture, opening, or slot formed in the holster 502. The gesture sensitive device 504 can comprise any device capable of or configured to detect a gesture or motion of a pilot, such as a pilot hand wave or other bodily motion. Thus, the gesture sensitive device 504 can comprise, for example, a LEAP MOTION gesture recognition product.
A pilot can, in various embodiments, make a gesture with the body (e.g., a hand) of the pilot, and the gesture sensitive device 504 can detect this gesture and transmit data associated with the gesture (or “gesture data”) to the MCDU 206. The MCDU 206 can receive and interpret (e.g., using gesture interpretation software installed in a tangible, non-transitory, memory of the MCDU 206) the gesture data as an instruction by the pilot to execute a particular instruction.
For example, in various embodiments, a pilot can make a throwing gesture, which the gesture sensitive device 504 can receive and transmit, as gesture data, to the MCDU 206. The MCDU 206 processor 212 can compare this gesture data to a database of predefined stored gesture data, and, if a match is identified between the received gesture data and the stored gesture data, the MCDU 206 processor 212 can execute an instruction associated with the identified stored gesture data, such as, for example, displaying, sharing, moving, translating, or “throwing” an airport moving map or other information from the MCDU 206 display to a larger cockpit display. Moreover, in various embodiments, the pilot can position or hold one or both hands in a particular gesture to indicate that all cockpit chatter should stop and/or to trigger the voice recognition software installed within the MCDU 206 to begin listening for a pilot voice command.
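The compare-against-stored-gestures step can be sketched as a nearest-template match. The feature vectors, threshold, and action names below are entirely hypothetical stand-ins; the patent does not describe how gesture data is represented.

```python
# Illustrative sketch of matching incoming gesture data (simplified to a
# feature vector) against stored templates; feature extraction from the
# gesture sensitive device is assumed and not shown.
import math

STORED_GESTURES = {
    # gesture name: (template feature vector, associated action) -- made-up values
    "throw":  ([1.0, 0.0, 0.8], "send_map_to_cockpit_display"),
    "listen": ([0.0, 1.0, 0.1], "start_voice_listening"),
}

def match_gesture(features, threshold=0.25):
    best_action, best_dist = None, float("inf")
    for name, (template, action) in STORED_GESTURES.items():
        dist = math.dist(features, template)
        if dist < best_dist:
            best_dist, best_action = dist, action
    # Only act when the closest template is close enough.
    return best_action if best_dist <= threshold else None
```

A noisy "throw" reading such as `[0.95, 0.05, 0.75]` falls within the threshold of the throw template, while an ambiguous reading matches nothing and is ignored.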
Thus, with respect to FIG. 6, an aircraft control system 600 comprising two, touchscreen, voice responsive, MCDUs 206a and 206b is shown. One or both of the MCDUs 206a and/or 206b can be touchscreen, voice responsive, MCDUs, as described herein with respect to the MCDU 206. However, in various embodiments, only one of the MCDUs 206a and/or 206b may be a touchscreen, voice responsive, MCDU 206, while the other MCDU can comprise a conventional MCDU, such as the MCDU 102 and/or 104.
With reference to FIG. 7, a process 700 for selecting an MCDU command based upon a voice command is shown. Accordingly, in operation, and as described elsewhere herein, the MCDU 206 receives a voice command from a pilot voice interface 210 (step 702). The MCDU 206 executes voice recognition software to convert the voice command into a text command (step 704). For example, the MCDU 206 processor 212 can read and execute voice recognition software stored in the voice recognition module 214, which can comprise any tangible, non-transitory, computer-readable medium and/or memory. The voice recognition software can be executed by the processor 212 to perform the voice to text conversion described above.
The MCDU 206 further accesses a command database 208 in response to receiving the voice command and/or in response to converting the voice command to a text command (step 706). A command database 208 can include, as shown with reference to FIGS. 8A, 8B, and 8C, a cross-reference table 800. The cross-reference table can comprise any suitable number of columns and/or any suitable number of rows. For example, in various embodiments, the cross-reference table can comprise an MCDU command (“MCDU CMD”) column 802 and/or a voice command (“VOICE CMD”) column 804.
As described herein, an MCDU command 802 can comprise a command that an MCDU 206 can transmit or communicate to an FMS/AS 204 via an existing avionics data bus. In addition, and as described herein, a voice command 804 can comprise a command that a pilot enters via a pilot voice interface 210. Accordingly, the cross-reference table 800 links an MCDU command 802 to a voice command 804. In various embodiments, and as shown at table 800, the voice command 804 can exist in the table 800 as a text command. Thus, as the MCDU 206 processor 212 interfaces with the MCDU 206 voice recognition module 214 to convert a pilot voice command 804 into a text command, the MCDU 206 can access the command database 208 to match a converted voice/text command 804 stored in the command database 208 to an MCDU command 802 (step 708). The MCDU 206 can therefore identify an MCDU command 802 that is associated with a voice command 804. Having identified the associated MCDU command 802, the MCDU 206 can, in various embodiments, transmit or communicate the identified MCDU command 802 to the FMS/AS 204 via an existing avionics data bus, such as via the existing avionics data bus inputs 314a and/or 314b. The FMS/AS 204 can, in response to receiving the MCDU command, communicate, as described herein, with the MCDU 206 to provide the requested information (that is, the information associated with the MCDU command, such as tower frequency information) to the MCDU 206.
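The steps of process 700 described above (receive voice command, convert to text, look up the associated MCDU command, hand it off for transmission) can be sketched as a short pipeline. The recognizer below is a stand-in for the voice recognition module 214, and the table entries are illustrative.

```python
# Simplified sketch of process 700 (steps 702-708); the recognizer and
# the cross-reference table contents are illustrative stand-ins.

COMMAND_TABLE = {
    "FREQUENCY ONE TWENTY NINE POINT TWO FIVE": "#1129.25",
}

def recognize(audio: bytes) -> str:
    # Stand-in for the voice recognition module (step 704).
    return "FREQUENCY ONE TWENTY NINE POINT TWO FIVE"

def handle_voice_command(audio: bytes):
    text = recognize(audio)             # step 704: voice -> text
    mcdu_cmd = COMMAND_TABLE.get(text)  # steps 706-708: database lookup
    if mcdu_cmd is None:
        return None                     # no matching MCDU command
    return mcdu_cmd                     # would then be sent to the FMS/AS
```

In the disclosed system the returned command would be formatted by the conversion device 216 and transmitted on the avionics data bus rather than simply returned.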
Accordingly, as described above, a pilot can utilize the MCDU 206 (and its associated voice recognition module 214 and processor 212) to bypass a menu hierarchy. In other words, although a pilot can utilize the MCDU 206 touchscreen to navigate through a menu hierarchy to reach a desired MCDU command 802, in various embodiments, the pilot can issue a voice command 804, which, in response to conversion to a text command by the voice recognition module 214 and processor 212, can enable instant (or single-step) navigation to a corresponding MCDU command 802. This can permit a pilot who has difficulty recalling a rarely used MCDU command 802 (and/or the menu navigation process required to reach the MCDU command 802) to reach the MCDU command 802 more rapidly and more easily. This feature can be of particular value to a pilot, for example, during an operation requiring increased focus or increased workload, such as during a landing operation.
Moreover, as shown, for example, with respect to rows 806a and 806b, a particular MCDU command can be associated, in the cross-reference table 800, with any number of relatively synonymous voice commands 804. For instance, the MCDU command #1129.25, representing an instruction by a pilot to communicate at a radio frequency of 1129.25 KHz/MHz, can be associated with a variety of synonymous voice commands, such as voice commands 806a (“FREQUENCY ONE TWENTY NINE POINT TWO FIVE”) and/or 806b (“FREQUENCY ONE HUNDRED TWENTY NINE POINT TWO FIVE”), both of which result, as shown, in the same MCDU command (i.e., #1129.25). Likewise, in various embodiments, a voice command may comprise a dynamic voice command. For instance, a pilot may not know at which frequency an air traffic control tower operates. A voice command, such as the voice command “DWF Tower Frequency” 806c, can be included in the table 800 (FIG. 8B), and the system 200 can, based upon this voice command (or some variation of the words in this command, e.g., “DWF” and/or “tower” and/or “frequency”), search for the radio frequency of a nearby tower. Thus, the system 200 can accommodate a variety of voice commands 804, each of which may correspond, depending upon a pilot's choice of words, to a same MCDU command.
In addition, the system 200 can recognize a variety of disparate or separate voice commands, as described above, to determine which MCDU command a pilot is seeking, notwithstanding that the pilot has not spoken a particular voice command (as shown in FIGS. 8A-8C) precisely as listed. Again, for example, a pilot may simply speak the word “DWF” or “Frequency,” and the system 200 can determine, based upon the closest match among the voice commands shown in FIGS. 8A-8C, which MCDU command the pilot is seeking.
A further advantage of system 200 can arise, particularly from a business perspective, in that an MCDU 206 customer can request a variety of customized functionality (e.g., a customized software key or “soft-key”). The soft-key can be associated with any functionality that the MCDU 206 customer requests, such as for example, a particular airport MCDU command code that is frequently used by the MCDU 206 customer.
Software enhancements and/or reconfigurations are easily accomplished with the system 200, in contrast to conventional MCDUs 102 and/or 104, which may require actual physical reconstruction or reconfiguration (i.e., time-consuming circuit/hardware reconfiguration); alterations to the MCDU 206 software can be performed relatively quickly by comparison. Moreover, in various embodiments, the cross-reference table 800 can be modified virtually on-the-fly to accommodate any MCDU 206 customer request. Changes of this nature (i.e., changes to the table 800) require very little time and effort, because even the MCDU 206 software need not change. The system 200 therefore affords a large number of advantages, as described herein, over conventional flight management systems 100.
Those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. Some of the embodiments and implementations are described above in terms of functional and/or logical block components (or modules) and various processing steps. However, it should be appreciated that such block components (or modules) may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. In addition, those skilled in the art will appreciate that embodiments described herein are merely exemplary implementations.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Numerical ordinals such as “first,” “second,” “third,” etc. simply denote different singles of a plurality and do not imply any order or sequence unless specifically defined by the claim language. The sequence of the text in any of the claims does not imply that process steps must be performed in a temporal or logical order according to such sequence unless it is specifically defined by the language of the claim. The process steps may be interchanged in any order without departing from the scope of the invention as long as such an interchange does not contradict the claim language and is not logically nonsensical.
Furthermore, depending on the context, words such as “connect” or “coupled to” used in describing a relationship between different elements do not imply that a direct physical connection must be made between these elements. For example, two elements may be connected to each other physically, electronically, logically, or in any other manner, through one or more additional elements.
While at least one exemplary embodiment has been presented in the foregoing detailed description of the invention, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention. It being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims.

Claims (8)

What is claimed is:
1. A system comprising:
a tablet computing device configured to render and implement a multifunction control display unit (“MCDU”);
a data conversion device in operable communication with the tablet computing device and configured to convert MCDU commands from a universal serial bus (“USB”) standard to an avionics data bus standard;
a gesture sensitive device in operable communication with the tablet computing device and configured to detect hand gestures and transmit gesture data associated with the gesture to the tablet computing device;
an MCDU cabinet having an inner surface and an outer surface, the inner surface defining a cavity having only the data conversion device disposed therein;
a mounting bracket coupled to the outer surface of the MCDU cabinet, the mounting bracket configured to releasably mount the tablet computing device therein, the mounting bracket further configured to be movable, in multiple directions, relative to the MCDU cabinet, to thereby permit movement of the tablet computing device in the multiple directions when the tablet computing device is mounted therein;
a flight management system (FMS) in operable communication with the data conversion device;
a command database, the command database in operable communication with the tablet computing device and storing a plurality of voice commands and a plurality of MCDU commands, each voice command associated with one of the plurality of MCDU commands; and
a pilot voice interface configured to receive a voice command from a pilot and transmit the voice command to the tablet computing device,
wherein, the tablet computing device is configured to:
interpret the gesture data as an instruction to execute,
receive the voice command transmitted by the pilot voice interface,
access the command database in response to the voice command to identify an MCDU command in the command database that is associated with the voice command and that is executable by the FMS,
at least selectively bypass, in response to the voice command, selection of a plurality of manual MCDU commands, and
transmit the MCDU command to the data conversion device for transmission to, and execution by, the FMS.
2. The system of claim 1, the tablet computing device comprising a voice recognition module, the voice recognition module comprising voice recognition software.
3. The system of claim 1, wherein the tablet computing device comprises a processor and a tangible, non-transitory, processor-readable memory that stores voice recognition software, wherein the processor is configured to execute the voice recognition software to convert the voice command into a text command.
4. The system of claim 3, wherein the tablet computing device is configured to match the text command to one of the plurality of MCDU commands.
5. The system of claim 1, wherein the tablet computing device is communicatively coupled to an existing avionics data bus and is configured to transmit the MCDU command by way of the existing avionics data bus.
6. The system of claim 1, wherein the avionics data bus standard is an Aeronautical Radio, Incorporated (“ARINC”) 429 standard.
7. The system of claim 1, wherein the tablet computing device is further configured to communicate with the data conversion device via at least one of a wired connection or a wireless connection.
8. The system of claim 1, further comprising:
a plurality of MCDU driven avionics systems in operable communication with the tablet computing device,
wherein the tablet computing device is further configured to identify one or more MCDU commands executable by the plurality of MCDU driven avionics systems and to selectively transmit the MCDU command to at least one of the plurality of MCDU driven avionics systems for execution thereby.
US16/425,179 2014-07-29 2019-05-29 Flight deck multifunction control display unit Active 2035-05-16 USRE49067E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/425,179 USRE49067E1 (en) 2014-07-29 2019-05-29 Flight deck multifunction control display unit

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201462030489P 2014-07-29 2014-07-29
US14/552,757 US9665345B2 (en) 2014-07-29 2014-11-25 Flight deck multifunction control display unit with voice commands
US16/425,179 USRE49067E1 (en) 2014-07-29 2019-05-29 Flight deck multifunction control display unit

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/552,757 Reissue US9665345B2 (en) 2014-07-29 2014-11-25 Flight deck multifunction control display unit with voice commands

Publications (1)

Publication Number Publication Date
USRE49067E1 true USRE49067E1 (en) 2022-05-10

Family

ID=53800823

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/552,757 Active 2035-05-16 US9665345B2 (en) 2014-07-29 2014-11-25 Flight deck multifunction control display unit with voice commands
US16/425,179 Active 2035-05-16 USRE49067E1 (en) 2014-07-29 2019-05-29 Flight deck multifunction control display unit

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/552,757 Active 2035-05-16 US9665345B2 (en) 2014-07-29 2014-11-25 Flight deck multifunction control display unit with voice commands

Country Status (3)

Country Link
US (2) US9665345B2 (en)
EP (1) EP2980787B1 (en)
CN (1) CN105320036A (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9922651B1 (en) * 2014-08-13 2018-03-20 Rockwell Collins, Inc. Avionics text entry, cursor control, and display format selection via voice recognition
FR3048094B1 (en) * 2016-02-22 2018-03-02 Zodiac Aero Electric AIRCRAFT INTERFACE AND METHOD OF CONTROLLING SUCH AN INTERFACE
CN106598431A (en) * 2016-11-30 2017-04-26 中国航空工业集团公司沈阳飞机设计研究所 Device for quickly guiding instruction transmission of unmanned aerial vehicle based on manned aerial vehicle
US10606948B2 (en) * 2017-04-25 2020-03-31 Honeywell International Inc. Predictive user interface for vehicle control system
US10235890B1 (en) * 2018-05-11 2019-03-19 Rockwell Collins, Inc. System for navigating an aircraft display with a mobile device
FR3093703B1 (en) * 2019-03-12 2021-06-18 Airbus Operations Sas AIRCRAFT CABINET WITH AT LEAST ONE FLIGHT DATA DISPLAY SCREEN AND AT LEAST ONE AVIONICS MODULE
US11289094B2 (en) * 2020-04-01 2022-03-29 Honeywell International Inc. System and method for assisting pilot through clearance playback
USD978769S1 (en) * 2020-07-16 2023-02-21 The Boeing Company Flight deck control panel with display
CN112363520A (en) * 2020-10-28 2021-02-12 中国民航大学 Aircraft flight auxiliary driving system based on artificial intelligence technology and control method
FR3134474B1 (en) 2022-04-12 2024-04-12 Thales Sa Device and method for centralized management of a flight management system for aircraft
CN115188225A (en) * 2022-07-07 2022-10-14 中国商用飞机有限责任公司 Method, system and computer readable medium for air traffic control

Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020082844A1 (en) * 2000-12-20 2002-06-27 Van Gestel Henricus Antonius Wilhelmus Speechdriven setting of a language of interaction
US20070189328A1 (en) 2006-02-16 2007-08-16 Honeywell International Inc. Enhanced system and method for air traffic control
US7418319B2 (en) 2004-03-31 2008-08-26 The Boeing Company Systems and methods for handling the display and receipt of aircraft control information
US20080221886A1 (en) * 2007-01-03 2008-09-11 Airbus France Method and system for the entry of flight data for an aircraft, transmitted between a crew on board the aircraft and ground staff
US20090093953A1 (en) * 2007-10-08 2009-04-09 Thorsten Wiesemann Method and system for dynamically determining and displaying navigation information
US20100152924A1 (en) * 2008-12-12 2010-06-17 Honeywell International Inc. Next generation electronic flight bag
US20100305786A1 (en) * 2009-05-28 2010-12-02 The Boeing Company Method and system for approach decision display
US20110101058A1 (en) 2009-05-28 2011-05-05 Tom Heckman Pivot mount assembly
US20110125503A1 (en) * 2009-11-24 2011-05-26 Honeywell International Inc. Methods and systems for utilizing voice commands onboard an aircraft
US20110264313A1 (en) * 2010-04-22 2011-10-27 Honeywell International Inc. Flight planning with digital notam
US20120022778A1 (en) * 2010-07-22 2012-01-26 Honeywell International Inc. Systems and methods for searching and displaying flight plans
US8159464B1 (en) * 2008-09-26 2012-04-17 Rockwell Collins, Inc. Enhanced flight display with improved touchscreen interface
US20120299839A1 (en) 2011-05-27 2012-11-29 Honeywell International Inc. Aircraft user interfaces with multi-mode haptics
US20120310450A1 (en) * 2011-06-06 2012-12-06 Honeywell International Inc. Methods and systems for briefing aircraft procedures
US20120316766A1 (en) * 2011-06-10 2012-12-13 Lafon Stephanie Method for creation of a taxiing route over an airport area and associated device
CN202694151U (en) 2011-12-16 2013-01-23 新时代集团国防科技研究中心 Control terminal device for unmanned aircraft
US8364328B2 (en) 2011-04-14 2013-01-29 Hedrick Geoffrey S M Avionics data entry devices
US20130124034A1 (en) * 2011-11-14 2013-05-16 Arinc Incorporated Method and apparatus for remote e-enabled aircraft solution management using an electronic flight bag (efb)
US20130187869A1 (en) * 2012-01-23 2013-07-25 Research In Motion Limited Electronic device and method of controlling a display
US20130204469A1 (en) * 2012-02-03 2013-08-08 Rosemount Aerospace Inc. System and method for real-time aircraft performance monitoring
US20130346081A1 (en) * 2012-06-11 2013-12-26 Airbus (Sas) Device for aiding communication in the aeronautical domain
CA2831509A1 (en) 2013-10-30 2014-01-01 Westjet Airlines Ltd. Integrated communication and application system for aircraft
US20140107871A1 (en) 2012-10-15 2014-04-17 The Boeing Company Turbulence Mitigation for Touch Screen Systems
US20140214246A1 (en) * 2013-01-28 2014-07-31 Honeywell Internationall Inc. Electronic flight bag systems and methods for verifying correct takeoff performance data entry
US20140222254A1 (en) * 2012-03-08 2014-08-07 Asig, Llc Method and apparatus for a passive avionics data transfer device for use with cots tablet equipment deployed as class i or ii electronic flight bag systems
US20140274217A1 (en) * 2013-03-14 2014-09-18 Samsung Electronics Co., Ltd. Method and apparatus for operating electronic device with cover
US20150016777A1 (en) * 2012-06-11 2015-01-15 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US20150123912A1 (en) * 2013-11-06 2015-05-07 The Boeing Company Flight Deck Touch Screen Interface for Interactive Displays
US9132913B1 (en) * 2013-09-26 2015-09-15 Rockwell Collins, Inc. Simplified auto-flight system coupled with a touchscreen flight control panel
US20160179327A1 (en) * 2014-09-03 2016-06-23 Quaero Ltd. Human machine interface device for aircraft
US9377852B1 (en) * 2013-08-29 2016-06-28 Rockwell Collins, Inc. Eye tracking as a method to improve the user interface
US9530318B1 (en) * 2015-07-28 2016-12-27 Honeywell International Inc. Touchscreen-enabled electronic devices, methods, and program products providing pilot handwriting interface for flight deck systems
US9534922B1 (en) * 2012-09-25 2017-01-03 Rockwell Collins, Inc. Aircraft recording system, device, and method responsive to a new command not stated in a flight plan
US20170251501A1 (en) * 2016-02-26 2017-08-31 The Boeing Company Method and system for integration of portable devices with flight deck displays
US9830910B1 (en) * 2013-09-26 2017-11-28 Rockwell Collins, Inc. Natrual voice speech recognition for flight deck applications
US9922651B1 (en) * 2014-08-13 2018-03-20 Rockwell Collins, Inc. Avionics text entry, cursor control, and display format selection via voice recognition

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120010765A1 (en) * 2010-07-07 2012-01-12 Honeywell International Inc. System for displaying a procedure to an aircraft operator during a flight of an aircraft
US8494766B2 (en) * 2011-01-07 2013-07-23 Ge Aviation Systems, Llc Flight management system with integrated tactical commands for use with an aircraft and method of operating same
CN102837649A (en) * 2011-06-23 2012-12-26 华晶科技股份有限公司 Multi-module identification system and control method thereof
US20130321169A1 (en) * 2012-05-30 2013-12-05 Honeywell International Inc. Airport surface collision-avoidance system (ascas)
US20140070965A1 (en) * 2012-09-12 2014-03-13 Honeywell International Inc. Systems and methods for shared situational awareness using telestration
KR101459441B1 (en) * 2012-12-18 2014-11-07 현대자동차 주식회사 System and method for providing a user interface using finger start points shape recognition in a vehicle
CN203632672U (en) * 2013-12-27 2014-06-04 山东太古飞机工程有限公司 Onboard satellite communication system

Patent Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020082844A1 (en) * 2000-12-20 2002-06-27 Van Gestel Henricus Antonius Wilhelmus Speechdriven setting of a language of interaction
US7418319B2 (en) 2004-03-31 2008-08-26 The Boeing Company Systems and methods for handling the display and receipt of aircraft control information
US20070189328A1 (en) 2006-02-16 2007-08-16 Honeywell International Inc. Enhanced system and method for air traffic control
US7580377B2 (en) * 2006-02-16 2009-08-25 Honeywell International Inc. Systems and method of datalink auditory communications for air traffic control
US20080221886A1 (en) * 2007-01-03 2008-09-11 Airbus France Method and system for the entry of flight data for an aircraft, transmitted between a crew on board the aircraft and ground staff
US20090093953A1 (en) * 2007-10-08 2009-04-09 Thorsten Wiesemann Method and system for dynamically determining and displaying navigation information
US8159464B1 (en) * 2008-09-26 2012-04-17 Rockwell Collins, Inc. Enhanced flight display with improved touchscreen interface
US20100152924A1 (en) * 2008-12-12 2010-06-17 Honeywell International Inc. Next generation electronic flight bag
US20110101058A1 (en) 2009-05-28 2011-05-05 Tom Heckman Pivot mount assembly
US20100305786A1 (en) * 2009-05-28 2010-12-02 The Boeing Company Method and system for approach decision display
US20110125503A1 (en) * 2009-11-24 2011-05-26 Honeywell International Inc. Methods and systems for utilizing voice commands onboard an aircraft
EP2337024A1 (en) 2009-11-24 2011-06-22 Honeywell International Inc. Methods and systems for utilizing voice commands onboard an aircraft
US20110264313A1 (en) * 2010-04-22 2011-10-27 Honeywell International Inc. Flight planning with digital notam
US20120022778A1 (en) * 2010-07-22 2012-01-26 Honeywell International Inc. Systems and methods for searching and displaying flight plans
US8364328B2 (en) 2011-04-14 2013-01-29 Hedrick Geoffrey S M Avionics data entry devices
US20120299839A1 (en) 2011-05-27 2012-11-29 Honeywell International Inc. Aircraft user interfaces with multi-mode haptics
US20120310450A1 (en) * 2011-06-06 2012-12-06 Honeywell International Inc. Methods and systems for briefing aircraft procedures
US20120316766A1 (en) * 2011-06-10 2012-12-13 Lafon Stephanie Method for creation of a taxiing route over an airport area and associated device
US20130124034A1 (en) * 2011-11-14 2013-05-16 Arinc Incorporated Method and apparatus for remote e-enabled aircraft solution management using an electronic flight bag (efb)
CN202694151U (en) 2011-12-16 2013-01-23 新时代集团国防科技研究中心 Control terminal device for unmanned aircraft
US20130187869A1 (en) * 2012-01-23 2013-07-25 Research In Motion Limited Electronic device and method of controlling a display
US20130204469A1 (en) * 2012-02-03 2013-08-08 Rosemount Aerospace Inc. System and method for real-time aircraft performance monitoring
US20140222254A1 (en) * 2012-03-08 2014-08-07 Asig, Llc Method and apparatus for a passive avionics data transfer device for use with cots tablet equipment deployed as class i or ii electronic flight bag systems
US20150016777A1 (en) * 2012-06-11 2015-01-15 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US20130346081A1 (en) * 2012-06-11 2013-12-26 Airbus (Sas) Device for aiding communication in the aeronautical domain
US9534922B1 (en) * 2012-09-25 2017-01-03 Rockwell Collins, Inc. Aircraft recording system, device, and method responsive to a new command not stated in a flight plan
US20140107871A1 (en) 2012-10-15 2014-04-17 The Boeing Company Turbulence Mitigation for Touch Screen Systems
US20140214246A1 (en) * 2013-01-28 2014-07-31 Honeywell Internationall Inc. Electronic flight bag systems and methods for verifying correct takeoff performance data entry
US20140274217A1 (en) * 2013-03-14 2014-09-18 Samsung Electronics Co., Ltd. Method and apparatus for operating electronic device with cover
US9377852B1 (en) * 2013-08-29 2016-06-28 Rockwell Collins, Inc. Eye tracking as a method to improve the user interface
US9132913B1 (en) * 2013-09-26 2015-09-15 Rockwell Collins, Inc. Simplified auto-flight system coupled with a touchscreen flight control panel
US9830910B1 (en) * 2013-09-26 2017-11-28 Rockwell Collins, Inc. Natrual voice speech recognition for flight deck applications
CA2831509A1 (en) 2013-10-30 2014-01-01 Westjet Airlines Ltd. Integrated communication and application system for aircraft
US20150123912A1 (en) * 2013-11-06 2015-05-07 The Boeing Company Flight Deck Touch Screen Interface for Interactive Displays
US9922651B1 (en) * 2014-08-13 2018-03-20 Rockwell Collins, Inc. Avionics text entry, cursor control, and display format selection via voice recognition
US20160179327A1 (en) * 2014-09-03 2016-06-23 Quaero Ltd. Human machine interface device for aircraft
US9530318B1 (en) * 2015-07-28 2016-12-27 Honeywell International Inc. Touchscreen-enabled electronic devices, methods, and program products providing pilot handwriting interface for flight deck systems
US20170251501A1 (en) * 2016-02-26 2017-08-31 The Boeing Company Method and system for integration of portable devices with flight deck displays

Non-Patent Citations (19)

* Cited by examiner, † Cited by third party
Title
"First Office Action" EP 15 176 862, Nov. 30, 2015. (Year: 2015). *
"First Office Action", CN 2015 10571254, dated Oct. 24, 2018. (Year: 2018). *
"Fourth Office Action", EP 15 176 862, dated Jul. 19, 2019. (Year: 2019). *
"Response to Third Office Action", EP 15 176 862, dated Jun. 26, 2019. (Year: 2019). *
"Second Office Action", EP 15 176 862, dated Jan. 31, 2018. (Year: 2018). *
"Third Office Action", EP 15 176 862, dated Nov. 15, 2018. (Year: 2018). *
Alepetite, A., D. Zammit-Mangion, M. Fabbri, R. Fogh, C. Zammit, and I. Agius "Direct Tactile Manipulation of the Flight Plan in a Modern Aircraft Cockpit", Proceedings of the ACM International Conference on Human-Computer Interaction in Aerospace (HCI-Aero 2012), Sep. 12-14, 2012. (Year: 2012). *
Aviation Platform; New Cockpit—Goodie: The Tablet; [Retrieved from Internet Jul. 22, 2014] [www.aviationplatform.com/index/php/tech-a-developments].
Ballard Technologies "USB 429 ARINC 429/717 Interface Technical Data Sheet", 2010. (Year: 2010). *
Clark, N. "Touch Screens Are Tested for Piloting Passenger Jets", New York Times, Jul. 5, 2013. (Year: 2013). *
EP Extended Search Report for Application No. 15176862.9-1910 dated Nov. 30, 2015.
Flight Focus; Enhanced Electronic Flight Bag, Optimizing Flight Operations with Connectivity; [Retrieved from Internet Jul. 22, 2014] [www.flightfocus.net].
http://www.ballardtech.com/products.aspx/USB429/, retrieved Sep. 10, 2011. *
Jedruscek, M., J. Gauci, A. Muscat, and D. Zammit-Mangion "A new HMI for Aircraft Guidance using Touch Technologies", Proceedings of the 31st EAAP Conference, Sep. 2014. (Year: 2014). *
MCDU Image dated Aug. 21, 2014, retrieved from https://www.google.com/search?q=ipadmcdu&safe=active&rlz=1C1CHFX_enUS549US549&espv=2&source=lnms&tbm=isch&sa=X&ei=xe9PVKKuGuX58AHz2YGABQ&ved=0CAYQ_AUoAQ&biw=1280&bih=909.
MCDU image dated Aug. 21, 2014, retrieved from https://www.google.com/search?q=ipadmcdu&safe=active&rlz=1C1CHFX_enUS549US549&espv=2&source=nms&tbm=isch&sa=X&ei=xe9PVKKuGuX58AHz2YGABQ&ved=0CAYQ_AUoAQ&biw=1280&bih=909.
Mertens, M., H.J. Damveld, and C. Borst "An Avionics Touch Screen-Based Control Display Concept", Proceedings of SPIE 8383, 83830L, May 21, 2012. (Year: 2012). *
Moore C.A., et al.; Use of Voice Integrated With Aircraft Cockpit Displays, SID International Symposium, Jun. 5-7, 1984, San Francisco.
One Display for a Cockpit Interactive Solution (ODICIS) Consortium "Project Final Report", Oct. 17, 2012. (Year: 2012). *

Also Published As

Publication number Publication date
US9665345B2 (en) 2017-05-30
EP2980787B1 (en) 2020-02-19
US20160034250A1 (en) 2016-02-04
CN105320036A (en) 2016-02-10
EP2980787A1 (en) 2016-02-03

Similar Documents

Publication Publication Date Title
USRE49067E1 (en) Flight deck multifunction control display unit
EP2930716B1 (en) Speech recognition using electronic device and server
US9171273B2 (en) Integrated electronic checklist display system
JP6348831B2 (en) Voice input auxiliary device, voice input auxiliary system, and voice input method
US9922651B1 (en) Avionics text entry, cursor control, and display format selection via voice recognition
US9830910B1 (en) Natrual voice speech recognition for flight deck applications
US9524142B2 (en) System and method for providing, gesture control of audio information
US9898003B2 (en) External aircraft ground control
WO2016091214A1 (en) Cross-device input method, processing apparatus, input device and intelligent display device
EP3396498B1 (en) Predictive user interface for vehicle control system
EP3124384B1 (en) Touchscreen-enabled electronic devices and methods providing pilot handwriting interface for flight deck systems
EP3486807A1 (en) Methods, systems and apparatuses for improving speech recognition using touch-based predictive modeling
US9989377B2 (en) Method and system for displaying information
EP3838766A1 (en) Methods and systems for electronic checklist data references
EP4095853B1 (en) Dynamic speech recognition methods and systems with user-configurable performance
US11507080B2 (en) Portable aircraft controller devices and systems
US10672280B1 (en) Bimodal user interface system, device, and method for streamlining a user's interface with an aircraft display unit
US11688390B2 (en) Dynamic speech recognition methods and systems with user-configurable performance
US20210193133A1 (en) Information processing device, information processing method, and program
EP4043832A1 (en) System and method for providing interactive electronic charts
US11842646B2 (en) System and method for providing interactive electronic charts
US11790909B2 (en) System and method for smart validation of pre-conditions in executing pilot voice commands
EP3971889A1 (en) System and method for smart validation of pre-conditions in executing pilot voice commands
US20160171123A1 (en) Apparatus, recording medium, terminal and computer program for searching data pattern

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY