US20120174165A1 - Controlling display of content on networked passenger controllers and video display units - Google Patents


Info

Publication number
US20120174165A1
Authority
US
Grant status
Application
Patent type
Prior art keywords
controller
passenger
transceiver
configured
display unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13328224
Other versions
US9060202B2 (en)
Inventor
Christopher K. Mondragon
Brett Bleacher
John J. Daly
David Pook
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thales Avionics Inc
Original Assignee
Thales Avionics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/47815Electronic shopping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/214Specialised server platform, e.g. server located in an airplane, hotel, hospital
    • H04N21/2146Specialised server platform, e.g. server located in an airplane, hotel, hospital located in mass transportation means, e.g. aircraft, train or bus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Structure of client; Structure of client peripherals using peripherals receiving signals from specially adapted client devices
    • H04N21/4126Structure of client; Structure of client peripherals using peripherals receiving signals from specially adapted client devices portable device, e.g. remote control with a display, PDA, mobile phone
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/418External card to be used in combination with the client device, e.g. for conditional access
    • H04N21/4185External card to be used in combination with the client device, e.g. for conditional access for payment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4786Supplemental services, e.g. displaying phone caller identification, shopping application e-mailing

Abstract

A controller is disclosed that controls an entertainment system which includes a video display unit that is separate from the controller. The controller includes a network interface, a display device, and a processing device. The network interface communicates with the video display unit via at least one data network. The processing device communicates a first command over the at least one data network to control a display of first content on the video display unit, and controls a display of second content on the display device of the controller. The second content is displayed concurrently with the first content. Related entertainment systems are disclosed.

Description

    RELATED APPLICATIONS
  • The present application claims the benefit of priority from U.S. Provisional Application No. 61/427,871 entitled “Programmable Passenger Controller With High Resolution Display and Touch Screen Interface” filed Dec. 29, 2010, the disclosure of which is hereby incorporated herein in its entirety by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to electronic entertainment systems and, more particularly, to devices used for remotely controlling entertainment systems.
  • BACKGROUND
  • In-flight entertainment (IFE) systems have been deployed onboard aircraft to provide entertainment for passengers in a passenger cabin. The in-flight entertainment systems typically provide passengers with video and audio programming. Some in-flight entertainment systems include an electronic communications network having a head-end server and seat-end electronics boxes that are coupled with video display units located at passenger seats. The video display units display content that is distributed to the seat-end electronics boxes from the head-end server over the communications network. Controllers facilitate a user's control of the content displayed on the video display units. The controllers typically include remote controls for personal use by passengers at their seats in the aircraft.
  • User interfaces to existing IFE systems may include a touch screen on a dedicated seat display monitor or video display unit disposed at the passenger seat, such as in the seat back in front of the passenger seat. The user interfaces may also include a fixed or tethered remote control unit at a passenger seat that is within reach of the passenger. Remote control units typically include fixed mechanical buttons which provide a variety of functions similar to a typical commercial television remote control. Users manipulate the buttons of the remote control units to produce desired responses at their respective video display units.
  • The process of using the remote control unit to control the display of the respective video display unit requires users to repetitively shift their focus and attention between the remote control unit and the video display unit, which can be inconvenient and problematic. This shifting of focus may slow the users' interactions with the video display unit and may cause some users discomfort. In addition, the tight command-response relationship between manipulation of a button on the remote control and responsive action displayed on the video display unit typically limits users to performing one activity at a time using the remote control unit. For example, users may only be able to interact with one step of an application process or one change to program content on the video display unit at a time using the remote control, even if the video display unit provides a windowed or overlaid display of more than one content item at a time. In addition, a graphical user interface (GUI), which is overlaid on video content displayed on the video display unit, can block a substantial portion of the video display unit's displayable surface, thus interfering with the passenger's viewing experience.
  • In a typical configuration, a remote control unit installed at a passenger seat has a fixed set of mechanical buttons or other physical controls, and cannot be expanded or modified without an expensive and time-consuming hardware replacement. Therefore, a user of the remote control is only able to control the display of content on the video display unit using a fixed method allowed by the mechanical buttons and other physical controls included on the remote control unit in conjunction with fixed programming for prompts, etc., provided via the video display unit. For example, a fixed remote control unit may include the following physical controls:
      • Buttons for audio and video controls for video display unit;
      • Buttons for navigating graphical user interfaces (GUIs) of video display unit;
      • Buttons for game control functions associated with video display unit;
      • Buttons for IFE system services;
      • Buttons for an alpha-numeric keyboard (e.g., QWERTY keyboard); and
      • Buttons for remote control of a reading light and calling a flight attendant.
  • The inflexibility of current remote control units severely limits the ability of an IFE system to accommodate passenger and airline needs. For example, an alpha-numeric keyboard cannot be customized for the different languages of its passengers without great expense.
  • SUMMARY
  • Some embodiments of the present invention are directed to a controller for controlling an entertainment system that includes a video display unit that is separate from the controller. The controller includes a network interface, a display device, and a processing device. The network interface communicates with the video display unit via at least one data network. The processing device communicates a first command over the at least one data network to control a display of first content on the video display unit, and controls a display of second content on the display device of the controller. The second content is displayed concurrently with the first content.
  • In a further embodiment, the controller uses near field communications to identify the video display unit which is to be linked to the controller, and then uses the identity to establish a wired network connection to the video display unit.
  • In a further embodiment, the controller uses near field communications to identify the video display unit which is to be linked to the controller, and then uses the identity to carry out wireless pairing of the controller to the video display unit.
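The NFC-assisted linking described in the two embodiments above, where a near field read identifies the adjacent video display unit before a wired connection or wireless pairing is established, can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the function name and the callable stand-ins for the NFC reader and the link-establishment APIs are hypothetical.

```python
def link_controller(nfc_read, connect_wired, pair_wireless, prefer_wired=True):
    """Uses a near field read to learn which video display unit this
    controller sits next to, then links to it either over a wired network
    connection or by wireless pairing. The callables are illustrative
    stand-ins for hardware/driver APIs."""
    svdu_id = nfc_read()  # e.g., read an RFID tag on the cradle/SVDU
    if prefer_wired:
        return connect_wired(svdu_id)
    return pair_wireless(svdu_id)

# Exercise the sketch with stub callables.
link = link_controller(
    nfc_read=lambda: "SVDU-14B",
    connect_wired=lambda svdu: f"wired:{svdu}",
    pair_wireless=lambda svdu: f"paired:{svdu}",
)
assert link == "wired:SVDU-14B"
```

The same NFC-derived identity selects the endpoint in both embodiments; only the transport (wired vs. paired wireless) differs.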
  • In a further embodiment, the controller establishes a lower transmission data rate wireless communication link and a separate higher transmission data rate wireless communication link between the controller and the video display unit. The lower transmission data rate wireless communication link is used to transmit control commands from the controller to the video display unit or vice versa. The higher transmission data rate wireless communication link is used to transmit content from the video display unit to the controller or vice versa.
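The split between a low-rate command link and a high-rate content link can be modeled as below. This is a minimal in-memory sketch: the link labels, message kinds, and queue-based "links" are illustrative assumptions standing in for two real wireless channels.

```python
class DualLinkSession:
    """Models the paired wireless links between a controller and a video
    display unit: a low-rate link carrying control commands and a
    high-rate link carrying content."""

    def __init__(self):
        # Each "link" is just a labeled queue in this sketch.
        self.links = {"control": [], "content": []}

    def send(self, payload, kind):
        # Commands ride the low-rate link; everything else (media)
        # rides the high-rate link. Returns the link used.
        link = "control" if kind == "command" else "content"
        self.links[link].append(payload)
        return link

session = DualLinkSession()
assert session.send({"cmd": "play", "title": "movie-42"}, "command") == "control"
assert session.send(b"\x00" * 1024, "media") == "content"
```

Keeping commands off the high-rate link means a bulk content transfer cannot delay a time-sensitive control message, which is one plausible motivation for the two-link design.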
  • Some other embodiments of the present invention are directed to an entertainment system for a passenger vehicle that includes a plurality of passenger seats. The entertainment system includes a plurality of passenger controllers, a communication network, and a network address translation router. The passenger controllers are each associated with a different one of the passenger seats. Each of the passenger controllers includes a passenger interface for receiving and outputting information. The communication network communicatively interconnects the passenger controllers. Each of the passenger controllers is assigned a network address that is used for routing information thereto through the communication network. The network address translation router is configured to route a communication packet from an originating one of the passenger controllers through the communication network to a destination one of the passenger controllers by translating between a passenger readable identifier for the destination passenger seat and the network address of the passenger controller associated with the destination passenger seat.
  • In a further embodiment, the network address translation router is further configured to maintain a routing table that programmatically associates a passenger readable identifier for each of the passenger seats with the network address of the passenger controller associated with the passenger seat. The router receives a phone call and/or a text message via the communication network from the originating passenger controller. The phone call and/or the text message contains the passenger readable identifier for the destination passenger seat. The router queries the routing table using the passenger readable identifier to identify the associated network address for the passenger controller associated with the destination passenger seat, and routes the phone call and/or the text message through the communication network to the network address for the passenger controller associated with the destination passenger seat.
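The routing-table translation described above, from a passenger-readable seat identifier to a controller's network address, can be sketched as follows. The class and method names, the address format, and the packet structure are illustrative assumptions, not details from the patent.

```python
class SeatAddressRouter:
    """Translates passenger-readable seat identifiers (e.g., "14B") into
    network addresses so that seat-to-seat calls and text messages can be
    routed through the cabin network."""

    def __init__(self):
        self.routing_table = {}  # seat identifier -> network address

    def register(self, seat_id, network_address):
        # Programmatically associate a seat with its controller's address.
        self.routing_table[seat_id] = network_address

    def route(self, message, destination_seat):
        # Query the table for the destination controller's address and
        # tag the message with it; a real router would then forward the
        # packet over the communication network.
        address = self.routing_table[destination_seat]
        return {"to": address, "payload": message}

router = SeatAddressRouter()
router.register("14B", "02:00:ac:10:00:0e")
packet = router.route("Hello from 12A", "14B")
assert packet["to"] == "02:00:ac:10:00:0e"
```

The passenger only ever types a seat number; the network addresses stay an internal detail of the router.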
  • Other controllers, systems, and methods according to embodiments of the invention will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional controllers, systems, and methods be included within this description, be within the scope of the present invention, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate certain non-limiting embodiment(s) of the invention. In the drawings:
  • FIG. 1A shows a perspective view of an assembled example programmable passenger controller according to some embodiments of the present invention;
  • FIG. 1B shows an exploded perspective view of the example programmable passenger controller of FIG. 1A according to some embodiments of the present invention;
  • FIGS. 2A and 2B show example interior views of the top cover assembly of the programmable passenger controller of FIGS. 1A and 1B;
  • FIGS. 3A and 3B show example front and rear perspective views, respectively, of the passenger controller of FIGS. 1A and 1B assembled into a front cover assembly and a rear cover assembly;
  • FIG. 4 is a block diagram of an entertainment system that includes passenger controllers and other system components which are configured according to some embodiments of the present invention;
  • FIG. 5 is a block diagram of another entertainment system that includes passenger controllers and other system components which are configured according to some embodiments of the present invention;
  • FIG. 6 is a block diagram of a passenger controller that is configured according to some embodiments of the present invention;
  • FIG. 7 is a block diagram of a system component that is configured according to some embodiments of the present invention;
  • FIG. 8 is a combined data flow diagram and flowchart of operations and methods that establish communication pathways between various components of an entertainment system according to some embodiments of the present invention;
  • FIG. 9 is a combined data flow diagram and flowchart of further operations and methods that establish communication pathways between various components of an entertainment system according to some embodiments of the present invention; and
  • FIG. 10 is a block diagram of another entertainment system that is configured according to some embodiments of the present invention.
  • DETAILED DESCRIPTION
  • In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention.
  • Various embodiments of passenger controllers and entertainment systems are described herein which provide benefits over prior systems. For example, some embodiments may provide a flexible entertainment control experience to a passenger that supports multitasking so that the passenger may conduct multiple activities simultaneously using the IFE system. The programmable passenger controller may be easily customizable by the IFE system manufacturer or aircraft operator for different aircraft, customer, and/or passenger requirements. Such customization can include, but is not limited to, changing the languages used for display of textual information, machine generated speech, and/or machine recognized speech from passengers, as well as cultural, airline operator, and flight route customizations. The passenger controller may also facilitate customization of user selectable indicia on a touch screen interface, such as changes to soft button locations/indicia to provide customized touch screen based game controllers which may be automatically reconfigured for use with different game programs that a passenger can select for execution.
  • Although various embodiments of the present invention are explained herein in the context of an in-flight entertainment environment, embodiments of entertainment systems and related controllers are not limited thereto and may be used in other environments, including other vehicles such as ships, buses, trains, and automobiles, as well as buildings such as conference centers, restaurants, businesses, hotels, homes, etc.
  • FIG. 1A illustrates a perspective view of an assembled example passenger controller 100. FIG. 1B illustrates the example passenger controller 100 in an exploded perspective view. FIGS. 2A and 2B illustrate interior views of the top cover assembly 110 of the example passenger controller 100 of FIGS. 1A and 1B. Embodiments of the controller 100 may be installed proximate to a passenger seat to which a separate video display unit is also proximately installed. The controller 100 may be installed so as to be handheld while tethered to the aircraft in the vicinity of the passenger seat, such as by being tethered to a seat back in front of the passenger seat or to a hand rest adjacent to the passenger seat. The controller 100 may be tethered to its installation location, so that the controller 100 may be held in a passenger's hand in a comfortable and convenient position for viewing and manipulation, while not being removable by the passenger from its installation location.
  • In other embodiments, the controller 100 may be installed in a fixed position adjacent to the video display unit in the seat back in front of the respective passenger seat, in the hand rest adjacent to the respective passenger seat, in a swing-out arm adjacent to the respective passenger seat, or in other configurations. In some embodiments, the controller 100 may not be tethered or fixedly installed, but instead be portable and wirelessly connected to the IFE system. Embodiments of the controller 100 may also be installed in various other locations within an aircraft, such as common areas like galleys and lounges. The controller 100 may be automatically locked into a cradle, via a movable solenoid controlled by the controller 100, the SVDU 400, and/or another component, to prevent removal during take-off, landing, taxiing, and/or while the aircraft is parked.
  • In one embodiment, near field communications are used to determine whether the controller 100 has been removed from a cradle or outside a defined range of an associated seating area, and a flight attendant can be notified responsive to that determination. For example, as explained below, the controller 100 may be paired to a near field communication transceiver (e.g., RFID transceiver or Bluetooth transceiver) associated with the cradle and/or the SVDU 400 to form a near field communication link therebetween. The controller 100 and/or the paired SVDU 400/cradle can detect loss of the near field communication link and communicate a responsive alert message to an alert panel monitored by a flight attendant. The flight attendant can thereby be notified when, for example, a controller 100 is not properly stored (e.g., cradled) during takeoff, landing, and/or taxiing of an aircraft, and/or when a controller 100 is being carried off the aircraft.
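The link-loss alerting described above can be sketched as a simple check on the time since the last near field contact, combined with the current flight phase. This is a hedged sketch: the timeout value, the phase names, and the alert strings are assumptions for illustration, not values from the patent.

```python
def cradle_alert(last_nfc_seen, now, flight_phase, timeout=5.0):
    """Returns an alert string for the attendant panel if the near field
    link between the controller and its cradle/SVDU has been lost during
    a phase when the controller must be stowed; otherwise returns None.
    Timestamps are in seconds; timeout and phase names are illustrative."""
    link_lost = (now - last_nfc_seen) > timeout
    if link_lost and flight_phase in {"takeoff", "landing", "taxi"}:
        # Controller removed from its cradle when it should be stowed.
        return "controller not stowed"
    if link_lost and flight_phase == "parked":
        # Controller possibly being carried off the aircraft.
        return "controller leaving aircraft"
    return None

assert cradle_alert(last_nfc_seen=0.0, now=10.0, flight_phase="takeoff") == "controller not stowed"
assert cradle_alert(last_nfc_seen=9.0, now=10.0, flight_phase="takeoff") is None
```

In a deployed system the alert would be sent over the cabin network to the attendant panel rather than returned to the caller.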
  • The controller 100 may include a number of components, some of which are illustrated in FIGS. 1A, 1B, 2A, and 2B. The components of the controller 100 and the arrangement thereof portrayed in FIGS. 1A, 1B, 2A, and 2B are not to be construed as limiting, but are to be considered exemplary to illustrate some embodiments of the invention. In various embodiments, some of the illustrated components may be omitted and/or additional components may be included to provide additional features and/or operations in view of the present disclosure. For example, while touch screen displays (touch sensitive displays) are discussed with reference to the figures, in some embodiments the controllers may not include touch screens, and other technologies besides touch screen technology may be used to provide input to the controller 100.
  • The controller 100 illustrated in FIGS. 1A and 1B includes a top cover assembly 110, a keypad 120, a liquid crystal display (LCD) 130, a mid-frame 140, a printed circuit board assembly (PCBA) 150, an electromagnetic interference (EMI) shield 160, a bottom cover 170, and an access door 180. As illustrated in FIGS. 2A and 2B, the top cover assembly 110 includes a decorative cover 210 and a touch screen 220 installed on an underside thereof to cover the LCD 130. The keypad 120 provides physical keys or buttons which a user may press to provide input to the IFE system using the controller 100. The keypad 120 may provide tactile feedback to the user. The functionality associated with the keys or buttons of the keypad 120 may be programmatically reconfigurable. The LCD 130 may be a high resolution color display that measures approximately 3.8 inches diagonally with 800×480 pixels, but may be larger or smaller with a different number of pixels in other embodiments as known in the art. The LCD 130 may provide a display of video content including soft buttons or programmable controls which a user may touch select using the touch screen 220 to provide input to the IFE system using the controller 100.
  • The mid-frame 140 may provide structural integrity to the controller 100. In the illustrated embodiment, the mid-frame 140 may be aluminum (providing an advantageous balance of low cost and light weight). Other materials that may be used for the mid-frame 140 can include a plastic material, a ceramic material, or another material appropriate and useful in an aircraft environment.
  • The PCBA 150 may include electronic circuits such as a processor, memory, data bus, data communications circuitry, display driver circuitry, audio I/O circuitry, accelerometer, haptic vibration motors, and other circuitry as appropriate to perform the functions of the controller 100. The EMI shield 160 may shield electro-magnetic interference from emanating from the PCBA 150 in the controller 100. The decorative cover 210 and the bottom cover 170 may provide a decorative appearance, comfortable gripping surface, and protection of the interior components of the controller 100. The access door 180 may facilitate easy access to interior components (some of which may not be shown) of the controller 100, such as one or more batteries, switches, maintenance port, and/or removable media storage (e.g., a memory card such as a Secure Digital card).
  • The mid-frame 140 provides mechanical stability and protection for the LCD 130. The controller 100 may be equipped with a LCD 130 that is as large as will fit within its size envelope, which may be regulated by industry standards, such as ARINC 809-1 “3GCN (3rd Generation Cabin Network) Seat Distribution System” which defines the electrical and mechanical interfaces of in-flight entertainment system equipment for the 3rd generation cabin network (ARINC Inc., 2551 Riva Road, Annapolis, Md., http://www.arinc.com). The mid-frame 140 protects the controller 100 and the LCD 130 from passenger abuse such as twisting, applying excessive pressure, dropping, etc. The mid-frame 140 provides a thin yet strong wall surrounding the edges of the LCD 130 to facilitate a large viewable surface area.
  • The touch screen 220 may incorporate a projected capacitive multi-touch screen embedded within or laminated on an underside of the decorative cover 210. In this way, the controller 100 may be constructed to have a thin profile with a sleek appearance. The capacitive touch screen surface of the touch screen 220 may generate a signal indicating a position or x-y coordinate of a passenger's touch on the touch screen 220, and optionally a field strength of the passenger's touch. The signal indicating the position may be used to generate a control signal used to control a video source of the LCD 130 or a respective video display unit of the IFE system with which the controller 100 is operatively associated. The controller 100 may recognize one or more simultaneous touches input by a passenger on the touch screen 220, and may recognize changes in their positions over time. Based on these one or more simultaneous touches, the controller 100 may recognize various gestures such as taps, swipes, strokes, flicks, and pinches, and may perform functions associated with these gestures, such as scrolling screens of data, zooming, selecting menu items, etc.
  • A touch sensitivity of the capacitive touch technology of the touch screen 220 may be adjusted to compensate for a thickness of a transparent or translucent portion of the decorative cover 210 over the touch screen 220, which may include a material such as polarized glass or plastic. For example, the portion of the decorative cover 210 covering the touch screen 220 may include materials, compositions, and construction characteristics to provide a bright and clear display of the LCD 130 to a user. The portion of the decorative cover 210 may include a reflective mirror surface which passengers may use as a personal mirror for viewing themselves when the LCD 130 is turned off or dark. The reflective mirror surface may include materials, compositions, and construction characteristics to avoid excessive glare to the passenger.
  • FIGS. 3A and 3B illustrate front and rear perspective views, respectively, of the controller 100 assembled into a front cover assembly 310 and a rear cover assembly 320. The front cover assembly 310 includes the top cover assembly 110, the keypad 120, the LCD 130, and the mid-frame 140. The rear cover assembly 320 includes the PCBA 150, the EMI shield 160, the bottom cover 170, the access door 180 (not shown), and a local memory assembly 330. The local memory assembly 330 may include the removable media storage. This arrangement of parts into the front cover assembly 310 and the rear cover assembly 320 should not be construed as limiting, however, as the partitioning of parts into assemblies may be different in various embodiments.
  • Example Entertainment Systems Having Wireless and Wired Connections Between Passenger Controller and Other Components
  • FIG. 4 is a block diagram of an entertainment system that includes passenger controllers 100 a-d, seat video display units (SVDUs) 400 a-d, and other system components which are configured according to some embodiments of the present invention. Referring to FIG. 4, the system includes a head end content server 410 that contains content that can be downloaded to the SVDUs 400 a-d and passenger controllers 100 a-d through a data network 420 and/or a wireless router 430. Example content that can be downloaded from the head end content server 410 can include, but is not limited to, movies, TV shows, other video, audio programming, and application programs (e.g., game programs). The data network 420 may be a packet network (e.g., Ethernet), and the wireless router 430 may provide a WLAN (e.g., IEEE 802.11, WiMAX, etc.) and/or a cellular-based network (e.g., a pico cell).
  • When used in an aircraft environment, the SVDUs 400 a-d can be attached to seatbacks so that they face passengers in a following row of seats. As explained above, the passenger controllers 100 a-d may, for example, be tethered by a cable (e.g., a wire/communication cable) to an armrest or an associated SVDU, or may be untethered. Although some embodiments are described herein in the context of a controller that communicates with a seat video display unit, the controller is not limited to communicating with a seat video display unit and may be used with any separate video display unit.
  • The passenger controllers 100 a-d are communicatively connected through wireless and/or wired communication links to different SVDUs 400 a-d. For example, the controller 100 a is connected by a wired communication cable (e.g. serial communication cable) to the SVDU 400 a which, in turn, is connected to the head end content server 410 through a wired communication cable connected to the network 420. The controller 100 b is wirelessly connected through Bluetooth, WLAN, and/or other wireless communication interface directly to the SVDU 400 b. The controller 100 c is indirectly connected to the SVDU 400 c through the wireless router 430. The controller 100 d is connected by another wired communication cable to the SVDU 400 d. The SVDUs 400 b,c,d are connected to the head end content server 410 by a wireless link through the wireless router 430. One or more of the controllers 100 may be connected directly to the network 420 by a wired connection, such as shown for controller 100 a′.
  • In accordance with some embodiments, a passenger can operate a controller 100 to control what content is displayed and/or how the content is displayed on the associated SVDU 400 and/or the controller 100. For example, a passenger can operate the controller 100 to select among movies, games, audio programs, and/or television shows that are listed on the SVDU 400, and can cause a selected movie/game/audio program/television show to be played on the SVDU 400, played on the controller 100, or played on a combination of the SVDU 400 and the controller 100 (e.g., concurrent display on separate screens).
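The choice of playback target, the SVDU, the controller's own display, or both concurrently, can be sketched as a small command dispatcher. The message fields and target names here are illustrative assumptions about how such commands might be encoded, not the patent's protocol.

```python
def dispatch_playback(selection, target):
    """Builds the command(s) a controller would send when the passenger
    chooses where a selected title should play: on the SVDU, on the
    controller itself, or on both screens concurrently."""
    if target not in ("svdu", "controller", "both"):
        raise ValueError(f"unknown playback target: {target}")
    commands = []
    if target in ("svdu", "both"):
        # This command travels over the data network to the SVDU.
        commands.append({"device": "svdu", "cmd": "play", "title": selection})
    if target in ("controller", "both"):
        # This command is handled locally on the controller's display.
        commands.append({"device": "controller", "cmd": "play", "title": selection})
    return commands

assert len(dispatch_playback("movie-42", "both")) == 2
assert dispatch_playback("movie-42", "svdu")[0]["device"] == "svdu"
```

The "both" case corresponds to the concurrent display on separate screens described above: one command goes out over the network while the other drives the controller's own LCD.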
  • FIG. 5 is a block diagram of another entertainment system that includes seat electronics boxes 500 that can be distributed within aircraft to interconnect the head end content server 410 to the SVDUs 400 e-g and/or the passenger controllers 100 e-g. For example, the seat electronics box 500 can be configured to route media from the head end content server 410 to selected ones of the SVDUs 400 e-g and/or the passenger controllers 100 e-g, and to relay commands (e.g., media selection commands, media playback commands, etc.) from the SVDUs 400 e-g and/or the passenger controllers 100 e-g to the head end content server 410.
  • The passenger controllers 100 e-g are communicatively connected through wireless and/or wired communication links to different SVDUs 400 e-g. For example, the controller 100 e is connected by a wired communication cable (e.g., serial communication cable) to the SVDU 400 e which, in turn, is connected to the head end content server 410 through a wired communication cable connected to the seat electronics box 500. The controller 100 f is indirectly connected to the SVDU 400 f by a wired connection through the seat electronics box 500. The controller 100 g is indirectly connected to the SVDU 400 g through a wireless connection (e.g., Bluetooth, WLAN, etc.) through the seat electronics box 500. One or more of the SVDUs 400 may be connected to the seat electronics box 500 using one or more of the wireless communication interfaces 530 disclosed herein.
  • Each controller 100 in the IFE system may be assigned a unique network address (e.g., media access control (MAC) address, Ethernet address). In addition, each SVDU 400 may be assigned a unique network address (e.g., MAC address, Ethernet address) that is different from the respective controller 100 network addresses. In some embodiments, the controller 100 and the respective SVDU 400 may be coupled with a same seat-end electronics box 500 (when utilized by the system) that functions as a local network switch or node to provide network services to a group of passenger seats, for example a row of seats. In other embodiments, the controller 100 and the respective SVDU 400 may be coupled with different seat-end electronics boxes 500 (when utilized by the system). For example, a controller 100 for use by a passenger in an aircraft seat identified by a passenger readable identifier (e.g., a printed placard) as seat “14B” may be attached to a seat electronics box 500 that provides network connections to row “14”, while the SVDU 400 installed in the seat back in front of seat “14B” for use by the passenger in seat “14B” may be attached to a different seat electronics box 500 that provides network connections to row “13.”
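  • As a rough sketch of the addressing scheme just described (the device identifiers and seat box names below are hypothetical, not taken from the patent), a registry can associate each seat with two independently addressed devices that may hang off different seat electronics boxes:

```python
# Each tuple: (device_id, kind, seat_served, seat_electronics_box)
DEVICES = [
    ("ctrl-14b", "controller", "14B", "seb-row-14"),  # controller at seat 14B
    ("svdu-14b", "svdu",       "14B", "seb-row-13"),  # SVDU in the row-13 seat back
]

def devices_for_seat(seat: str) -> dict:
    """Map device kind -> (device_id, seat box) for one passenger seat."""
    return {kind: (dev, box) for dev, kind, served, box in DEVICES if served == seat}

print(devices_for_seat("14B"))
# → {'controller': ('ctrl-14b', 'seb-row-14'), 'svdu': ('svdu-14b', 'seb-row-13')}
```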
  • Example Passenger Controller
  • FIG. 6 is a functional circuit block diagram of a passenger controller 100 configured according to some embodiments of the present invention. The controller 100 includes a network interface 640, a display device 632, and a processing device 610. The network interface is configured to communicate with a video display unit, such as the SVDU 400 of FIGS. 4 and 5 via at least one data network through a wired or wireless interface.
  • The display device 632 may include the touch screen 220 and LCD 130 of FIGS. 1-3 to provide a touch sensitive display 632. The processing device 610 may be connected to a camera 638 to receive a stream of images of a passenger, and may be configured to identify gestures that a passenger creates based on an orientation and/or relative positioning of fingers of a hand, placement of hands and/or arms, and/or facial gestures created by the passenger's mouth and/or eyes.
  • The processing device 610 may include one or more data processors 612, such as a general purpose and/or special purpose processor (e.g., microprocessor and/or digital signal processor), and memory 614. The processor 612 is configured to execute computer program instructions from the memory 614, described below as a computer readable media, to perform at least some of the operations and methods described herein as being performed by a passenger controller in accordance with one or more embodiments of the present invention.
  • The processor 612 may execute a standard operating system for mobile devices, handheld computing devices, tablet computing devices, or personal digital assistants such as the ANDROID operating system. As such, the processor 612 may be compatible with and execute a wide variety of standard applications available for the operating system, independent of the IFE system and the respective proximately installed SVDU 400. In addition, standard off-the-shelf development platforms available for the standard operating system may facilitate rapid and straightforward application development by the aircraft operator for deployment on the controller 100. In this way, changes to the look and feel or additions to the functionality of the IFE system may be accomplished with reduced effort without requiring any changes to the hardware or recertification of the IFE system. Furthermore, the aircraft operator and/or users may download and install any of the thousands of applications available on the internet or the ANDROID MARKETPLACE for implementation, testing, integration, and use onboard the aircraft by the controllers 100 and the SVDUs 400.
  • Although the example controller 100 shown in FIGS. 1-3 may have a general form factor and touch screen functionality that is similar to a smart phone, it is not limited thereto. Other example embodiments of a controller 100 can include, but are not limited to, a tablet computer, a palmtop computer, and a laptop computer.
  • The controller 100 may further include audio output/input interfaces 634, 636, a physical control interface 630 (e.g., buttons 120 shown in FIG. 1), an accelerometer 640 that provides an output signal indicating movement and/or orientation of the controller 100, and a haptic feedback generator 650 that generates touch based signaling (e.g., vibration) to a passenger.
  • In one embodiment, the processing device 610 is configured to respond to user input by communicating commands to the associated SVDU 400 to control the display of content (e.g., movies/television programming, application programs, etc.) on the associated SVDU 400, and to control the display of information received from the head end content server 410 and/or received from the associated SVDU 400 on the display device 632 of the controller 100. As explained above, content can be concurrently displayed on the controller 100 and the associated SVDU 400.
  • The controller 100 and the associated SVDU 400 may communicate through a wired and/or wireless interface. With further reference to FIG. 6, the network interface 640 can include a near field transceiver 600 (e.g., RFID), a Bluetooth transceiver 602, a WLAN transceiver (e.g., WIFI) 604, a cellular transceiver 606, and/or an IR (infrared) transceiver 607 that is configured to communicate with a corresponding remote wireless transceiver associated with the SVDU 400. The network interface 640 may include other types of wireless transceivers, including Wireless USB, Ultra Wideband (UWB), and optical transceivers such as infrared. The network interface 640 may include a wireline network interface 608 (e.g., Ethernet, Universal Serial Bus (USB)) that is configured to communicate with the SVDU 400 via a wired data network.
  • In one embodiment, the controller 100 may be configured to use the near field transceiver 600 to receive payment information from a credit card, and to communicate off-plane (e.g., as described below regarding FIG. 10) to a credit card processing facility to process payment information for a financial transaction performed on the plane.
  • In various embodiments, the controller 100 may be constructed of appropriate materials to meet applicable aircraft industry standards and requirements. For example, a head impact criteria (HIC) test may be satisfied by the controller 100. The controller 100 may be ideally suited for extreme environments such as that of an aircraft. These extreme environments may include vibration, large temperature variations, and shock which may cause reliability problems with standard commercial grade entertainment system controllers.
  • The functionality of the controller 100 as a touch screen interface for controlling the IFE system can facilitate straightforward, inexpensive, and rapid customization and branding of the controller 100 onboard the aircraft. For example, the IFE system manufacturer and/or the aircraft operator (e.g., airline) may customize the passenger interface of the controller 100 and SVDU 400 through software and/or graphical modifications input to the controller 100, and therefore may not require replacement of hardware to provide customization.
  • Moreover, the touch sensitive display 632 of the controller 100 facilitates the passenger's browsing and selection of IFE system functions, applications, and content in an intuitive manner without requiring the passenger to switch focus between the controller 100 and the separate SVDU 400. Furthermore, the passenger may have the option to direct content to the associated SVDU 400 which may be mounted in a seat back in front of the passenger's seat, or direct the content to the touch sensitive display 632 of the controller 100 for more personal viewing, local games, and/or convenient interaction.
  • In this way, the controller 100 may provide the passenger with an effective dual-screen display in combination with the respective video display unit to facilitate multitasking such as watching a movie while ordering meals and beverages, shopping, checking or sending email, viewing a real-time updating map of the flight's progress, playing a local game, or enjoying other entertainment options.
  • Using a First Wireless Link to Pair a Controller and a SVDU Across a Wired Network
  • Some embodiments are directed to facilitating establishment of communication links between controllers and SVDUs. In one embodiment, the processing device 610 of a controller 100 uses a wireless communication link to identify a SVDU 400 with which it is to be associated. The processing device 610 then uses the SVDU identity to perform further communications through a wired network with the SVDU 400. The processing device 610 may therefore control the wireless transceiver 600-607 to establish a wireless communication link with the remote wireless transceiver to receive an identifier for the SVDU 400, and then control the wired network interface 608 to use the identifier to communicate a command through the wired data network to control the display of the first content on the SVDU 400.
  • In one embodiment, the controller 100 may be associated with a particular SVDU 400 by swiping the controller 100 across a RFID tag that is located on (or otherwise associated with) the SVDU 400 to cause transmission of an identifier of the SVDU 400 from the RFID tag to the controller 100, which can occur without performing pairing operations between the near field transceiver 600 and the RFID tag and without the associated steps required of an operator/passenger.
  • In a further embodiment, the wireless transceiver can include a near field transceiver 600 that is configured to communicate with a remote near field transceiver associated with the SVDU 400. The processing device 610 can control the near field transceiver 600 to establish the communication link with the remote near field transceiver to receive the identifier for the SVDU 400 without performing pairing of the near field transceiver 600 and the remote near field transceiver. The processing device 610 can then use the identifier as a network address for the SVDU 400, or can determine a network address for the SVDU 400 using the identifier, to enable communication of commands through the wired network to control the display of content on the associated SVDU 400.
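  • A minimal sketch of this identifier-to-address flow, assuming a hypothetical directory (the directory contents, function names, and address format are illustrative, not from the patent):

```python
# Hypothetical directory mapping SVDU identifiers read from an RFID tag
# to network addresses reachable over the wired cabin network.
SVDU_DIRECTORY = {"svdu-14b": "10.0.14.2"}

def resolve_svdu_address(identifier: str) -> str:
    """Use the identifier directly as a network address when it already
    looks like one; otherwise look it up in the directory."""
    if identifier.count(".") == 3:  # crude IPv4 shape check
        return identifier
    return SVDU_DIRECTORY[identifier]

def send_display_command(identifier: str, command: str) -> str:
    """Address a display-control command to the SVDU over the wired network.
    A real controller would transmit via its wireline interface 608."""
    address = resolve_svdu_address(identifier)
    return f"to={address} cmd={command}"

print(send_display_command("svdu-14b", "play movie-42"))
# → to=10.0.14.2 cmd=play movie-42
```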
  • Using a First Wireless Link to Pair a Controller and SVDU across a Second Wireless Link
  • In another embodiment, the controller 100 uses the near field transceiver 600 as described above to receive an identifier for a particular SVDU 400 (e.g. by swiping the near field transceiver 600 by a RFID tag associated with the SVDU 400) without performing pairing of the first wireless transceiver and the remote first wireless transceiver. The processing device 610 then controls a selected other one of the transceivers, such as the Bluetooth transceiver 602, the WLAN transceiver 604, and/or the cellular transceiver 606 to use the identifier to perform pairing to a corresponding remote transceiver (i.e., Bluetooth transceiver, WLAN transceiver, and/or cellular transceiver) of the SVDU 400 to establish a second communication link with the remote transceiver. The processing device 610 communicates commands through the selected transceiver to control the display of content of the SVDU 400, to control the delivery of content from the SVDU 400 to the controller 100, and/or to control the delivery of content from the head end content server 410 to the controller 100 and/or to the SVDU 400.
  • In a further embodiment, the processing device 610 controls the Bluetooth transceiver 602 to use the identifier to perform pairing to a remote Bluetooth transceiver of the SVDU 400 to establish a Bluetooth communication link, and to then control the SVDU 400 and/or the head end content server 410 through the Bluetooth communication link.
  • In another further embodiment, the processing device 610 controls the WLAN transceiver 604 to use the identifier to perform pairing to a WLAN transceiver of the wireless router 430, and establish a communication link through the wireless router 430 to the SVDU 400. The processing device 610 can then control the SVDU 400 and/or the head end content server 410 through the WLAN communication link.
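  • The out-of-band association step described above can be sketched as follows, with the transceiver names and availability map being illustrative assumptions: the identifier learned over the near field link seeds pairing of whichever second wireless link is usable.

```python
def pair_second_link(identifier: str, transceivers: dict) -> str:
    """Return the first transceiver that can pair with the SVDU whose
    identity was learned over the near field link."""
    for name, can_pair in transceivers.items():
        if can_pair(identifier):
            return name
    raise RuntimeError(f"no transceiver could pair with {identifier}")

# Assumed availability: Bluetooth can pair with this SVDU; cellular cannot.
transceivers = {
    "bluetooth": lambda ident: True,
    "cellular": lambda ident: False,
}
print(pair_second_link("svdu-14b", transceivers))
# → bluetooth
```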
  • Further Operations for Pairing a Controller and SVDU
  • Example operations and data flows that may be carried out between a controller 100 and a SVDU 400 are shown in FIGS. 8 and 9. Referring to a first embodiment shown in FIG. 8, an operator (e.g., maintenance person) or passenger can move (block 800) the controller 100 within a communication range of a near field transceiver associated with the SVDU 400. For example, the controller 100 can be swiped past a RFID tag associated with the SVDU 400 and/or can be moved within range of a Bluetooth transceiver and/or another communication transceiver of the SVDU 400. The near field transceiver of the SVDU 400 can transmit (block 802) the identifier for the SVDU 400 for receipt (block 804) by the near field transceiver of the controller 100. The controller 100 can then establish (block 806) a direct and/or indirect communication pathway through at least one packet network (e.g., wired and/or wireless network) to the SVDU 400 and/or to the head end content server 410. The SVDU 400 and/or the head end content server 410 can perform operations to assist (blocks 808 and 810) with establishing the communication pathway to the controller 100.
  • The controller 100 can then communicate (block 812) control commands to the SVDU 400 and/or to the head end content server 410 to control the display of content on the SVDU 400 and/or to control the delivery of content from the SVDU 400 and/or the head end content server 410 to the controller 100. The SVDU 400 can therefore respond to commands from the controller 100 by communicating (block 814) content (e.g., a selected movie, television program, and/or application program) to the controller 100. The head end content server 410 can similarly respond to commands originating from the controller 100 by communicating (block 816) content (e.g., a selected movie, television program, and/or application program) to the controller 100 and/or to the SVDU 400.
  • Using Dissimilar Data Rate Wireless Links for Control and Content Communications
  • FIG. 9 illustrates operations and methods according to a further embodiment that is directed to using RFID communications (one type of near field communications) to communicate the identifier for the SVDU 400, and then using the identifier to establish a lower transmission data rate wireless communication link (e.g., Bluetooth) and a separate higher transmission data rate wireless communication link (e.g., WLAN) between the controller 100 and the SVDU 400. The lower transmission data rate wireless communication link can be used to communicate control commands, and may be exclusively used to communicate control commands. The higher transmission data rate wireless communication link can be used to communicate content (e.g., movie, television program, and/or application program), and may be exclusively used to communicate content.
  • Referring to FIG. 9, the controller 100 is moved (block 900) within a communication range of a RFID transceiver associated with the SVDU 400. The RFID transceiver of the SVDU 400 transmits (block 902) the identifier for receipt (block 904) by the RFID transceiver of the controller 100. The controller 100 performs Bluetooth pairing (blocks 906 and 908) with a Bluetooth transceiver of the SVDU 400 using the identifier to establish a Bluetooth network connection (block 906). The controller 100 also establishes (blocks 910 and 912) a WLAN network connection to a WLAN transceiver of the SVDU 400 using the identifier.
  • In a further embodiment, the controller 100 communicates (block 914) control commands to the SVDU 400 through the Bluetooth network connection to control the display of content (e.g., a selected movie, television program, and/or application program) on the SVDU 400 and/or to control the delivery of content from the SVDU 400 and/or the head end content server 410 to the controller 100. The SVDU 400 can communicate (block 916) content to the controller 100 through the WLAN network connection responsive to the control commands.
  • For example, a passenger can operate the controller 100 to generate a command that is used to select among a list of movies displayed on a display of the SVDU 400. The controller 100 can transmit the command through the Bluetooth network connection to cause the SVDU 400 to select a movie that is to be played on the display of the SVDU 400. The passenger may alternatively or additionally indicate by the generated command that the selected movie is to be streamed through the WLAN network connection for display on the display 632 of the controller 100. Other example commands that the passenger can generate from the controller 100 to control operation of the SVDU 400 can include gaming control feedback that is provided to a game program executing on the SVDU 400 and/or to cause video output to be played on the SVDU 400, streamed to the controller 100 for playing, or played on both the SVDU 400 and the controller 100. Other commands can include starting, pausing, and/or stopping playback of a movie and/or television program that is selectively (responsive to the command(s)) played on the SVDU 400, streamed to the controller 100 for playing, or played on both the SVDU 400 and the controller 100.
  • The passenger can similarly use the controller 100 to generate commands that cause different content to be displayed on the SVDU 400 and the controller 100. For example, a passenger may operate the controller 100 to play a movie on the SVDU 400 and cause a game program to be downloaded from the SVDU 400 through the WLAN network connection for execution on the controller 100, so that playing of the movie and execution of the game program are occurring concurrently.
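  • A minimal sketch of this dissimilar-data-rate split (the `Link` and `DualLinkSession` classes are stand-ins invented for illustration): control commands travel only on the low-rate link, content only on the high-rate link.

```python
class Link:
    """Stand-in for a wireless transceiver; records what it carried."""
    def __init__(self, name: str):
        self.name = name
        self.sent = []
    def send(self, payload):
        self.sent.append(payload)

class DualLinkSession:
    def __init__(self, control_link: Link, content_link: Link):
        self.control = control_link  # e.g., Bluetooth: control commands only
        self.content = content_link  # e.g., WLAN: streamed content only
    def send_command(self, command: str):
        self.control.send(command)
    def stream_content(self, chunk: bytes):
        self.content.send(chunk)

bt, wlan = Link("bluetooth"), Link("wlan")
session = DualLinkSession(bt, wlan)
session.send_command("select movie-42")
session.stream_content(b"video-bytes")
print(bt.sent, wlan.sent)
# → ['select movie-42'] [b'video-bytes']
```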
  • Repair and Reconfiguration of the Controller
  • By partitioning the controller 100 into multiple subassemblies such as the front cover assembly 310 and the rear cover assembly 320, a broad range of repair and replacement strategies may be employed to reduce the life-cycle costs of the controller 100 when deployed in an aircraft environment. One such strategy includes facilitating repair of a defective or malfunctioning controller 100 “on-wing,” at an airport terminal, or at an operator's or airline's local facilities. Replacement modules may be stored on-board the aircraft or at the operator's facilities at an airport for rapid replacement of defective modules onboard the aircraft.
  • A removable memory media may be swapped out to change content and/or functionality available to users via the controller 100. This swap may be conveniently performed during an on-wing maintenance operation. When a large amount of airline-specific content and/or software functionality is incorporated into the controller 100, loading the content and/or software functionality via a data communications interface can be very time consuming and therefore a limiting factor in the amount of content and/or software functionality that may be loaded at a given time. By having the content and/or software functionality stored on removable memory media, the content and/or software functionality of the controller 100 may be updated simply and quickly by swapping a new removable memory media into the controller 100 in place of an existing removable memory media. In addition, the removable memory media may be removed from the controller 100 when the controller 100 fails and needs to be replaced, and then re-installed or replaced into the replacement controller 100 during an on-wing repair operation.
  • Referring again to FIG. 6, the controller 100 can include a removable memory interface 620 that is configured to receive a removable memory media 622 and to store and retrieve data on the removable memory media 622. The media 622 includes non-volatile memory, such as flash memory or magnetic memory on a Solid State Disk (SSD), Secure Digital Card (SD Card), or disk drive. The processing device 610 is configured to execute programs residing on the media 622 (e.g., operation system program, game program, application program, etc.) to provide information to a passenger and/or to control interactions (e.g., operation of a graphical user interface) between the passenger and the controller 100. The media 622 can include content, such as movies, television programs, music, and audio programs.
  • Example Network Node
  • FIG. 7 is a functional circuit block diagram of a network node 700 configured according to some embodiments of the present invention, and elements of which may be included in the SVDU 400, the head end content server 410, a network address translation router 1000 (explained below), or other components of an entertainment system. The network node 700 includes a network interface that may include a wireline network interface 710, a near field transceiver 702 (e.g., a RFID transceiver), a Bluetooth transceiver 704, a WLAN transceiver 706, a cellular transceiver 708, and/or an IR transceiver 709 that are configured to communicate with the corresponding network interfaces and/or transceivers described above with regard to the controller 100.
  • The processing device 712 may include one or more data processors 714, such as a general purpose and/or special purpose processor (e.g., microprocessor and/or digital signal processor), and memory 716. The processor 714 is configured to execute computer program instructions from the memory 716, described below as a computer readable media, to perform at least some of the operations and methods described herein as being performed by the SVDU 400, the head end content server 410, the router 1000, and/or other system components in accordance with one or more embodiments of the present invention.
  • The network node 700 may include audio output/input interfaces 724, 726, a physical control interface 720 (e.g., keyboard/keypad/buttons/switches), and a touch sensitive display 722.
  • Routing Phone Calls and/or Text messages Between Passenger Seats
  • Some embodiments are directed to routing phone calls and/or text messages between passenger seats. A passenger may call another passenger by entering or selecting a seat identifier for the other passenger using the passenger controller 100 and/or the associated SVDU 400, and can use a headset 1010 to talk to the other passenger. The passenger may perform a video conference with the other passenger using the camera 638 to send video to the other passenger and by receiving video from a controller 100 and/or SVDU 400 operated by the other passenger. A passenger may also call off-plane through a satellite/cellular transceiver 1020. A passenger may additionally or alternatively send a text message to the other passenger by entering or selecting a seat identifier for the other passenger and typing the text message using an interface of the controller 100 and/or the SVDU 400 (e.g., a virtual keyboard displayed on the display 632 of the controller 100 and/or a virtual keyboard displayed on the display 722 of the SVDU 400).
  • In particular, the system enables a passenger to enter or otherwise select a passenger readable identifier (e.g., a printed placard) for a seat that is to receive the phone call and/or the text message (destination passenger seat), and the system translates that identifier into a network address that is used to route the phone call and/or text message to the destination passenger seat.
  • These and other embodiments are described below with regard to the example entertainment system shown in FIG. 10. The system includes a plurality of passenger controllers 100 that are each associated with a different one of the passenger seats. A communication network 420 communicatively interconnects the controllers 100. Each of the controllers 100 is assigned a unique network address that is used for routing information thereto through the communication network 420. The controllers 100 may be connected to the network 420 through associated SVDUs 400 as explained above, and the SVDUs 400 can also be assigned unique network addresses.
  • A network address translation router 1000 is configured to route a communication packet from an originating one of the passenger controllers 100 through the communication network 420 to a destination one of the passenger controllers 100 by translating between a passenger readable identifier for the destination passenger seat, which is entered or selected by a passenger (e.g., a passenger in seat “2A” entering “30C” to indicate row 30 seat C), and the network address of the passenger controller 100 associated with the destination passenger seat.
  • The network address translation router 1000 may maintain a routing table that programmatically associates a passenger readable identifier for each of the passenger seats with the network address of the passenger controller 100 associated with the passenger seat. The router 1000 can be configured to receive a phone call and/or a text message via the communication network 420 from the originating passenger controller 100. The phone call and/or the text message can contain the passenger readable identifier for the destination passenger seat. The router 1000 can respond by querying the routing table using the passenger readable identifier to identify the associated network address for the passenger controller 100 associated with the destination passenger seat. The router 1000 then routes the phone call and/or the text message through the communication network 420 to the network address for the passenger controller 100 associated with the destination passenger seat.
  • In a further embodiment, the network address translation router 1000 receives an originating network address for the originating passenger controller 100 of the phone call and/or the text message, and queries the routing table using the originating network address to identify the passenger readable identifier for the passenger seat associated with the originating passenger controller 100. The router 1000 then routes the phone call and/or the text message, with information identifying the passenger readable identifier for the passenger seat associated with the originating passenger controller 100, to the network address for the passenger controller 100 associated with the destination passenger seat.
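  • The routing table translation can be sketched as follows; the seat identifiers, MAC-style addresses, and function names are hypothetical illustrations of the behavior described above.

```python
# Hypothetical routing table: passenger readable seat identifier -> network
# address of the passenger controller associated with that seat.
ROUTING_TABLE = {
    "2A": "02:00:00:00:02:0a",
    "30C": "02:00:00:00:1e:0c",
}
# Reverse mapping, used to identify the originating seat for display.
ADDRESS_TO_SEAT = {addr: seat for seat, addr in ROUTING_TABLE.items()}

def route_message(origin_address: str, destination_seat: str, body: str) -> dict:
    """Translate the destination seat to a network address, and tag the
    message with the originating seat identifier for display."""
    return {
        "to": ROUTING_TABLE[destination_seat],
        "from_seat": ADDRESS_TO_SEAT[origin_address],
        "body": body,
    }

msg = route_message("02:00:00:00:02:0a", "30C", "hello from 2A")
print(msg["to"], msg["from_seat"])
# → 02:00:00:00:1e:0c 2A
```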
  • In a further embodiment, each of the passenger controllers 100 is configured to receive a phone call and/or a text message having information identifying a passenger readable identifier for a passenger seat associated with another passenger controller 100 that originated the phone call and/or the text message, and to display the passenger readable identifier for the received phone call and/or text message on the display device 632.
  • Further Definitions:
  • For the purposes of promoting an understanding of the principles of the invention, reference has been made to the embodiments illustrated in the drawings, and specific language has been used to describe these embodiments. However, no limitation of the scope of the invention is intended by this specific language, and the invention should be construed to encompass all embodiments that would normally occur to one of ordinary skill in the art. The terminology used herein is for the purpose of describing the particular embodiments and is not intended to be limiting of exemplary embodiments of the invention.
  • Numerous modifications and adaptations will be readily apparent to those of ordinary skill in this art without departing from the spirit and scope of the invention as defined by the following claims. Therefore, the scope of the invention is defined not by the detailed description of the invention but by the following claims, and all differences within the scope will be construed as being included in the invention.
  • For the sake of brevity, conventional electronics, systems, and software functional aspects of the systems (and components of the individual operating components of the systems) may not be described in detail. Furthermore, the connecting lines, or connectors shown in the various figures presented are intended to represent example communication/functional relationships and/or physical or logical couplings between the various elements. It should be noted that many alternative or additional functional relationships, physical connections or logical connections may be present in a practical device. The words “mechanism” and “element” are used broadly and are not limited to mechanical or physical embodiments, but may include software routines in conjunction with processors, etc.
  • As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, nodes, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, nodes, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless the context unambiguously indicates otherwise. In addition, it should be understood that although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms, which are only used to distinguish one element from another. The term “and/or”, abbreviated “/”, includes any and all combinations of one or more of the associated listed items.
  • When a node is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another node, it can be directly connected, coupled, or responsive to the other node or intervening nodes may be present. In contrast, when a node is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another node, there are no intervening nodes present. Like numbers refer to like nodes throughout.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
  • These computer program instructions may also be stored in a tangible computer-readable media that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable media produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks.
  • A tangible, non-transitory computer-readable media may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable media would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/Blu-ray).
  • The computer program instructions may also be loaded onto a computer and/or other programmable data processing device to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
  • It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
  • Many different embodiments have been disclosed herein, in connection with the above description and the drawings. It will be understood that it would be unduly repetitious and obfuscating to literally describe and illustrate every combination and subcombination of these embodiments. Accordingly, the present specification, including the drawings, shall be construed to constitute a complete written description of various example combinations and subcombinations of embodiments and of the manner and process of making and using them, and shall support claims to any such combination or subcombination.
  • Other controllers, SVDUs, systems, and/or methods according to embodiments of the invention will be or become apparent to one with skill in the art upon review of the present drawings and description. It is intended that all such additional controllers, SVDUs, systems, and/or methods be included within this description, be within the scope of the present invention, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.

Claims (20)

  1. A controller for controlling an entertainment system that includes a video display unit that is separate from the controller, the controller comprising:
    a network interface for communicating with the video display unit via at least one data network;
    a display device; and
    a processing device that is configured to communicate a first command over the at least one data network to control a display of first content on the video display unit, and to control a display of second content on the display device of the controller, wherein the second content is displayed concurrently with the first content.
  2. The controller of claim 1, wherein:
    the network interface comprises:
    a wireless transceiver that is configured to communicate with a remote wireless transceiver associated with the video display unit; and
    a wired network interface that is configured to communicate with the video display unit via a wired data network; and
    the processing device is configured to:
    control the wireless transceiver to establish a wireless communication link with the remote wireless transceiver to receive an identifier for the video display unit; and
    control the wired network interface to use the identifier to communicate the first command through the wired data network to control the display of the first content on the video display unit.
  3. The controller of claim 2, wherein:
    the wireless transceiver comprises a near field transceiver that is configured to communicate with a remote near field transceiver associated with the video display unit; and
    the processing device is configured to control the near field transceiver to establish the communication link with the remote near field transceiver to receive the identifier for the video display unit without performing pairing of the near field transceiver and the remote near field transceiver.
  4. The controller of claim 2, wherein:
    the processing device is configured to receive a network address for the video display unit via the wireless communication link, and to use the network address to communicate the first command through the wired data network to control the display of the first content on the video display unit.
  5. The controller of claim 1, wherein:
    the network interface comprises:
    a first wireless transceiver that is configured to communicate with a remote first wireless transceiver associated with the video display unit; and
    a second wireless transceiver that is configured to communicate with a remote second wireless transceiver associated with the video display unit; and
    the processing device is configured to:
    control the first wireless transceiver to establish a first communication link with the remote first wireless transceiver and receive an identifier for the video display unit without performing pairing of the first wireless transceiver and the remote first wireless transceiver; and
    control the second wireless transceiver to use the identifier to perform pairing to the remote second wireless transceiver to establish a second communication link with the remote second wireless transceiver and to communicate the first command through the second wireless transceiver to control the display of the first content on the video display unit.
  6. The controller of claim 5, wherein:
    the first wireless transceiver comprises a near field communication transceiver that is configured to communicate with a remote near field transceiver associated with the video display unit; and
    the processing device is configured to control the near field transceiver to establish the first communication link with the remote near field transceiver to receive the identifier for the video display unit without performing pairing of the near field transceiver and the remote near field transceiver.
  7. The controller of claim 6, wherein:
    the second wireless transceiver comprises a Bluetooth transceiver that is configured to communicate with a remote Bluetooth transceiver associated with the video display unit; and
    the processing device is configured to:
    control the second wireless transceiver to use the identifier to perform pairing between the Bluetooth transceiver and the remote Bluetooth transceiver to establish a Bluetooth communication link; and
    communicate the first command through the Bluetooth communication link to control the display of the first content on the video display unit.
  8. The controller of claim 6, wherein:
    the second wireless transceiver comprises a wireless local area network transceiver that is configured to communicate with a wireless local area network router that has a communication link to a remote wireless local area network transceiver associated with the video display unit; and
    the processing device is configured to:
    control the wireless local area network transceiver to use the identifier to perform pairing to the wireless local area network router to establish the second communication link through the wireless local area network router to the remote wireless local area network transceiver; and
    communicate the first command through the second communication link to control the display of the first content on the video display unit.
  9. The controller of claim 1, wherein:
    the network interface comprises:
    a first wireless transceiver that is configured to communicate with a remote first wireless transceiver associated with the video display unit; and
    a second wireless transceiver that is configured to communicate with a remote second wireless transceiver associated with the video display unit, wherein the first wireless transceiver transmits at a lower data communication rate than the second wireless transceiver; and
    the processing device is configured to:
    communicate commands received from a user through the first wireless transceiver to control the video display unit; and
    receive the second content through the second wireless transceiver for display on the display device of the controller.
  10. The controller of claim 9, wherein:
    the first wireless transceiver comprises a Bluetooth transceiver that is configured to communicate with a remote Bluetooth transceiver associated with the video display unit; and
    the second wireless transceiver comprises a wireless local area network transceiver that is configured to communicate with a remote wireless local area network transceiver associated with the video display unit.
  11. The controller of claim 9, wherein the processing device is configured to:
    communicate the first command through the first wireless transceiver to control selection of a video stream; and
    receive the selected video stream through the second wireless transceiver for display on the display device of the controller.
  12. The controller of claim 9, wherein:
    the processing device is configured to communicate the first command through the first wireless transceiver to control starting, pausing, and/or stopping of streaming of the selected video stream received through the second wireless transceiver of the controller.
  13. The controller of claim 9, wherein:
    the processing device is configured to communicate commands through the first wireless transceiver to control operation of a game program that is being executed by a remote processing device of the video display unit.
  14. The controller of claim 13, wherein:
    the processing device is configured to communicate commands through the first wireless transceiver to cause first video content to be output from the game program and displayed by the video display unit and to further cause second video content to be output from the game program and transmitted from the video display unit and received through the second wireless transceiver for display on the display device of the controller.
  15. The controller of claim 1,
    further comprising a removable memory interface that is configured to receive a removable memory media and to store and retrieve data on the removable memory media,
    wherein the processing device is configured to control the display of the second content on the display device responsive to program code that is retrieved from the removable memory media.
  16. The controller of claim 15, wherein:
    the processing device is configured to execute at least one application program that is stored on the removable memory media to control interaction between the controller and a user.
  17. An entertainment system for a passenger vehicle that includes a plurality of passenger seats, the entertainment system comprising:
    a plurality of passenger controllers that are each associated with a different one of the passenger seats, wherein each of the passenger controllers includes a passenger interface for receiving and outputting information;
    a communication network that communicatively interconnects the passenger controllers, wherein each of the passenger controllers is assigned a network address that is used for routing information thereto through the communication network; and
    a network address translation router that is configured to:
    route a communication packet from an originating one of the passenger controllers through the communication network to a destination one of the passenger controllers by translating between a passenger readable identifier for the destination passenger seat and the network address of the passenger controller associated with the destination passenger seat.
  18. The entertainment system of claim 17, wherein:
    the network address translation router is further configured to:
    maintain a routing table that programmatically associates a passenger readable identifier for each of the passenger seats with the network address of the passenger controller associated with the passenger seat;
    receive a phone call and/or a text message via the communication network from the originating passenger controller, wherein the phone call and/or the text message contains the passenger readable identifier for the destination passenger seat;
    query the routing table using the passenger readable identifier to identify the associated network address for the passenger controller associated with the destination passenger seat; and
    route the phone call and/or the text message through the communication network to the network address for the passenger controller associated with the destination passenger seat.
  19. The entertainment system of claim 18, wherein:
    the network address translation router is further configured to:
    receive an originating network address for the originating passenger controller of the phone call and/or the text message;
    query the routing table using the originating network address to identify the passenger readable identifier for the passenger seat associated with the originating passenger controller; and
    route the phone call and/or the text message, with information identifying the passenger readable identifier for the passenger seat associated with the originating passenger controller, to the network address for the passenger controller associated with the destination passenger seat.
  20. The entertainment system of claim 19, wherein:
    each of the passenger controllers includes a display device and a processing device, and each of the processing devices is configured to:
    receive a phone call and/or a text message having information identifying a passenger readable identifier for a passenger seat associated with another passenger controller that originated the phone call and/or the text message; and
    display the passenger readable identifier for the received phone call and/or text message on the display device.
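The seat-identifier translation recited in claims 17–20 can be illustrated with a brief sketch: a routing table maps passenger-readable seat labels to controller network addresses in both directions, so a message addressed to a seat label is delivered to the right controller and carries the originating seat label for display. This is a minimal illustration only, assuming hypothetical names (`SeatRouter`, `register`, `route_message`) and invented seat labels and addresses; the claims do not specify any particular API or data structure.

```python
# Sketch of the seat-to-address translation described in claims 17-20.
# All class/method names, seat labels, and addresses are illustrative.

class SeatRouter:
    """Translates passenger-readable seat identifiers (e.g. "12A") to
    network addresses of passenger controllers, and back."""

    def __init__(self):
        self._seat_to_addr = {}   # e.g. "12A" -> "10.0.3.21"
        self._addr_to_seat = {}   # reverse mapping, for caller identification

    def register(self, seat_id, network_address):
        # Maintain the routing table that associates each passenger seat
        # with the network address of its controller (claim 18).
        self._seat_to_addr[seat_id] = network_address
        self._addr_to_seat[network_address] = seat_id

    def route_message(self, origin_address, dest_seat_id, payload):
        # Claim 18: look up the destination controller's network address
        # from the passenger-readable seat identifier in the message.
        dest_address = self._seat_to_addr[dest_seat_id]
        # Claim 19: attach the originating seat identifier so the receiving
        # controller can display who the message came from (claim 20).
        origin_seat = self._addr_to_seat.get(origin_address, "unknown")
        return {"to": dest_address, "from_seat": origin_seat, "payload": payload}


router = SeatRouter()
router.register("12A", "10.0.3.21")
router.register("34C", "10.0.5.77")
msg = router.route_message("10.0.3.21", "34C", "Hello from 12A")
print(msg["to"], msg["from_seat"])  # prints: 10.0.5.77 12A
```

The point of the bidirectional table is that passengers never see or enter network addresses: both the destination lookup and the caller-ID annotation happen inside the router, as the claims describe.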
US13328224 2010-12-29 2011-12-16 Controlling display of content on networked passenger controllers and video display units Active 2032-02-21 US9060202B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201061427871 2010-12-29 2010-12-29
US13328224 US9060202B2 (en) 2010-12-29 2011-12-16 Controlling display of content on networked passenger controllers and video display units

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13328224 US9060202B2 (en) 2010-12-29 2011-12-16 Controlling display of content on networked passenger controllers and video display units
US14028185 US9584846B2 (en) 2011-12-16 2013-09-16 In-flight entertainment system with wireless handheld controller and cradle having controlled locking and status reporting to crew

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14028185 Continuation-In-Part US9584846B2 (en) 2010-12-29 2013-09-16 In-flight entertainment system with wireless handheld controller and cradle having controlled locking and status reporting to crew

Publications (2)

Publication Number Publication Date
US20120174165A1 (en) 2012-07-05
US9060202B2 US9060202B2 (en) 2015-06-16

Family

ID=46382024

Family Applications (1)

Application Number Title Priority Date Filing Date
US13328224 Active 2032-02-21 US9060202B2 (en) 2010-12-29 2011-12-16 Controlling display of content on networked passenger controllers and video display units

Country Status (4)

Country Link
US (1) US9060202B2 (en)
EP (1) EP2659385A4 (en)
CN (1) CN103502967B (en)
WO (1) WO2012091961A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120005495A1 (en) * 2008-09-26 2012-01-05 Yoshimichi Matsuoka Portable power supply device with outlet connector
US20130205212A1 (en) * 2012-02-07 2013-08-08 Nishith Kumar Sinha Method and system for a universal remote control
US8525830B2 (en) 2010-09-17 2013-09-03 The Boeing Company Point cloud generation system
US8556162B2 (en) * 2011-11-21 2013-10-15 The Boeing Company Component programming system
US20130314605A1 (en) * 2012-05-23 2013-11-28 Takashi Minemura Content transmitter and content transmission method
US20140094143A1 (en) * 2011-06-28 2014-04-03 The Boeing Company Passenger mobile station registration with a passenger communications system using near field communicaitons
DE102013203226A1 (en) 2013-02-27 2014-08-28 Airbus Operations Gmbh Method and system for seat allocation personal electronic devices in an aircraft
US8850045B2 (en) 2008-09-26 2014-09-30 Qualcomm Incorporated System and method for linking and sharing resources amongst devices
WO2014209135A1 (en) * 2013-06-24 2014-12-31 Nigel Greig In-flight entertainment control unit
WO2015044617A3 (en) * 2013-09-30 2015-06-04 Zodiac Aerotechnics System for managing the comfort equipment of an aircraft cabin, aircraft cabin and method for identifying and locating the comfort equipment
CN104777897A (en) * 2013-12-29 2015-07-15 伊默森公司 Distributed control architecture for haptic devices
WO2015154043A2 (en) 2014-04-04 2015-10-08 Systems And Software Enterprises, Llc Mobile device in-flight entertainment connection
USD741795S1 (en) 2013-10-25 2015-10-27 Milwaukee Electric Tool Corporation Radio charger
WO2016009348A1 (en) * 2014-07-18 2016-01-21 De Cori Riccardo Video player for remotely viewing media
WO2017004328A1 (en) * 2015-06-30 2017-01-05 Bae Systems Controls Inc. Vehicle display
US9614800B1 (en) * 2014-01-17 2017-04-04 Rockwell Collins, Inc. Threaded datalink message display
US9781496B2 (en) 2012-10-25 2017-10-03 Milwaukee Electric Tool Corporation Worksite audio device with wireless interface
EP3290949A1 (en) * 2016-08-29 2018-03-07 The Boeing Company Local positioning with communication tags

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
KR20150045439A (en) * 2012-07-13 2015-04-28 싱가포르 에어라인스 리미티드 A method and device for controlling a display device

Citations (14)

Publication number Priority date Publication date Assignee Title
US6928654B2 (en) * 2000-10-27 2005-08-09 Audiovox Corporation Vehicle display device for simultaneously displaying one or more video programs on separate displays
US20060143662A1 (en) * 2004-12-28 2006-06-29 Livetv, Llc Aircraft in-flight entertainment system with a distributed memory and associated methods
US20060238301A1 (en) * 2005-02-22 2006-10-26 Jiangfeng Wu Multi-protocol radio frequency identification transponder tranceiver
US20070124765A1 (en) * 2005-11-30 2007-05-31 Bennett James D Parallel television remote control
US20070157285A1 (en) * 2006-01-03 2007-07-05 The Navvo Group Llc Distribution of multimedia content
US20070250872A1 (en) * 2006-03-21 2007-10-25 Robin Dua Pod module and method thereof
US20070288969A1 (en) * 2006-05-23 2007-12-13 Mga Entertainment, Inc. Interactive game system using game data encoded within a video signal
US20080057868A1 (en) * 2006-09-05 2008-03-06 Samsung Electronics Co.; Ltd Auto pairing system and method for bluetooth network
US20080253317A1 (en) * 2006-10-11 2008-10-16 Anil Gercekci Wireless Networks for Vehicles
US7840991B2 (en) * 2003-08-11 2010-11-23 Thomas Dusenberry In-theatre interactive entertainment system
US20110086631A1 (en) * 2009-10-13 2011-04-14 Samsung Electronics Co., Ltd. Method for controlling portable device, display device, and video system
US20110314507A1 (en) * 2010-06-22 2011-12-22 Livetv Llc Registration of a personal electronic device (ped) with an aircraft ife system using aircraft generated registration token images and associated methods
US20120066722A1 (en) * 2010-09-14 2012-03-15 At&T Intellectual Property I, L.P. Enhanced Video Sharing
US8677423B2 (en) * 2000-12-28 2014-03-18 At&T Intellectual Property I, L. P. Digital residential entertainment system

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
US6131119A (en) * 1997-04-01 2000-10-10 Sony Corporation Automatic configuration system for mapping node addresses within a bus structure to their physical location
US6147980A (en) * 1997-11-28 2000-11-14 Motorola, Inc. Avionics satellite based data message routing and delivery system
US6499027B1 (en) * 1998-05-26 2002-12-24 Rockwell Collins, Inc. System software architecture for a passenger entertainment system, method and article of manufacture
EP1723537A4 (en) 2004-02-17 2010-07-07 Thales Avionics Inc A system and method utilizing internet protocol (ip) sequencing to identify components of a passenger flight information system (pfis)
US7945934B2 (en) * 2004-06-15 2011-05-17 Panasonic Avionics Corporation Portable media device and method for presenting viewing content during travel
CN106453260A (en) * 2006-10-02 2017-02-22 赛乐得公司 Method and system for improving client server transmission over fading channel
EP1793592A3 (en) * 2005-11-30 2008-02-27 Broadcom Corporation Parallel television remote control
US20070135046A1 (en) 2005-12-14 2007-06-14 Ash Kapur Method and system for bluetooth® common signaling for non-bluetooth® data channels
US7809333B2 (en) 2006-09-29 2010-10-05 Broadcom Corporation System and method for streaming identical data over several short range links
US20100060739A1 (en) 2008-09-08 2010-03-11 Thales Avionics, Inc. System and method for providing a live mapping display in a vehicle
US20100297941A1 (en) 2008-11-25 2010-11-25 Unify4Life Corporation Remote control system and method employing cellular telephones which include short range radio transceivers
US9016627B2 (en) 2009-10-02 2015-04-28 Panasonic Avionics Corporation System and method for providing an integrated user interface system at a seat


Cited By (22)

Publication number Priority date Publication date Assignee Title
US8850045B2 (en) 2008-09-26 2014-09-30 Qualcomm Incorporated System and method for linking and sharing resources amongst devices
US20120005495A1 (en) * 2008-09-26 2012-01-05 Yoshimichi Matsuoka Portable power supply device with outlet connector
US8868939B2 (en) * 2008-09-26 2014-10-21 Qualcomm Incorporated Portable power supply device with outlet connector
US8525830B2 (en) 2010-09-17 2013-09-03 The Boeing Company Point cloud generation system
US20140094143A1 (en) * 2011-06-28 2014-04-03 The Boeing Company Passenger mobile station registration with a passenger communications system using near field communicaitons
US8556162B2 (en) * 2011-11-21 2013-10-15 The Boeing Company Component programming system
US20130205212A1 (en) * 2012-02-07 2013-08-08 Nishith Kumar Sinha Method and system for a universal remote control
US9210467B2 (en) * 2012-02-07 2015-12-08 Turner Broadcasting System, Inc. Method and system for a universal remote control
US20130314605A1 (en) * 2012-05-23 2013-11-28 Takashi Minemura Content transmitter and content transmission method
US9781496B2 (en) 2012-10-25 2017-10-03 Milwaukee Electric Tool Corporation Worksite audio device with wireless interface
DE102013203226A1 (en) 2013-02-27 2014-08-28 Airbus Operations Gmbh Method and system for seat allocation personal electronic devices in an aircraft
WO2014209135A1 (en) * 2013-06-24 2014-12-31 Nigel Greig In-flight entertainment control unit
WO2015044617A3 (en) * 2013-09-30 2015-06-04 Zodiac Aerotechnics System for managing the comfort equipment of an aircraft cabin, aircraft cabin and method for identifying and locating the comfort equipment
US20160236782A1 (en) * 2013-09-30 2016-08-18 Zodiac Aerotechnics System for managing the comfort equipment of an aircraft cabin, aircraft cabin and method for identifying and locating the comfort equipment
USD741795S1 (en) 2013-10-25 2015-10-27 Milwaukee Electric Tool Corporation Radio charger
US20150205352A1 (en) * 2013-12-29 2015-07-23 Immersion Corporation Distributed control architecture for haptic devices
CN104777897A (en) * 2013-12-29 2015-07-15 伊默森公司 Distributed control architecture for haptic devices
US9614800B1 (en) * 2014-01-17 2017-04-04 Rockwell Collins, Inc. Threaded datalink message display
WO2015154043A2 (en) 2014-04-04 2015-10-08 Systems And Software Enterprises, Llc Mobile device in-flight entertainment connection
WO2016009348A1 (en) * 2014-07-18 2016-01-21 De Cori Riccardo Video player for remotely viewing media
WO2017004328A1 (en) * 2015-06-30 2017-01-05 Bae Systems Controls Inc. Vehicle display
EP3290949A1 (en) * 2016-08-29 2018-03-07 The Boeing Company Local positioning with communication tags

Also Published As

Publication number Publication date Type
EP2659385A4 (en) 2015-03-11 application
CN103502967B (en) 2016-07-13 grant
CN103502967A (en) 2014-01-08 application
WO2012091961A1 (en) 2012-07-05 application
US9060202B2 (en) 2015-06-16 grant
EP2659385A1 (en) 2013-11-06 application

Similar Documents

Publication Publication Date Title
US20110022393A1 (en) Multimode user interface of a driver assistance system for inputting and presentation of information
US20090109126A1 (en) Multiple view display system
US20100162325A1 (en) In-Flight Entertainment System
US6396506B1 (en) Display operation method to change the number of images to be displayed and to independently change image direction and rotation of each image
US20120299870A1 (en) Wearable Heads-up Display With Integrated Finger-tracking Input Sensor
US20070182721A1 (en) Display Device, User Interface, and Method for Providing Menus
US20090322678A1 (en) Private screens self distributing along the shop window
US20100162326A1 (en) In-Flight Entertainment System
US20090288123A1 (en) Passenger tray convenience system
US5896129A (en) User friendly passenger interface including audio menuing for the visually impaired and closed captioning for the hearing impaired for an interactive flight entertainment system
US8547340B2 (en) Portable user control device and method for vehicle information systems
US20100088597A1 (en) Method and apparatus for configuring idle screen of portable terminal
US20100131188A1 (en) Navigation system and control method thereof
US20130152003A1 (en) Configurable dash display
US20070257889A1 (en) Display lit soft keys that have multiple states of activation potentially associated with each soft key
US20130293452A1 (en) Configurable heads-up dash display
US20110219409A1 (en) Aircraft in-flight entertainment system with enhanced seatback tray passenger control units and associated methods
US20100179864A1 (en) Multimedia, multiuser system and associated methods
EP2428947A2 (en) Terminal and contents sharing method for terminal
US20130063612A1 (en) In-flight system
US9292082B1 (en) Text-entry for a computing device
US20120066643A1 (en) System, method and apparatus for presenting a user interface
US9079749B2 (en) Simple node transportation system and node controller and vehicle controller therein
US20120132746A1 (en) Integrated User Interface System and Method
US9164619B2 (en) Configurable touch screen LCD steering wheel controls

Legal Events

Date Code Title Description
AS Assignment

Owner name: THALES AVIONICS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MONDRAGON, CHRISTOPHER K.;BLEACHER, BRETT;DALY, JOHN J.;AND OTHERS;REEL/FRAME:027399/0168

Effective date: 20111214