US20200118083A1 - Travel Environment Control - Google Patents

Travel Environment Control

Info

Publication number
US20200118083A1
Authority
US
United States
Prior art keywords
passenger
sensor
data
event
travel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/710,932
Inventor
Daniel JOBLING
Estelle LEVACHER
Glenn Morgan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
British Airways PLC
Original Assignee
British Airways PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by British Airways PLC
Priority to US16/710,932
Publication of US20200118083A1
Legal status: Abandoned

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D11/00 Passenger or crew accommodation; Flight-deck installations not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • G06Q10/109 Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1093 Calendar-based scheduling for persons or groups
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q3/00 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
    • B60Q3/40 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors specially adapted for specific vehicle types
    • B60Q3/41 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors specially adapted for specific vehicle types for mass transit vehicles, e.g. buses
    • B60Q3/44 Spotlighting, e.g. reading lamps
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q3/00 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
    • B60Q3/40 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors specially adapted for specific vehicle types
    • B60Q3/41 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors specially adapted for specific vehicle types for mass transit vehicles, e.g. buses
    • B60Q3/47 Circuits; Control arrangements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D11/00 Passenger or crew accommodation; Flight-deck installations not otherwise provided for
    • B64D11/0007 Devices specially adapted for food or beverage distribution services
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D11/00 Passenger or crew accommodation; Flight-deck installations not otherwise provided for
    • B64D11/06 Arrangements of seats, or adaptations or details specially adapted for aircraft seats
    • B64D11/0639 Arrangements of seats, or adaptations or details specially adapted for aircraft seats with features for adjustment or converting of seats
    • B64D11/064 Adjustable inclination or position of seats
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D13/00 Arrangements or adaptations of air-treatment apparatus for aircraft crew or passengers, or freight space, or structural parts of the aircraft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D13/00 Arrangements or adaptations of air-treatment apparatus for aircraft crew or passengers, or freight space, or structural parts of the aircraft
    • B64D13/06 Arrangements or adaptations of air-treatment apparatus for aircraft crew or passengers, or freight space, or structural parts of the aircraft the air being conditioned
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/02 Reservations, e.g. for tickets, services or events
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/02 Reservations, e.g. for tickets, services or events
    • G06Q10/025 Coordination of plural reservations, e.g. plural trip segments, transportation combined with accommodation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/14 Travel agencies
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D11/00 Passenger or crew accommodation; Flight-deck installations not otherwise provided for
    • B64D2011/0053 Cabin passenger reading lights

Definitions

  • the present invention relates to systems and methods for controlling a travel environment, such as for example in an aircraft cabin, so that the travel environment is personalised to the individual passenger.
  • Jet lag may also be addressed by controlling a passenger's sleep, eating and exercise patterns.
  • Mobile apps, such as ‘Jet Lag Fighter’ from Virgin Atlantic, allow the user to enter personal data such as age, gender and health status, and provide a personalised programme to alleviate jet lag.
  • What is desired is a system that facilitates greater efficiencies within the aircraft travel environment and enables improved control and personalisation of the passenger's travel environment, in particular for enhanced passenger wellness and wellbeing when flying.
  • a system for controlling the travel environment for a passenger in which passenger data is obtained from an existing source of customer data rather than requiring the passenger to enter their data manually.
  • the customer data may include information on the passenger's travel itinerary.
  • one or more sensor inputs may provide information on the environmental conditions in the vicinity of the passenger.
  • the system provides one or more outputs to improve the passenger's travel environment or experience.
  • the present invention provides a system for dynamic travel event scheduling, in which stored data including information relating to a passenger's itinerary is retrieved, the itinerary including at least one scheduled journey.
  • the system generates a dynamic event schedule based on the retrieved data, the dynamic event schedule including at least one event associated with at least one action output.
  • One or more sensor inputs are received, providing information on the physiological state of the passenger and/or environmental conditions in the vicinity of the passenger.
  • the system identifies one or more affected events of the dynamic event schedule based on the received sensor inputs, and provides one or more action outputs to control the passenger's travel environment based on the at least one event.
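  • By way of an illustrative sketch only (this is not part of the disclosure), the following Python fragment shows one possible way to represent such a dynamic event schedule, its events and their associated action outputs. All class, field and value names are assumptions introduced for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum
from typing import List


class EventType(Enum):
    SLEEP = "sleep"
    WAKE = "wake"
    STRETCH = "stretch"
    EXERCISE = "exercise"
    EAT = "eat"
    DRINK = "drink"
    STAY_AWAKE = "stay_awake"
    ENGAGE_IFE = "engage_ife"


@dataclass
class ActionOutput:
    """A control signal associated with an event, e.g. recline the seat or dim the lights."""
    target: str          # e.g. "seat_controller" or "environment_controller" (assumed names)
    command: str         # e.g. "recline", "set_lighting_level"
    parameters: dict


@dataclass
class ScheduledEvent:
    event_type: EventType
    start: datetime               # timing parameter that can later be adjusted
    duration: timedelta
    actions: List[ActionOutput] = field(default_factory=list)


@dataclass
class DynamicEventSchedule:
    passenger_id: str
    events: List[ScheduledEvent]

    def shift_event(self, event: ScheduledEvent, delta: timedelta) -> None:
        """Adjust an affected event's timing parameter, e.g. in response to sensor input."""
        event.start += delta
```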
  • the outputs to control the passenger's travel environment may comprise one or more of: signals to control one or more properties of a passenger seat, and signals to control lighting and/or air conditioning above and/or around the passenger's seat.
  • the at least one event may be selected from a set of predefined events including: sleep, wake, stretch, exercise, eat, drink, stay awake, and engage in-flight entertainment.
  • the sleep and wake events may be associated with respective action outputs to automatically control a recline position of the passenger's seat and a lighting level above or around the passenger's seat.
  • Each scheduled event may be associated with a respective timing parameter, and the system may be further operable to update the travel path data by adjusting the respective timing parameters of the one or more affected events.
  • the retrieved data may also include information relating to at least one of the passenger's personal preferences, an in-flight meal schedule, and an automated cabin lighting schedule.
  • the system may be further operable, when generating the data defining the dynamic event schedule, to generate auxiliary data for an event defining the associated action output.
  • a new event for the dynamic event schedule may be determined based on the received sensor inputs.
  • the system may be further operable to output the travel path data as an interactive interface.
  • the sensor inputs may be received from one or more of: a temperature sensor, a lighting sensor, a humidity sensor, a body movement sensor, a sleep phase sensor, an eye movement sensor, a heart rate sensor, a body temperature sensor, and an ingestible sensor.
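  • As a hedged illustration of how such heterogeneous sensor inputs might be represented uniformly before processing, the sketch below defines a simple reading type. The enumerated sensor kinds mirror the list above; the field names and helper are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional


class SensorKind(Enum):
    # Environmental sensors
    TEMPERATURE = "temperature"
    LIGHTING = "lighting"
    HUMIDITY = "humidity"
    # Passenger (physiological) sensors
    BODY_MOVEMENT = "body_movement"
    SLEEP_PHASE = "sleep_phase"
    EYE_MOVEMENT = "eye_movement"
    HEART_RATE = "heart_rate"
    BODY_TEMPERATURE = "body_temperature"
    INGESTIBLE = "ingestible"


@dataclass
class SensorReading:
    kind: SensorKind
    value: float
    unit: str
    timestamp: datetime
    passenger_id: Optional[str] = None   # None for cabin-wide environmental readings


def is_passenger_sensor(reading: SensorReading) -> bool:
    """Readings that vary with the passenger's physiological state."""
    return reading.kind in {
        SensorKind.BODY_MOVEMENT, SensorKind.SLEEP_PHASE, SensorKind.EYE_MOVEMENT,
        SensorKind.HEART_RATE, SensorKind.BODY_TEMPERATURE, SensorKind.INGESTIBLE,
    }
```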
  • a computer program comprising machine readable instructions arranged to cause a programmable device to become configured as one of the systems described above.
  • FIG. 1 is a block diagram of a system according to an embodiment of the invention.
  • FIG. 2 is a perspective view of a seating unit to which the system may be applied.
  • FIG. 3 is a block diagram of a mobile device for use in the embodiments of the invention.
  • FIG. 4 is a block diagram illustrating the processing modules of the mobile device of FIG. 3 according to an embodiment of the invention.
  • FIG. 5 is a block diagram of a server according to an alternative embodiment of the invention.
  • FIG. 6, which comprises FIGS. 6A and 6B, is a flow diagram illustrating processing steps performed by the mobile device of FIG. 4 according to an embodiment.
  • FIG. 7 schematically illustrates an example of an initial view of an interactive travel path displayed by the mobile device.
  • FIG. 8, which comprises FIGS. 8A and 8B, schematically illustrates an example of a detailed view of the interactive travel path in FIG. 7.
  • FIG. 9 is a diagram of an example of a computer system for use in embodiments of the invention.
  • FIG. 1 shows schematically the elements of a travel environment control system in an embodiment of the invention that relates to commercial air travel. Aspects of the invention may be applicable to other travel environments, such as trains, cars, buses, etc. At least some of the elements are optional, at least for certain applications.
  • an automated system accesses customer data and environmental data relating to the local environment of a passenger, for example from sensors. On the basis of these inputs, the system is able to control the local environment according to the specific requirements of the passenger and the properties of the local environment. The system is also able to determine an event timeline for the passenger's optimal wellness, indicating events along the timeline and associated timing aspects, and dynamically adjust the event timeline in response to passenger and environment data from the sensors. The events may define automated control of the local environment. Specific examples and applications will be described below.
  • a server 3, located on board an aircraft, is in communication with one or more in-cabin networks 5, which connect the server 3 to a passenger's mobile device 7 running a travel app 9, one or more sensors 11, a seat controller 13 and an environment controller 15.
  • An IFE (In-Flight Entertainment) unit 17 may also be connected to the server 3 .
  • the in-cabin network 5 may be a wired or wireless network, and may include one or more ad-hoc data connections between nodes in the network, such as the server 3 and the passenger's mobile device 7 .
  • the server 3 may also be in communication with a crew mobile device 19 via the in-cabin network(s) 5 .
  • the crew mobile device 19 may run a crew app 21 , as described in more detail below.
  • the in-cabin network 5 may include a mesh network, as is known in the art, with mesh nodes formed by one or more of the passenger's mobile device 7 and the crew mobile devices 19 , as well as sensors 11 , seat controllers 13 and environment controllers 15 adapted for ad-hoc network connectivity. It will be appreciated that such a mesh network can typically also include one or more mesh routers that forward data to and from one or more gateways, which may optionally be connected to the Internet.
  • the server 3 has access to customer data 23 and flight data 25 , for example from one or more local databases and/or remote databases 26 accessible via one or more external data networks 28 , such as the Internet.
  • the customer data 23 may comprise customer biometric details such as age, gender, height, weight, etc., health status, and personal preferences, such as dietary requirements, sleeping habits etc.
  • the customer data 23 may be provided by customer input, for example within a travel app 9 running on a passenger's mobile device 7 or via a web-based interface, or may be provided from a user profile within another service with which the user is registered, such as a social network.
  • the customer data 23 may also comprise location data relating to the current or last-known geographical location of the passenger or the passenger's mobile device 7 , for example from location tracking information received from the passenger's mobile device 7 .
  • the customer data 23 and flight data 25 may be stored in the passenger's mobile device 7, and may be updated when the travel app 9 is connected to the server 3.
  • the flight data 25 is linked to the customer data 23 and includes data relating to flights that the customer has booked, checked in for, or boarded.
  • the flight data 25 includes the timing and duration of the flight, as well as the departure and arrival points of the flight, and information of any connecting flights.
  • the flight data 25 may also include information associated with in-flight aspects, such as meal and/or cabin lighting schedules for the specific flight, as well as information associated with offers for the customer, such as available flight upgrades.
  • the travel app 9 on the passenger's mobile device 7 generates and outputs a dynamic event timeline based on the customer data 23 and flight data 25 , and enables passenger interaction with the dynamic event timeline.
  • the travel app 9 can also output control signals or messages to the seat controller 13 , the environment controller 15 and/or the crew mobile device 19 and crew app 21 .
  • the server 3 may be configured to generate data associated with the dynamic event timeline based on the customer data 23 and flight data 25 , and to transmit the generated data to the passenger's mobile device 7 for display by the travel app 9 .
  • the travel app 9 may be configured to run in the background, to collect and provide information to the server 3 on an ongoing basis, and to receive and process push updates and event triggers from the server 3 .
  • the seat controller 13 may automatically control, without direct user input, one or more properties of a passenger seat 31 , for example as shown in FIG. 2 .
  • the passenger seat 31 comprises a reclinable seat back 33 and a seat pan 35 that moves forward as the seat back 33 reclines, under the control of the seat controller 13 .
  • Arm rests 35 a , 35 b may drop as the seat reclines, again under the control of the seat controller 13 .
  • a foot rest 37 may drop or be adjustable, under the control of the seat controller 13 .
  • the seat may be contained within a housing 39 , and separated from adjacent seats by a retractable privacy screen 41 , both of which afford a degree of privacy to the passenger.
  • the IFE unit 17 is provided adjacent the seat 31 .
  • the environment controller 15 may control the environment around the passenger, for example by controlling lighting 43 and/or air conditioning 45 above and/or around the passenger's seat 31 .
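  • The disclosure does not specify a message format for these control signals; purely as an assumed example, the sketch below sends JSON commands to hypothetical seat controller (13) and environment controller (15) endpoints on the in-cabin network (5). Addresses, ports and payload fields are invented for illustration.

```python
import json
import socket
from typing import Any, Dict, Tuple

# Hypothetical endpoints for the seat controller (13) and environment controller (15)
# on the in-cabin network (5); addresses and message schema are assumptions.
SEAT_CONTROLLER: Tuple[str, int] = ("10.0.0.13", 9013)
ENVIRONMENT_CONTROLLER: Tuple[str, int] = ("10.0.0.15", 9015)


def send_control_message(endpoint: Tuple[str, int], payload: Dict[str, Any]) -> None:
    """Send a single newline-terminated JSON control message to a controller node."""
    with socket.create_connection(endpoint, timeout=2.0) as conn:
        conn.sendall(json.dumps(payload).encode("utf-8") + b"\n")


def recline_seat_for_sleep(seat_id: str) -> None:
    # Mirrors the sleep action described in the text: recline, lower armrests, raise footrest.
    send_control_message(SEAT_CONTROLLER, {
        "seat": seat_id, "command": "recline", "position": "sleep",
        "armrests": "lowered", "footrest": "raised",
    })


def dim_lights_and_cool(seat_id: str) -> None:
    # Dims local lighting (43) and adjusts air conditioning (45) around the seat.
    send_control_message(ENVIRONMENT_CONTROLLER, {
        "seat": seat_id, "lighting_level": 0.1, "air_conditioning": {"target_c": 21.0},
    })
```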
  • the sensor(s) 11 may include one or more environmental sensors for collecting and providing data relating to aspects of the environment on board the aircraft, either locally to the passenger or within the cabin as a whole, such as temperature, lighting, humidity, ambient noise and cabin air pressure sensors.
  • the sensor(s) 11 may also include one or more passenger sensors for collecting and providing data relating to aspects of the customer on board the aircraft, which can vary depending on the physiological state of the customer.
  • the passenger sensors may be wearable by the passenger, separate from the passenger, or included within the passenger's mobile device 7, and include sensors such as a body movement sensor, a sleep phase sensor, an eye movement sensor, a heart rate sensor, a body temperature sensor, and an ingestible sensor.
  • the passenger sensors may also be configured to collect data relating to aspects of the customer before boarding the flight, for example over a predefined period of time preceding a scheduled flight and/or on the day before boarding the scheduled flight.
  • the pre-flight collected data can be stored by the respective passenger sensors and/or provided to the passenger's mobile device 7 , for use in determining and scheduling events associated with the journey segments as will be described in more detail below.
  • the mobile device 7, 19 may be a smartphone, tablet, PDA (Personal Digital Assistant), or a wearable device such as a smart watch, an electronic wristband, or Google Glass™, etc.
  • FIG. 3 is a schematic diagram of one such exemplary mobile device 7 , 19 , having a processor 51 , memory 53 , a display screen 55 , user input module 57 , a location signal receiver 59 and communications interface(s) 61 .
  • the location signal receiver 59 may be a GPS based receiver for determining a geolocation of the mobile device 7 , 19 .
  • the mobile device 7 , 19 may also include one or more of: a microphone 63 , one or more sensors 65 , one or more sensor interfaces 67 that connect the mobile device 7 , 19 to respective sensors 11 , a speaker 69 and a camera 71 .
  • the travel app 9 and crew app 21 may be downloaded and installed to the memory 53 of the mobile device 7 , 19 , and may require registration of the user with the server 3 via the app, or secure log-in to the app by an existing registered user with the server 3 .
  • the server 3 may also be connected to a mobile data network (not shown), for communication with the mobile devices 7 , 19 .
  • the mobile devices 7, 19 are typically required to be placed in “flight mode” when the passenger and crew are on board the aircraft, and data communication during a flight may be restricted to connections via the in-cabin network(s) 5.
  • FIG. 4 is a block diagram illustrating processing modules 81 of the passenger's mobile device 7 in the present embodiment.
  • the passenger's mobile device 7 stores a local copy of the passenger's customer data 83 and flight data 85 that can be retrieved and processed by the processing modules 81 .
  • One or more of the processing modules 81 may be provided as integrated components of the passenger's travel app 9, or may be provided as separate mobile applications linked to the travel app 9. Additionally or alternatively, one or more of the processing modules 81 may be configured to receive data from respective modules in the server 3 for output by the passenger's travel app 9.
  • the passenger's mobile device 7 includes a travel path module 81 - 1 for generating a travel path for a passenger having a booked and/or purchased journey, including various booked entities such as an outbound flight from the passenger's home location to a destination location, a hotel reservation at the destination location, a return flight, etc.
  • the travel path for the passenger's booked journey may be an end-to-end plan, consisting of a plurality of journey segments from a departure point, such as the passenger's home location, to a destination point, such as a boarding gate in the departure airport terminal assigned to the passenger's flight or the location of a hotel booked in the destination city.
  • the travel path module 81-1 processes the customer and flight data 23, 25 to identify and determine the journey segments, as well as associated timing parameters, such as anticipated start time, duration, etc.
  • the travel path module 81-1 can also determine and schedule predefined events associated with the journey segments, that together define the timeline of scheduled events for the travel path.
  • the travel path module 81-1 can also dynamically revise and update the travel path and event timeline based on the monitored geographical location of the customer together with environmental information retrieved from a plurality of data sources, for example to take into account identified disruptions to the travel path.
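  • A minimal sketch of how a travel path might be decomposed into timed journey segments from a single booked flight is shown below; the segment labels and timing buffers are illustrative assumptions, not values taken from the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class JourneySegment:
    label: str                 # e.g. "home", "travel to airport", "in flight"
    start: datetime            # anticipated start time (timing parameter)
    duration: timedelta


def build_travel_path(flight_departure: datetime,
                      flight_duration: timedelta,
                      transfer_time: timedelta = timedelta(hours=1),
                      airport_buffer: timedelta = timedelta(hours=2)) -> List[JourneySegment]:
    """Derive a simple end-to-end travel path from one booked flight.

    The breakdown mirrors the description: home -> travel to airport -> in airport ->
    in flight -> in arrival airport. The buffer values are placeholders only.
    """
    leave_home = flight_departure - airport_buffer - transfer_time
    return [
        JourneySegment("home", leave_home - timedelta(hours=12), timedelta(hours=12)),
        JourneySegment("travel to departure airport", leave_home, transfer_time),
        JourneySegment("in departure airport", leave_home + transfer_time, airport_buffer),
        JourneySegment("in flight", flight_departure, flight_duration),
        JourneySegment("in arrival airport", flight_departure + flight_duration, timedelta(hours=1)),
    ]
```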
  • the travel path module 81 - 1 may generate data for a graphical user interface (GUI) representation of the travel path.
  • the generated travel path data may define an interactive graphical representation of the travel path to be displayed by the travel app 9 .
  • the passenger's mobile device 7 also includes a wellness planner module 81 - 3 for determining and scheduling events associated with the journey segments, based on a predefined set of rules that combine the customer data, flight data and sensor data to create a personalised travel experience for the passenger.
  • the wellness planning module 81 - 3 can also schedule predefined events associated with respective journey segments, based on factors such as passenger preferences, travel itinerary, passenger current physical state, etc.
  • the wellness planner module 81-3 can determine a sequence of events and associated scheduling information to assist a passenger with overcoming the effects of jet lag at the destination.
  • the predefined events may include sleep, wake, stretch, exercise, eat, drink, stay awake, engage in-flight entertainment, etc.
  • the predefined events may be determined for a plurality of journey segments, such as a segment of time before the scheduled flight, a segment of time in-flight, and a segment of time after arrival at the destination.
  • Post-flight events may also be scheduled, such as treatment bookings.
  • the wellness planner module 81 - 3 can also dynamically adjust and/or reschedule the events, such as the event order, start time, end time, duration, in response to received sensor input data.
  • the wellness planning module 81 - 3 can schedule in-flight events relating to optimal sleeping, eating and exercise patterns to assist with alleviating the passenger's jet lag at the destination.
  • the wellness planner module 81-3 can dynamically adjust the events in response to received passenger and environment data from the sensors indicating, for example, that the passenger is awake, asleep, hungry, nervous, hot, cold, uncomfortable, etc.
  • the wellness planner module 81-3 can determine a sequence of events and associated scheduling information to assist a passenger with a personalized and more comfortable in-flight experience.
  • the predefined events may include sleep suggestion, wake-up alarm, personalised sound/audio file output, in-flight exercise programme.
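  • The following fragment sketches one simplistic heuristic for scheduling such predefined in-flight events around the flight duration and the time-zone shift; the thresholds and event timings are assumptions and do not represent the rules actually used by the system.

```python
from datetime import datetime, timedelta
from typing import List, Tuple


def plan_in_flight_events(takeoff: datetime,
                          flight_duration: timedelta,
                          time_shift_hours: int) -> List[Tuple[str, datetime]]:
    """Very rough in-flight event plan aimed at easing adjustment to the destination time zone.

    A large eastbound shift favours sleeping early in the flight; a westbound or small
    shift favours staying awake longer. All thresholds are illustrative only.
    """
    events = [("welcome drink", takeoff + timedelta(minutes=30)),
              ("meal", takeoff + timedelta(hours=1, minutes=30))]

    if time_shift_hours >= 3:          # eastbound: destination clock is ahead
        sleep_start = takeoff + timedelta(hours=3)
    else:                              # westbound or small shift: delay sleep
        sleep_start = takeoff + flight_duration / 2

    wake = min(sleep_start + timedelta(hours=7),
               takeoff + flight_duration - timedelta(hours=2))
    events += [("sleep", sleep_start),
               ("wake", wake),
               ("in-flight entertainment", wake + timedelta(minutes=15)),
               ("meal", takeoff + flight_duration - timedelta(hours=1, minutes=30))]
    return sorted(events, key=lambda e: e[1])
```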
  • the scheduled events can also include automated commands and/or instructions to external controllers and output devices.
  • for example, such commands can be output to the seat controller 13, the environment controller 15 and/or the IFE unit 17, as described further below with reference to the control system 87 of FIG. 5.
  • the wellness planner module 81 - 3 can also directly control aspects of the passenger's local environment defined by the scheduled events, such as one or more properties of the passenger seat 31 via a seat controller module 81 - 7 in communication with the seat controller 13 over the in-cabin network 5 , and one or more properties of the environment around the passenger via an environment controller module 81 - 9 in communication with the environment controller 15 over the in-cabin network 5 .
  • the seat controller module 81 - 7 and the environment controller module 81 - 9 may be configured to communicate control instructions to the seat controller 13 and the environment controller 15 , respectively, via the server 3 .
  • the mobile device 7 can include a module to enable the passenger to control, via direct user input, properties of the passenger seat 31 and/or the environment.
  • the passenger's mobile device 7 can also include an IFE interface module 81 - 11 for communicating data with the IFE 17 over the in-cabin network 5 .
  • the passenger's mobile device 7 can include a cockpit simulator module 81 - 13 for displaying an interactive cockpit simulator via the passenger's mobile device 7 , for example based on input data from sensors located about the aircraft and non-sensitive information received from the aircraft's cockpit system(s).
  • FIG. 5 is a block diagram of the server 3 , illustrating the processing modules of the server 3 in an alternative embodiment.
  • the server 3 includes a control system 87 that retrieves customer data 23 and flight data 25 for a plurality of registered users via respective database interfaces 88 , 89 , and receives sensor data from sensors 11 via a sensor interface module 90 .
  • the control system 87 includes a travel path module 91 - 1 for determining travel paths and associated scheduled events for the plurality of passengers, based on the retrieved customer and flight data 23 , 25 and the received sensor data, and for determining and scheduling predefined events associated with the journey segments.
  • the control system 87 also includes a wellness planner module 91 - 3 for processing the data based on specific requirements of the respective passengers and scheduling predefined events for the respective passenger's optimal wellbeing, and for automatically controlling the respective passenger's local in-flight environment according to the scheduled events.
  • the travel path module 91 - 1 of the control system 87 can generate data for the interactive GUI representation of the travel path, for example based on scheduling data determined by the wellness planner module 91 - 3 .
  • the generated travel path data is communicated to the travel app 9 on the passenger's mobile device 7 , via a travel app interface module 92 .
  • the control system 87 can also output control signals associated with scheduled event actions to the seat controller 13 via a seat controller interface 93 , to the environment controller 15 via an environment controller interface module 94 , and to the IFE unit 17 via an IFE module 95 . Feedback data and control signals may also be received from the seat controller 13 , environment controller 15 and IFE unit 17 via the respective modules.
  • the travel app interface module 92 can receive and process data in response to user input via the travel app 9 , for example, to search for and retrieve flight data 25 , retrieve and/or update customer data 23 , book or purchase a new flight, re-book a flight at a new time and/or date, etc.
  • the generated travel path data may define user-selectable elements of the travel path, associated with the scheduled events for example, for display by the travel app 9 based on one or more predefined travel path GUI templates.
  • the generated travel path data may consist of scheduling data elements in a structured data format, such as XML, CSV, etc.
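  • As an assumed example of such a structured data format, the snippet below flattens a few scheduling data elements into CSV rows of the kind a travel path GUI template could consume; the column names and values are illustrative only.

```python
import csv
import io

# Hypothetical flattening of generated travel path data into CSV; column names are assumptions.
FIELDS = ["segment", "event", "start_utc", "duration_min", "action_target", "action_command"]

rows = [
    {"segment": "in flight", "event": "sleep", "start_utc": "2015-03-24T21:30:00Z",
     "duration_min": 420, "action_target": "seat_controller", "action_command": "recline"},
    {"segment": "in flight", "event": "wake", "start_utc": "2015-03-25T04:30:00Z",
     "duration_min": 15, "action_target": "environment_controller", "action_command": "raise_lighting"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```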
  • Reference is now made to FIG. 6, which comprises FIGS. 6A and 6B, for an example computer-implemented wellness planning and environment control process using the passenger's mobile device 7, and to FIGS. 7 and 8, which schematically illustrate exemplary dynamic travel paths displayed by the travel app 9 on the passenger's mobile device 7.
  • the process begins at step S 6 - 1 where the passenger's mobile device 7 loads the travel app 9 , for example in response to a user command to launch the app.
  • the travel app 9 may require the customer to login with pre-registered details.
  • the travel path module 81 - 1 on the mobile device 7 retrieves customer data 23 for the customer registered with the travel app 9 , for example in response to a user command to display an interactive travel plan interface via the travel app 9 .
  • the retrieved customer data 23 includes information relating to the passenger's next booked journey, such as details of the outbound and return flights that are booked for the journey.
  • the mobile device 7 is configured to plan and generate a travel path for the passenger's booked journey that can be displayed in an interactive travel path interface of the travel app 9 .
  • the travel path module 81 - 1 retrieves flight data 25 including information relating to the passenger's next flight in the retrieved booked journey.
  • the travel path module 81 - 1 may also retrieve data from additional sources, such as terminal information relating to the departure and arrival airport terminals of the passenger's next flight.
  • the travel path module 81-1 processes the retrieved data and determines a plurality of journey segments for the booked journey.
  • a booked journey between departure and destination locations can be processed into a plurality of high level journey segments, based on information relating to the outbound and return flights, such as time and date, flight number, carrier, airport, etc.
  • the travel app 9 displays an initial view of the interactive travel plan, including the retrieved information relating to the passenger's next booked journey.
  • FIG. 7 schematically illustrates one example of an initial view of the interactive travel path displayed by the travel app 9 on the passenger's mobile device 7 .
  • the high level segments of this initial view include a plurality of in-flight segments 107 corresponding to discrete time periods when the customer is on-board a respective booked flight, and intervening ground segments 109 corresponding to discrete time periods between booked flights.
  • the travel path is presented as a scrollable ribbon interface 101 , with a horizontal dynamic time axis 103 indicating the location along the ribbon corresponding to the current time 105 .
  • the ribbon interface 101 may instead be displayed in a vertical orientation.
  • the customer data 23 includes information relating to a booked journey to Malibu, Calif., with an outbound flight departing today from London's Heathrow Airport and arriving at Los Angeles International Airport, displayed as a first raised segment 107 - 1 of the ribbon interface 101 .
  • the customer data 23 also includes information relating to the return flight in six weeks' time, displayed as a second raised segment 107-2, with a corresponding indication on the time axis 103.
  • the booked journey may also include details of a hotel reservation while the customer is at the destination, displayed as a lower segment 109 - 2 between the respective raised segments 107 - 1 , 107 - 2 .
  • a lower segment 109 - 1 precedes the raised segment 107 - 1 associated with the outbound flight, indicating that the customer was at a predefined home location, London, UK in this example.
  • the raised segments 107 correspond to in-flight segments of the passenger's booked journey and the lower segments 109 correspond to ground segments of the journey.
  • the user can scroll the ribbon interface 101 along the horizontal axis, for example via the user input module 57, to view the passenger's past and future booked journeys.
  • each raised segment 107 of the ribbon interface 101 may be a user-selectable element of the interface in order to retrieve and view more data relating to the associated flight.
  • the ribbon interface 101 may be configured to process user input commands to zoom into the travel path at a selected position to retrieve and view more data relating to the segment 107 , 109 at that position, and to zoom out to return to the previous or initial view.
  • the travel path module 81-1 can also process each high level journey segment to determine a respective plurality of lower level journey segments, and to identify one or more defined and/or anticipated geographical locations associated with each lower level journey segment.
  • FIG. 8, which consists of FIGS. 8A and 8B, schematically illustrates an example of a zoomed-in view of the interactive travel path displayed by the travel app 9.
  • a first high level ground segment 109-1, prior to the passenger's outbound flight segment 107-1 from London to Los Angeles, is broken down into three discrete and sequential lower level segments 111, as illustrated in the first portion 101a of the ribbon interface in FIG. 8A.
  • the first lower level segment 111 - 1 is associated with a discrete time period of the travel path when the customer is at a predefined home location, for example the passenger's home city or home address.
  • the second lower level segment 111 - 2 is associated with the subsequent time period of the travel path when the customer is, or should be, travelling to the departure airport terminal.
  • the third lower level segment 111 - 3 is associated with the subsequent time period of the travel path when the customer is in the airport terminal.
  • the high level ground segment 109-2 after the passenger's outbound flight segment 107-1 from London to Los Angeles is also broken down into three lower level segments 113, as illustrated in the second portion 101b of the ribbon interface in FIG. 8B.
  • the first lower level segment 113 - 1 is associated with the time period of the travel path when the customer is in the destination airport terminal
  • the second lower level segment 113 - 2 is associated with the subsequent time period when the customer will be travelling to the hotel in the destination city
  • the third lower level segment 113 - 3 is associated with the subsequent time period when the customer arrives at the hotel.
  • Each journey segment is associated with a respective time or time period along the time axis 103 , which may be calculated relative to the current time 105 , based on the retrieved and processed data.
  • the travel path module 81-1 can also determine one or more events for respective journey segments. Each event is also associated with a time or time period along the time axis 103, which may be calculated relative to the current time 105, based on the retrieved data. For example, as illustrated in FIG. 8, which comprises FIGS. 8A and 8B, the in-flight segment 107-1 includes a sequence of predefined events 121 that are scheduled at respective times during the flight, such as a welcome drink event 121-1 shortly after boarding or take-off, a first dining event 121-2, a sleep event 121-3, a wake event 121-4, an IFE event 121-5 after the passenger has woken up, and a second dining event 121-6.
  • Events 115 can also be determined for the ground segments 109 .
  • the first lower level segment 111 - 1 of the first ground segment 109 - 1 includes an event 115 - 1 associated with an earliest possible and/or recommended time for the customer to proceed with online check-in for the outbound flight.
  • the second lower level segment 111-2 of the first ground segment 109-1 includes an event 115-2 associated with a recommended route to the departure airport terminal, for example as determined by the travel path module 81-1 or by the travel app 9 based on the passenger's current geo-location 29.
  • the third lower level segment 111 - 3 of the first ground segment 109 - 1 includes one or more events 115 - 3 associated with respective navigation stages that the customer must progress through the departure airport terminal, such as bag drop, passport control and security, before arriving at the departure gate assigned to the outbound flight.
  • a plurality of events 115 are determined for the second ground segment 109 - 2 .
  • the first lower level segment 113 - 1 of the second ground segment 109 - 2 includes one or more events 115 - 4 associated with respective stages that the customer must progress through the arrival airport terminal, such as the arrival gate assigned to the flight, passport control and the baggage reclaim belt or area assigned to the flight.
  • the second lower level segment 113 - 2 of the second ground segment 109 - 2 includes an event 115 - 5 associated with a recommended route from the arrival airport terminal to the hotel at the destination.
  • the third lower level segment 113 - 3 of the second ground segment 109 - 2 includes an event 115 - 6 associated with an anticipated time of check-in at the hotel, for example as calculated by the travel path module 81 - 1 .
  • the travel path module 81-1 can retrieve data from one or more third-party data sources, such as traffic, public transport, weather and airport terminal data, and process the retrieved data to determine and schedule the plurality of predefined events for the ground segments 109 of the passenger's journey.
  • the determination of events that can be scheduled for the passenger's journey may depend on the availability of data from the third-party data sources for the geographical locations along the travel path.
  • the wellness planner module 81 - 3 retrieves passenger and flight details from customer data 83 and flight data 85 , such as information relating to the passenger's registered details and personal preferences, travel itinerary, meal schedule, cabin lighting schedule, etc.
  • the wellness planner module 81 - 3 receives sensor data from one or more sensors 11 via the sensor interface module 81 - 5 . The received sensor data will vary depending on the availability of sensors associated with the passenger and/or passenger's local environment.
  • the wellness planner module 81 - 3 determines and schedules one or more predefined events for the in-flight journey segment 107 - 1 , based on the retrieved passenger details from customer data 83 , flight details from flight data 85 , and the received sensor data.
  • the wellness planner module 81 - 3 identifies events that are associated with one or more predefined actions, and generates auxiliary data for the identified events that can be displayed or transmitted by the travel app 9 in response to a user command to select the respective event 115 , 121 from the ribbon interface 101 a .
  • auxiliary data can be generated for the sleep event 121 - 3 illustrated in FIG. 8A , defining control instructions that are transmitted to the seat controller 13 to recline the seat to a sleeping position, as well as control instructions that are transmitted to the environment controller 15 to dim the lights, in response to user input selection of the event 121 - 3 from the interactive display.
  • auxiliary data can be generated to suggest and guide the passenger through one or more exercise routines, based on sensor input feedback relating to the passenger's body movements and heart rate.
  • Table 1 sets out a number of exemplary data and sensor input parameters that may be processed by the wellness planner module 81 - 3 to determine and schedule one or more respective event outputs. It will be appreciated that many other combinations of predefined data inputs, environment sensor data inputs, and changeable passenger sensor data inputs due to physiological states, can be used to determine and trigger system responses and events.
  • TABLE 1

    Data inputs | Environment sensor inputs | Passenger sensor inputs | Event output(s)
    Biometric data, travel itinerary, personal preferences | - | Body movements, heart rate | Sleep/wake event, seat control, lighting control
    Biometric data, personal preferences | - | Body movements | Seat control
    Biometric data, meal schedule | Cabin lighting, ambient noise, local temperature | Body movements, heart rate | Wake suggestion event, seat control, lighting control
    Biometric data, meal schedule | Local temperature | Body movements, heart rate | Drink/meal suggestion event, cabin crew instructions
    Biometric data, personal preferences | Local temperature, humidity, cabin pressure | - | Air conditioning control
    Biometric data, travel itinerary, personal preferences | Air pressure, altitude | Body movements, heart rate | Exercise suggestion/routine event
    Biometric data, travel itinerary, personal preferences | Ambient temperature | Body temperature, stomach acidity | Post-flight treatment booking, sleep/meal management suggestion events
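  • A toy encoding of this kind of rule table is sketched below; the passenger-state classifier, rule keys and thresholds are invented for illustration and are not the combinations defined in Table 1.

```python
from typing import Dict, Tuple


def classify_passenger_state(heart_rate_bpm: float, movement_level: float) -> str:
    # Illustrative thresholds only: low movement and low heart rate taken to mean "asleep".
    if movement_level < 0.1 and heart_rate_bpm < 60:
        return "asleep"
    if movement_level < 0.3:
        return "resting"
    return "active"


# (passenger state, environment condition) -> event output; a toy stand-in for Table 1.
RULES: Dict[Tuple[str, str], str] = {
    ("asleep", "lights_on"):   "lighting control: dim local lighting",
    ("resting", "warm_cabin"): "air conditioning control: lower local temperature",
    ("active", "pre_meal"):    "drink/meal suggestion event, cabin crew instruction",
    ("resting", "long_still"): "exercise suggestion/routine event",
}


def decide_event(state: str, condition: str) -> str:
    return RULES.get((state, condition), "no action")


# Example: a resting passenger in a warm cabin triggers an air conditioning adjustment.
print(decide_event(classify_passenger_state(58, 0.2), "warm_cabin"))
```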
  • a passenger's mobile device 7 may transmit data identifying the passenger's recorded body temperature to a processing node, such as the server 3 or a cabin crew's mobile device 19 , via the in-flight network 5 .
  • the processing node may then determine a temperature adjustment for that passenger based on the received temperature data, and in response, transmit control instructions to the cabin environment controller 15 to effect a change of environment temperature automatically.
  • the crew app 21 on the crew mobile device 19 may prompt the cabin crew to manually adjust the temperature controls based on the calculated needs of the passenger.
  • the processing node may instead, or additionally, be configured to receive temperature data generated from a plurality of passenger devices and sensors located in a particular cabin, along with input data from the associated passengers indicative of a vote on the desired cabin temperature, and to determine environment temperature adjustments based on the group of passenger votes.
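  • One simple way such group input could be combined is sketched below, using the median of passenger votes and a bounded adjustment step; the aggregation strategy and numeric constants are assumptions rather than the method defined by the system. The median keeps a single extreme vote from dominating, and the step limit keeps changes gradual.

```python
from statistics import median
from typing import Dict, List


def aggregate_cabin_temperature(body_temps_c: List[float],
                                votes_c: Dict[str, float],
                                current_setpoint_c: float,
                                max_step_c: float = 1.0) -> float:
    """Combine per-passenger body temperatures and explicit temperature votes into a
    bounded adjustment of the cabin temperature setpoint (all values are assumed units, °C)."""
    desired = median(list(votes_c.values())) if votes_c else current_setpoint_c
    if body_temps_c:
        # Nudge the target slightly downward if passengers run warm, upward if cool.
        avg_body = sum(body_temps_c) / len(body_temps_c)
        desired -= 0.5 * (avg_body - 37.0)
    change = max(-max_step_c, min(max_step_c, desired - current_setpoint_c))
    return round(current_setpoint_c + change, 1)


# Example: three seat votes and two body-temperature readings move a 23.0 °C setpoint
# down by at most one degree per adjustment.
print(aggregate_cabin_temperature([37.4, 37.6], {"12A": 21.0, "12B": 22.0, "12C": 20.5}, 23.0))
```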
  • the passenger may also be informed of the changes to the travel environment and an aggregate body temperature, for example via data received and displayed by their associated mobile device 7 and/or the IFE unit 17.
  • for example, where the passenger's mobile device 7 is monitoring his or her biorhythms, the system 1 may use data received from the passenger's mobile device 7 to detect or determine when the passenger is about to sleep or wake, and in response, send control instructions to adjust the lighting around the passenger, as well as inform a crew member to prepare a beverage service for when the passenger awakes, or not to disturb the passenger when asleep.
  • the system 1 may be configured to detect the passenger's emotional and physical state, and in response, determine event output(s) and indicate to the cabin crew to attend to a nervous or stressed passenger's needs appropriately.
  • the system may be further configured to retrieve and process historical data associated with the passenger, travel environment, and/or journey.
  • the server can be configured to retrieve recorded sensor and/or user input data that has been previously collected, outside of the travel environment, and stored on the passenger's mobile device or accessible from a remote server via an API.
  • the received historical data may then used along with sensor data collected within the travel environment to make an optimal decision of one or more changes to the environment based on the passenger's detected or determined wellness.
  • historical data associated with the passenger such as previous geo-locations and health data collected by the passenger's mobile device over a period of time, may be retrieved and processed by the server, together with received real-time sensor inputs, in order to determine or modify a personalised programme or event schedule to aid with combating travel fatigue.
  • the travel path module 81 - 1 can also be configured to generate auxiliary data for events associated with the ground segments 109 , such as information relating to specific navigation, routing and timing, for example to and within the airport terminal.
  • auxiliary data may include a link to a website or an external mobile app, such as a flight online check-in website, a hotel website or app with information relating to the hotel reservation, a public transport website or app with additional route, time and map information, a dedicated map website or app, etc.
  • the travel app 9 displays the detailed view of the generated travel path for the current journey in the interactive user interface, including the user-selectable events 115 , 121 associated with the journey segments 111 , 113 of the travel path.
  • the travel app 9 receives and processes user interactions with the travel path interface 101 and user-selectable events 115 , 121 of the interface, for example to handle user commands to scroll and/or zoom the displayed portion of the travel path, and in response, retrieves and executes the one or more actions associated with a user selected event 115 , 121 , at step S 6 - 25 as necessary.
  • the travel app 9 also monitors the scheduled events 115 , 121 to identify events that are scheduled for action at the current time, and automatically retrieves and executes the one or more actions associated with any identified scheduled event 115 , 121 , at step S 6 - 29 as necessary.
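  • A minimal sketch of such a monitoring step is given below, assuming a simple polled timeline of (time, name, action) entries whose actions are executed once their scheduled time has arrived; the polling approach and all names are illustrative assumptions.

```python
import time
from datetime import datetime, timezone
from typing import Callable, List, Tuple

Action = Callable[[], None]
TimelineEntry = Tuple[datetime, str, Action]   # (scheduled UTC time, event name, action)


def run_timeline(timeline: List[TimelineEntry], poll_seconds: float = 30.0) -> None:
    """Execute each timeline action when its scheduled time is reached.

    Scheduled times are expected as timezone-aware UTC datetimes so they can be
    compared against datetime.now(timezone.utc).
    """
    pending = sorted(timeline, key=lambda entry: entry[0])
    while pending:
        now = datetime.now(timezone.utc)
        while pending and pending[0][0] <= now:
            _, name, action = pending.pop(0)
            print(f"executing scheduled event: {name}")
            action()
        time.sleep(poll_seconds)
```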
  • the system is also configured to dynamically adjust the travel path and scheduled events in response to received sensor data.
  • the wellness planner module 81 - 3 can receive, at step S 6 - 31 , information from sensors 11 relating to an increased heart rate, indicative of the passenger waking up from a sleep cycle before the scheduled wake event.
  • the wellness planner module 81 - 3 determines and reschedules one or more affected events 121 for the in-flight segment 107 of the passenger's booked journey, at step S 6 - 33 as necessary.
  • the affected events 121 can include the wake event 121 - 4 and the IFE event 121 - 5 that are brought forward along the timeline, by adjustment of the respective timing parameters.
  • the passenger can be prompted to accept the wake event 121 - 4 , and in response, the system can automatically adjust the passenger's seat 31 , lighting 43 and air-conditioning 45 based on the passenger's preferences. Additionally, the wellness planner module 81 - 3 can determine one or more new events 121 to be inserted into the event timeline, such as suggested refreshments that can be ordered by the passenger and automatically transmitted to the crew app 21 .
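  • The rescheduling behaviour just described can be illustrated with the following sketch, which brings the wake event and everything after it forward when sensors indicate an early wake, and inserts a refreshment suggestion; the event representation and the ten-minute offset are assumptions.

```python
from datetime import datetime, timedelta
from typing import Dict, List


def reschedule_on_early_wake(events: List[Dict], woke_at: datetime) -> List[Dict]:
    """If sensors indicate the passenger woke before the scheduled wake event, bring the
    wake event (and subsequent events) forward and insert a refreshment suggestion.

    Each event is a dict with 'name' and 'start' keys; this structure is illustrative.
    """
    events = sorted(events, key=lambda e: e["start"])
    wake_index = next((i for i, e in enumerate(events) if e["name"] == "wake"), None)
    if wake_index is None or events[wake_index]["start"] <= woke_at:
        return events                                  # nothing to bring forward

    shift = events[wake_index]["start"] - woke_at      # how early the passenger woke
    for event in events[wake_index:]:
        event["start"] -= shift                        # adjust affected timing parameters
    events.insert(wake_index + 1,
                  {"name": "refreshment suggestion", "start": woke_at + timedelta(minutes=10)})
    return sorted(events, key=lambda e: e["start"])
```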
  • the wellness planner module 81 - 3 updates the travel path and associated scheduling data based on the identified and rescheduled events 121 .
  • the wellness planner module 81 - 3 generates or updates auxiliary data for any new and affected events, for example including options for the passenger to override the automatically rescheduled event.
  • the travel app 9 displays the updated travel path and the process returns to step S 6 - 23 where the travel app 9 continues to monitor and respond to user interactions, sensor inputs and/or further scheduled events.
  • the system described herein may comprise a computer system 600 as shown in FIG. 9.
  • Embodiments of the present invention may be implemented as programmable code for execution by the computer system 600 .
  • Various embodiments of the invention are described in terms of this example computer system 600 . After reading this description, it will become apparent to a person skilled in the art how to implement the invention using other computer systems and/or computer architectures.
  • Computer system 600 includes one or more processors, such as processor 604 .
  • Processor 604 may be any type of processor, including but not limited to a special purpose or a general-purpose digital signal processor.
  • Processor 604 is connected to a communication infrastructure 606 (for example, a bus or network).
  • Computer system 600 also includes a main memory 608 , preferably random access memory (RAM), and may also include a secondary memory 610 .
  • Secondary memory 610 may include, for example, a hard disk drive 612 and/or a removable storage drive 614 , representing a floppy disk drive, a magnetic tape drive, an optical disk drive, etc.
  • Removable storage drive 614 reads from and/or writes to a removable storage unit 618 in a well-known manner.
  • Removable storage unit 618 represents a floppy disk, magnetic tape, optical disk, etc., which is read by and written to by removable storage drive 614 .
  • removable storage unit 618 includes a computer usable storage medium having stored therein computer software and/or data.
  • secondary memory 610 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 600 .
  • Such means may include, for example, a removable storage unit 622 and an interface 620 .
  • Examples of such means may include a program cartridge and cartridge interface (such as that previously found in video game devices), a removable memory chip (such as an EPROM, or PROM, or flash memory) and associated socket, and other removable storage units 622 and interfaces 620 which allow software and data to be transferred from removable storage unit 622 to computer system 600 .
  • the program may be executed and/or the data accessed from the removable storage unit 622 , using the processor 604 of the computer system 600 .
  • Computer system 600 may also include a communication interface 624 .
  • Communication interface 624 allows software and data to be transferred between computer system 600 and external devices. Examples of communication interface 624 may include a modem, a network interface (such as an Ethernet card), a communication port, a Personal Computer Memory Card International Association (PCMCIA) slot and card, etc.
  • Software and data transferred via communication interface 624 are in the form of signals 628 , which may be electronic, electromagnetic, optical, or other signals capable of being received by communication interface 624 . These signals 628 are provided to communication interface 624 via a communication path 626 .
  • Communication path 626 carries signals 628 and may be implemented using wire or cable, fibre optics, a phone line, a wireless link, a cellular phone link, a radio frequency link, or any other suitable communication channel. For instance, communication path 626 may be implemented using a combination of channels.
  • The terms “computer program medium” and “computer usable medium” are used generally to refer to media such as removable storage drive 614, a hard disk installed in hard disk drive 612, and signals 628. These computer program products are means for providing software to computer system 600. However, these terms may also include signals (such as electrical, optical or electromagnetic signals) that embody the computer program disclosed herein.
  • Computer programs are stored in main memory 608 and/or secondary memory 610 . Computer programs may also be received via communication interface 624 . Such computer programs, when executed, enable computer system 600 to implement the present invention as discussed herein. Accordingly, such computer programs represent controllers of computer system 600 . Where the invention is implemented using software, the software may be stored in a computer program product and loaded into computer system 600 using removable storage drive 614 , hard disk drive 612 , or communication interface 624 , to provide some examples.
  • the invention can be implemented as control logic in hardware, firmware, or software or any combination thereof.
  • the apparatus may be implemented by dedicated hardware, such as one or more application-specific integrated circuits (ASICs) or appropriately connected discrete logic gates.
  • a suitable hardware description language can be used to implement the method described herein with dedicated hardware.
  • the server determines the travel path and associated scheduling data for display by the travel app.
  • some of the processing steps performed by the travel path generator module and/or the planning sub-module in the above embodiment can instead or additionally be performed by the processing modules of the passenger's travel app.
  • the dynamic travel path module on the mobile device can be configured to generate data identifying the sequence of journey segments and associated events and to determine scheduling information relating to estimations of timing and duration.
  • the mobile device would not need to communicate with the server in order to monitor the passenger and environment input data, dynamically adjust the travel path, and/or control the local environment.
  • the server may not include a travel path module and wellness planner module.
  • the travel path is displayed by the travel app on the passenger's mobile device.
  • the travel path and scheduling data may be automatically transmitted to the passenger's IFE unit.
  • a server is configured to determine travel paths and associated scheduled events for the plurality of passengers, based on retrieved customer and travel data and the received sensor data, and to determine, schedule and control predefined events associated with the journey segments.
  • one or more nodes of the in-cabin network may instead be configured to carry out the functionalities described in the above embodiment for dynamic travel event scheduling based on determined passenger wellness.
  • the cabin crew mobile devices may be configured as distributed, or peer-to-peer (P2P), computing nodes that communicate and coordinate actions by passing data messages over the in-cabin network.
  • the cabin crew mobile devices may receive environment and passenger sensor input data from sensors and device nodes on the network, and transmit control instructions to seat and environment controller nodes on the network.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Development Economics (AREA)
  • Mechanical Engineering (AREA)
  • Pulmonology (AREA)
  • Data Mining & Analysis (AREA)
  • Primary Health Care (AREA)
  • Traffic Control Systems (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Navigation (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A system and method for controlling the travel environment for a passenger are described, in which passenger data is obtained from an existing source of stored data, the stored data including information on the passenger's itinerary. One or more sensor inputs are received, providing information on the physiological state of the passenger and/or environmental conditions in the vicinity of the passenger. One or more outputs are provided to control the passenger's travel environment based on the passenger data and the one or more sensor inputs. A system and method of dynamic travel event scheduling are also described, in which a dynamic event schedule is generated based on the retrieved data, the dynamic event schedule including at least one event associated with at least one action output.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 15/128,946, filed Sep. 23, 2016, which is a national phase entry of PCT/GB2015/050882 filed Mar. 24, 2015, which claims priority to GB 1405201.3, filed Mar. 24, 2014, each of which is incorporated herein by reference in its entirety.
  • FIELD OF THE INVENTION
  • The present invention relates to systems and methods for controlling a travel environment, such as for example in an aircraft cabin, so that the travel environment is personalised to the individual passenger.
  • BACKGROUND OF THE INVENTION
  • In the field of passenger travel, various measures have been introduced to improve comfort and convenience for the passenger, such as by allowing the seat to be modified between different positions as disclosed for example in WO-A-2007/072045 and WO-A-2009/066054. Various alternative measures for improving the passenger environment have been studied in the SEAT (Smart tEchnologies for stress free Air Travel) project. It is also known from U.S. Pat. No. 7,878,586 to store user profile data to control the environment automatically, although it is not known whether these proposals have ever been put into operation.
  • One aim of controlling the travel environment has been to alleviate travel fatigue, also commonly referred to as ‘jet lag’, for example by controlling lighting within an aircraft cabin. Jet lag may also be addressed by controlling a passenger's sleep, eating and exercise patterns. Mobile apps, such as ‘Jet Lag Fighter’ from Virgin Atlantic, allow the user to enter personal data such as age, gender and health status, and provide a personalised programme to alleviate jet lag.
  • What is desired is a system that facilitates greater efficiencies within the aircraft travel environment and enables improved control and personalisation of the passenger's travel environment, in particular for enhanced passenger wellness and wellbeing when flying.
  • STATEMENTS OF THE INVENTION
  • Aspects of the present invention are set out in the accompanying claims.
  • According to one aspect, there is provided a system for controlling the travel environment for a passenger, in which passenger data is obtained from an existing source of customer data rather than requiring the passenger to enter their data manually. The customer data may include information on the passenger's travel itinerary. Additionally, one or more sensor inputs may provide information on the environmental conditions in the vicinity of the passenger. On the basis of the customer data and the sensor input(s), the system provides one or more outputs to improve the passenger's travel environment or experience.
  • According to another aspect of the present invention, the present invention provides a system for dynamic travel event scheduling, in which stored data including information relating to a passenger's itinerary is retrieved, the itinerary including at least one scheduled journey. The system generates a dynamic event schedule based on the retrieved data, the dynamic event schedule including at least one event associated with at least one action output. One or more sensor inputs are received, providing information on the physiological state of the passenger and/or environmental conditions in the vicinity of the passenger. In response, the system identifies one or more affected events of the dynamic event schedule based on the received sensor inputs, and provides one or more action outputs to control the passenger's travel environment based on the at least one event.
  • The outputs to control the passenger's travel environment may comprise one or more signals to: control one or more properties of a passenger seat, and control lighting and/or air conditioning above and/or around the passenger's seat.
  • The at least one event may be selected from a set of predefined events including: sleep, wake, stretch, exercise, eat, drink, stay awake, and engage in-flight entertainment. The sleep and wake events may be associated with respective action outputs to automatically control a recline position of the passenger's seat and a lighting level above or around the passenger's seat.
  • Each scheduled event may be associated with a respective timing parameter, and the system may be further operable to update the travel path data by adjusting the respective timing parameters of the one or more affected events.
  • The retrieved data may also include information relating to at least one of the passenger's personal preferences, an in-flight meal schedule, and an automated cabin lighting schedule.
  • The system may be further operable, when generating the data defining the dynamic event schedule, to generate auxiliary data for an event defining the associated action output. A new event for the dynamic event schedule may be determined based on the received sensor inputs. The system may be further operable to output the travel path data as an interactive interface.
  • The sensor inputs may be received from one or more of: a temperature sensor, a lighting sensor, a humidity sensor, a body movement sensor, a sleep phase sensor, an eye movement sensor, a heart rate sensor, a body temperature sensor, and an ingestible sensor.
  • In other aspects, there are provided methods of operating the systems as described above. In another aspect, there is provided a computer program comprising machine readable instructions stored thereon arranged to cause a programmable device to become configured as the systems as described above.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Specific embodiments of the invention will now be described, purely by way of example, with reference to the accompanying drawings in which:
  • FIG. 1 is a block diagram of a system according to an embodiment of the invention.
  • FIG. 2 is a perspective view of a seating unit to which the system may be applied.
  • FIG. 3 is a block diagram of a mobile device for use in the embodiments of the invention.
  • FIG. 4 is a block diagram illustrating the processing modules of the mobile device of FIG. 3 according to an embodiment of the invention.
  • FIG. 5 is a block diagram of a server according to an alternative embodiment of the invention.
  • FIG. 6, which comprises FIGS. 6A and 6B, is a flow diagram illustrating processing steps performed by the mobile device of FIG. 4 according to an embodiment.
  • FIG. 7 schematically illustrates an example of an initial view of an interactive travel path displayed by the mobile device.
  • FIG. 8, which comprises FIGS. 8A and 8B, schematically illustrates an example of a detailed view of the interactive travel path in FIG. 7.
  • FIG. 9 is a diagram of an example of a computer system for use in embodiments of the invention.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • FIG. 1 shows schematically the elements of a travel environment control system in an embodiment of the invention that relates to commercial air travel. Aspects of the invention may be applicable to other travel environments, such as trains, cars, buses, etc. At least some of the elements are optional, at least for certain applications.
  • In this embodiment, an automated system accesses customer data and environmental data relating to the local environment of a passenger, for example from sensors. On the basis of these inputs, the system is able to control the local environment according to the specific requirements of the passenger and the properties of the local environment. The system is also able to determine an event timeline for the passenger's optimal wellness, indicating events along the timeline and associated timing aspects, and dynamically adjust the event timeline in response to passenger and environment data from the sensors. The events may define automated control of the local environment. Specific examples and applications will be described below.
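  • By way of a purely illustrative, non-limiting sketch of this control flow, the following Python outline shows one way the data retrieval, event timeline planning, sensor-driven adjustment and action outputs could fit together; the function names, field names and toy values (plan_timeline, adjust_timeline, control_step, etc.) are assumptions for illustration only and do not denote any specific module described herein.

      # Non-limiting sketch of the control flow described above; all names are assumptions.

      def plan_timeline(customer: dict, flight: dict) -> list:
          # Placeholder planner: a real system would apply wellness rules such as those in Table 1.
          return [{"event": "sleep", "minute": 60}, {"event": "wake", "minute": 420}]

      def adjust_timeline(timeline: list, readings: dict) -> list:
          # Placeholder adjustment: a real system would reschedule events affected by the readings.
          return timeline

      def control_step(customer: dict, flight: dict, readings: dict, minute: int) -> list:
          """One polling step: plan the timeline, adjust it from sensor data, return events now due."""
          timeline = adjust_timeline(plan_timeline(customer, flight), readings)
          return [event for event in timeline if event["minute"] == minute]

      # Toy usage with invented data
      due = control_step({"name": "A. Passenger"}, {"flight": "XX0001"},
                         {"cabin_temp_c": 23.5, "heart_rate_bpm": 62}, minute=60)
      print(due)  # [{'event': 'sleep', 'minute': 60}]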
  • In the system 1 shown in FIG. 1, a server 3, located on board an aircraft, is in communication with one or more in-cabin networks 5, which connect the server 3 to a passenger's mobile device 7 running a travel app 9, one or more sensors 11, a seat controller 13 and an environment controller 15. An IFE (In-Flight Entertainment) unit 17 may also be connected to the server 3. The in-cabin network 5 may be a wired or wireless network, and may include one or more ad-hoc data connections between nodes in the network, such as the server 3 and the passenger's mobile device 7. The server 3 may also be in communication with a crew mobile device 19 via the in-cabin network(s) 5. The crew mobile device 19 may run a crew app 21, as described in more detail below. As another example, the in-cabin network 5 may include a mesh network, as is known in the art, with mesh nodes formed by one or more of the passenger's mobile device 7 and the crew mobile devices 19, as well as sensors 11, seat controllers 13 and environment controllers 15 adapted for ad-hoc network connectivity. It will be appreciated that such a mesh network can typically also include one or more mesh routers that forward data to and from one or more gateways, which may optionally be connected to the Internet.
  • The server 3 has access to customer data 23 and flight data 25, for example from one or more local databases and/or remote databases 26 accessible via one or more external data networks 28, such as the Internet. The customer data 23 may comprise customer biometric details such as age, gender, height, weight, etc., health status, and personal preferences, such as dietary requirements, sleeping habits, etc. The customer data 23 may be provided by customer input, for example within a travel app 9 running on a passenger's mobile device 7 or via a web-based interface, or may be provided from a user profile within another service with which the user is registered, such as a social network. The customer data 23 may also comprise location data relating to the current or last-known geographical location of the passenger or the passenger's mobile device 7, for example from location tracking information received from the passenger's mobile device 7. Instead or additionally, the customer data 23 and flight data 25 may be stored on the passenger's mobile device 7, and may be updated when the travel app 9 is connected to the server 3.
  • The flight data 25 is linked to the customer data 23 and includes data relating to flights that the customer has booked, checked in for, or boarded. The flight data 25 includes the timing and duration of the flight, as well as the departure and arrival points of the flight, and information of any connecting flights. The flight data 25 may also include information associated with in-flight aspects, such as meal and/or cabin lighting schedules for the specific flight, as well as information associated with offers for the customer, such as available flight upgrades.
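  • Purely by way of illustration, the customer data 23 and flight data 25 might be represented as simple records such as the following Python sketch; the class names, field names and example values are assumptions and do not reflect any particular schema used by the system.

      """Illustrative records for customer data and flight data; all names are assumptions."""
      from dataclasses import dataclass, field
      from typing import List, Optional

      @dataclass
      class CustomerData:
          customer_id: str
          age: Optional[int] = None
          gender: Optional[str] = None
          height_cm: Optional[float] = None
          weight_kg: Optional[float] = None
          dietary_requirements: List[str] = field(default_factory=list)
          sleeping_habits: Optional[str] = None
          last_known_location: Optional[tuple] = None   # (latitude, longitude)

      @dataclass
      class FlightData:
          flight_number: str
          departure_airport: str
          arrival_airport: str
          departure_time_utc: str        # ISO 8601 string, e.g. "2015-03-24T15:05:00Z"
          duration_minutes: int
          meal_schedule: List[int] = field(default_factory=list)       # minutes after take-off
          lighting_schedule: List[int] = field(default_factory=list)   # minutes after take-off
          connecting_flights: List[str] = field(default_factory=list)

      passenger = CustomerData("C123", age=42, dietary_requirements=["vegetarian"])
      flight = FlightData("XX0001", "LHR", "LAX", "2015-03-24T15:05:00Z", 660,
                          meal_schedule=[45, 540])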
  • In this embodiment, the travel app 9 on the passenger's mobile device 7 generates and outputs a dynamic event timeline based on the customer data 23 and flight data 25, and enables passenger interaction with the dynamic event timeline. The travel app 9 can also output control signals or messages to the seat controller 13, the environment controller 15 and/or the crew mobile device 19 and crew app 21. Alternatively or additionally, the server 3 may be configured to generate data associated with the dynamic event timeline based on the customer data 23 and flight data 25, and to transmit the generated data to the passenger's mobile device 7 for display by the travel app 9. The travel app 9 may be configured to run in the background, to collect and provide information to the server 3 on an ongoing basis, and to receive and process push updates and event triggers from the server 3.
  • The seat controller 13 may automatically control, without direct user input, one or more properties of a passenger seat 31, for example as shown in FIG. 2. In this example, the passenger seat 31 comprises a reclinable seat back 33 and a seat pan 35 that moves forward as the seat back 33 reclines, under the control of the seat controller 13. Arm rests 35 a, 35 b may drop as the seat reclines, again under the control of the seat controller 13. A foot rest 37 may drop or be adjustable, under the control of the seat controller 13. The seat may be contained within a housing 39, and separated from adjacent seats by a retractable privacy screen 41, both of which afford a degree of privacy to the passenger. The IFE unit 17 is provided adjacent the seat 31.
  • The environment controller 15 may control the environment around the passenger, for example by controlling lighting 43 and/or air conditioning 45 above and/or around the passenger's seat 31.
  • The sensor(s) 11 may include one or more environmental sensors for collecting and providing data relating to aspects of the environment on board the aircraft, either locally to the passenger or within the cabin as a whole, such as:
      • temperature sensor(s) for sensing the air temperature within the cabin;
      • lighting sensor(s) for sensing the lighting level within the cabin;
      • humidity sensor(s) for sensing the humidity within the cabin;
      • pressure sensor(s) for sensing the air pressure within the cabin;
      • noise sensor(s) for sensing the ambient noise level within the cabin; and
      • altitude sensor(s) for sensing the travelling altitude of the cabin, which may be the absolute, true, pressure or density altitude.
  • The sensor(s) 11 may also include one or more passenger sensors for collecting and providing data relating to aspects of the customer on board the aircraft, which can vary depending on the physiological state of the customer. The passenger sensors may be wearable by the passenger, separate from the passenger, or included within the passenger's mobile device 7, and include sensors such as:
      • Body movement sensors, for example using an accelerometer connected to the passenger, or using a camera;
      • Sleep phase or biorhythm sensors, for example using detected heart rate, movement, or EEG (electroencephalography);
      • Eye movement sensors, such as a camera or a dedicated eye tracking device;
      • Heart rate or blood pressure sensors;
      • External body temperature sensors;
      • Digital pills or other ingestible sensors that detect internal temperature, stomach acidity and other internal properties, and wirelessly relay this information outside the passenger's body.
  • The passenger sensors may also be configured to collect data relating to aspects of the customer before boarding the flight, for example over a predefined period of time preceding a scheduled flight and/or on the day before boarding the scheduled flight. The pre-flight collected data can be stored by the respective passenger sensors and/or provided to the passenger's mobile device 7, for use in determining and scheduling events associated with the journey segments as will be described in more detail below.
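  • As a purely illustrative sketch, readings from the environmental and passenger sensors listed above might be packaged for transmission over the in-cabin network 5 as simple messages such as the following; the field names and units are assumptions rather than a format used by any described component.

      """Illustrative sensor reading message; field names and units are assumptions."""
      import json
      import time
      from typing import Optional

      def make_reading(sensor_id: str, sensor_type: str, value: float, unit: str,
                       seat: Optional[str] = None) -> str:
          """Package one reading as a JSON message for transmission over the in-cabin network."""
          return json.dumps({
              "sensor_id": sensor_id,
              "sensor_type": sensor_type,   # e.g. "cabin_temperature", "heart_rate", "body_movement"
              "value": value,
              "unit": unit,                 # e.g. "celsius", "bpm", "lux"
              "seat": seat,                 # None for cabin-wide environmental sensors
              "timestamp": time.time(),
          })

      print(make_reading("env-12", "cabin_temperature", 23.4, "celsius"))
      print(make_reading("wrist-7", "heart_rate", 58.0, "bpm", seat="34K"))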
  • The mobile device 7,19 may be a smartphone, tablet, PDA (Personal Digital Assistant), or a wearable device such as a smart watch, an electronic wristband, or Google Glass™, etc. FIG. 3 is a schematic diagram of one such exemplary mobile device 7,19, having a processor 51, memory 53, a display screen 55, user input module 57, a location signal receiver 59 and communications interface(s) 61. The location signal receiver 59 may be a GPS based receiver for determining a geolocation of the mobile device 7,19. The mobile device 7,19 may also include one or more of: a microphone 63, one or more sensors 65, one or more sensor interfaces 67 that connect the mobile device 7, 19 to respective sensors 11, a speaker 69 and a camera 71. The travel app 9 and crew app 21 may be downloaded and installed to the memory 53 of the mobile device 7,19, and may require registration of the user with the server 3 via the app, or secure log-in to the app by an existing registered user with the server 3.
  • The server 3 may also be connected to a mobile data network (not shown), for communication with the mobile devices 7,19. However, in practice, the mobile devices 7,19 are typically required to be placed in “flight mode” when the passenger and crew are on board the aircraft, and data communication during a flight may be restricted to connections via the in-cabin network(s) 5.
  • FIG. 4 is a block diagram illustrating processing modules 81 of the passenger's mobile device 7 in the present embodiment. The passenger's mobile device 7 stores a local copy of the passenger's customer data 83 and flight data 85 that can be retrieved and processed by the processing modules 81. One or more of the processing modules 81 may be provided as integrated components of the passenger's travel app 9, or may be provided as separate mobile applications linked to the travel app 9. Additionally or alternatively, one or more of the processing modules 81 may be configured to receive data from respective modules in the server 3 for output by the passenger's travel app 9.
  • In this embodiment, the passenger's mobile device 7 includes a travel path module 81-1 for generating a travel path for a passenger having a booked and/or purchased journey, including various booked entities such as an outbound flight from the passenger's home location to a destination location, a hotel reservation at the destination location, a return flight, etc. The travel path for the passenger's booked journey may be an end-to-end plan, consisting of a plurality of journey segments from a departure point, such as the passenger's home location, to a destination point, such as a boarding gate in the departure airport terminal assigned to the passenger's flight or the location of a hotel booked in the destination city. The travel path module 81-1 processes the customer and flight data 23,25 to identify and determine the journey segments, as well as associated timing parameters, such as anticipated start time, duration, etc. The travel path module 81-1 can also determine and schedule predefined events associated with the journey segments that together define the timeline of scheduled events for the travel path. The travel path module 81-1 can also dynamically revise and update the travel path and event timeline based on the monitored geographical location of the customer together with environmental information retrieved from a plurality of data sources, for example to take into account identified disruptions to the travel path.
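  • Purely by way of illustration, a travel path of journey segments with associated timing parameters and nested events might be represented as in the following Python sketch; the class names, field names and example values are assumptions and are not intended to describe the actual data model of the travel path module 81-1.

      """Illustrative travel path structure (hypothetical names): segments with timing and nested events."""
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class PathEvent:
          name: str                  # e.g. "online_check_in", "sleep", "hotel_check_in"
          offset_minutes: int        # anticipated start, relative to the segment start

      @dataclass
      class JourneySegment:
          kind: str                  # "ground" or "in_flight"
          description: str           # e.g. "Travel to departure terminal"
          start_offset_minutes: int  # anticipated start relative to the journey start
          duration_minutes: int
          events: List[PathEvent] = field(default_factory=list)

      travel_path = [
          JourneySegment("ground", "At home / travel to terminal", 0, 180,
                         [PathEvent("online_check_in", 0)]),
          JourneySegment("in_flight", "Outbound flight", 180, 660,
                         [PathEvent("sleep", 60), PathEvent("wake", 420)]),
          JourneySegment("ground", "Arrival terminal / transfer to hotel", 840, 150,
                         [PathEvent("hotel_check_in", 120)]),
      ]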
  • The travel path module 81-1 may generate data for a graphical user interface (GUI) representation of the travel path. The generated travel path data may define an interactive graphical representation of the travel path to be displayed by the travel app 9.
  • The passenger's mobile device 7 also includes a wellness planner module 81-3 for determining and scheduling events associated with the journey segments, based on a predefined set of rules that combine the customer data, flight data and sensor data to create a personalised travel experience for the passenger. As will be described in more detail below, the wellness planner module 81-3 can also schedule predefined events associated with respective journey segments, based on factors such as passenger preferences, travel itinerary, the passenger's current physical state, etc. For example, the wellness planner module 81-3 can determine a sequence of events and associated scheduling information to assist a passenger with overcoming the effects of jet lag at the destination. The predefined events may include sleep, wake, stretch, exercise, eat, drink, stay awake, engage in-flight entertainment, etc. The predefined events may be determined for a plurality of journey segments, such as a segment of time before the scheduled flight, a segment of time in-flight, and a segment of time after arrival at the destination. Post-flight events may also be scheduled, such as treatment bookings.
  • The wellness planner module 81-3 can also dynamically adjust and/or reschedule the events, such as the event order, start time, end time and duration, in response to received sensor input data. For example, the wellness planner module 81-3 can schedule in-flight events relating to optimal sleeping, eating and exercise patterns to assist with alleviating the passenger's jet lag at the destination. The wellness planner module 81-3 can dynamically adjust the events in response to received passenger and environment data from the sensors indicating, for example, that the passenger is awake, asleep, hungry, nervous, hot, cold, uncomfortable, etc.
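  • As a purely illustrative sketch of this kind of sensor-driven adjustment, coarse wellness flags (for example asleep, too hot, too cold) might be derived from the sensor inputs as follows; the thresholds and field names are arbitrary assumptions rather than values used by the described system.

      """Illustrative, arbitrary-threshold inference of passenger state from sensor inputs."""

      def infer_state(readings: dict, preferred_temp_c: float = 22.0) -> dict:
          """Derive coarse wellness flags that a planner could use to adjust scheduled events."""
          heart_rate = readings.get("heart_rate_bpm", 70.0)
          movement = readings.get("movement_index", 0.5)       # 0 = still, 1 = very active
          local_temp = readings.get("local_temperature_c", 22.0)
          return {
              "asleep": heart_rate < 55 and movement < 0.1,
              "too_hot": local_temp > preferred_temp_c + 2.0,
              "too_cold": local_temp < preferred_temp_c - 2.0,
          }

      print(infer_state({"heart_rate_bpm": 50, "movement_index": 0.05, "local_temperature_c": 25.0}))
      # -> {'asleep': True, 'too_hot': True, 'too_cold': False}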
  • As another example, the wellness planner module 81-3 can determine a sequence of events and associated scheduling information to assist a passenger with a personalised and more comfortable in-flight experience. The predefined events may include sleep suggestion, wake-up alarm, personalised sound/audio file output, and an in-flight exercise programme. The scheduled events can also include automated commands and/or instructions to external controllers and output devices. For example, the system can output:
      • cabin crew instructions to the crew mobile device 19 via the mobile device interface module 55, such as service instructions to provide water when the passenger is determined to be dehydrated, to offer a blanket when the detected temperature is determined to be below a predefined and/or preferred threshold, or not to disturb or wake up for a scheduled meal based on the determined sleep phase of the passenger, etc.
      • local temperature control signals to the environment controller 15 via the environment controller module 81-9, and
      • seat control signals to the seat controller 13 via the seat controller module 81-7.
  • The wellness planner module 81-3 can also directly control aspects of the passenger's local environment defined by the scheduled events, such as one or more properties of the passenger seat 31 via a seat controller module 81-7 in communication with the seat controller 13 over the in-cabin network 5, and one or more properties of the environment around the passenger via an environment controller module 81-9 in communication with the environment controller 15 over the in-cabin network 5. Alternatively, the seat controller module 81-7 and the environment controller module 81-9 may be configured to communicate control instructions to the seat controller 13 and the environment controller 15, respectively, via the server 3. Optionally, the mobile device 7 can include a module to enable the passenger to control, via direct user input, properties of the passenger seat 31 and/or the environment.
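  • Purely by way of illustration, the control instructions sent to the seat controller 13 and environment controller 15 might take the form of simple messages such as the following Python sketch; the message format and field names are assumptions and are not intended to define an actual protocol of the in-cabin network 5.

      """Illustrative control messages to the seat and environment controllers; formats are assumptions."""
      import json

      def seat_control_message(seat: str, recline_deg: int, foot_rest: bool) -> str:
          # Hypothetical payload a seat controller module might send to the seat controller.
          return json.dumps({"target": "seat_controller", "seat": seat,
                             "recline_degrees": recline_deg, "foot_rest_down": foot_rest})

      def environment_control_message(seat: str, lighting_pct: int, airflow_pct: int) -> str:
          # Hypothetical payload for the environment controller above/around the seat.
          return json.dumps({"target": "environment_controller", "seat": seat,
                             "lighting_percent": lighting_pct, "airflow_percent": airflow_pct})

      # A 'sleep' event might emit both messages:
      print(seat_control_message("34K", recline_deg=170, foot_rest=True))
      print(environment_control_message("34K", lighting_pct=5, airflow_pct=40))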
  • The passenger's mobile device 7 can also include an IFE interface module 81-11 for communicating data with the IFE 17 over the in-cabin network 5. In another embodiment, the passenger's mobile device 7 can include a cockpit simulator module 81-13 for displaying an interactive cockpit simulator via the passenger's mobile device 7, for example based on input data from sensors located about the aircraft and non-sensitive information received from the aircraft's cockpit system(s).
  • FIG. 5 is a block diagram of the server 3, illustrating the processing modules of the server 3 in an alternative embodiment. In this embodiment, the server 3 includes a control system 87 that retrieves customer data 23 and flight data 25 for a plurality of registered users via respective database interfaces 88,89, and receives sensor data from sensors 11 via a sensor interface module 90. The control system 87 includes a travel path module 91-1 for determining travel paths and associated scheduled events for the plurality of passengers, based on the retrieved customer and flight data 23,25 and the received sensor data, and for determining and scheduling predefined events associated with the journey segments. The control system 87 also includes a wellness planner module 91-3 for processing the data based on specific requirements of the respective passengers and scheduling predefined events for the respective passenger's optimal wellbeing, and for automatically controlling the respective passenger's local in-flight environment according to the scheduled events.
  • The travel path module 91-1 of the control system 87 can generate data for the interactive GUI representation of the travel path, for example based on scheduling data determined by the wellness planner module 91-3. The generated travel path data is communicated to the travel app 9 on the passenger's mobile device 7, via a travel app interface module 92. The control system 87 can also output control signals associated with scheduled event actions to the seat controller 13 via a seat controller interface 93, to the environment controller 15 via an environment controller interface module 94, and to the IFE unit 17 via an IFE module 95. Feedback data and control signals may also be received from the seat controller 13, environment controller 15 and IFE unit 17 via the respective modules.
  • Additionally, the travel app interface module 92 can receive and process data in response to user input via the travel app 9, for example, to search for and retrieve flight data 25, retrieve and/or update customer data 23, book or purchase a new flight, re-book a flight at a new time and/or date, etc. Alternatively, the generated travel path data may define user-selectable elements of the travel path, associated with the scheduled events for example, for display by the travel app 9 based on one or more predefined travel path GUI templates. As yet a further alternative, the generated travel path data may consist of scheduling data elements in a structured data format, such as XML, CSV, etc.
  • Wellness Planning and Environment Control
  • A description has been given above of the components forming part of the travel environment system 1 in one embodiment. A detailed description of the operation of these components will now be given with reference to the flow diagram of FIG. 6, which comprises FIGS. 6A and 6B, for an example computer-implemented wellness planning and environment control process using the passenger's mobile device 7. Reference is also made to FIGS. 7 and 8, schematically illustrating exemplary dynamic travel paths displayed by the travel app 9 on the passenger's mobile device 7.
  • As shown in FIG. 6, the process begins at step S6-1 where the passenger's mobile device 7 loads the travel app 9, for example in response to a user command to launch the app. The travel app 9 may require the customer to login with pre-registered details. At step S6-3, the travel path module 81-1 on the mobile device 7 retrieves customer data 23 for the customer registered with the travel app 9, for example in response to a user command to display an interactive travel plan interface via the travel app 9. The retrieved customer data 23 includes information relating to the passenger's next booked journey, such as details of the outbound and return flights that are booked for the journey.
  • In this embodiment, the mobile device 7 is configured to plan and generate a travel path for the passenger's booked journey that can be displayed in an interactive travel path interface of the travel app 9. Accordingly, at step S6-5 in FIG. 6, the travel path module 81-1 retrieves flight data 25 including information relating to the passenger's next flight in the retrieved booked journey. The travel path module 81-1 may also retrieve data from additional sources, such as terminal information relating to the departure and arrival airport terminals of the passenger's next flight. At step S6-7, the travel path module 81-1 processes the retrieved data and determines a plurality of journey segments for the booked journey. For example, a booked journey between departure and destination locations can be processed into a plurality of high level journey segments, based on information relating to the outbound and return flights, such as time and date, flight number, carrier, airport, etc. At step S6-9, the travel app 9 displays an initial view of the interactive travel plan, including the retrieved information relating to the passenger's next booked journey.
  • FIG. 7 schematically illustrates one example of an initial view of the interactive travel path displayed by the travel app 9 on the passenger's mobile device 7. The high level segments of this initial view include a plurality of in-flight segments 107 corresponding to discrete time periods when the customer is on-board a respective booked flight, and intervening ground segments 109 corresponding to discrete time periods between booked flights. In this example, the travel path is presented as a scrollable ribbon interface 101, with a horizontal dynamic time axis 103 indicating the location along the ribbon corresponding to the current time 105. The ribbon interface 101 may instead be displayed in a vertical orientation. In this example, the customer data 23 includes information relating to a booked journey to Malibu, Calif., with an outbound flight departing today from London's Heathrow Airport and arriving at Los Angeles International Airport, displayed as a first raised segment 107-1 of the ribbon interface 101. The customer data 23 also includes information relating to the return flight in six weeks' time, displayed as a second raised segment 107-2, with a corresponding indication on the time axis 103.
  • The booked journey may also include details of a hotel reservation while the customer is at the destination, displayed as a lower segment 109-2 between the respective raised segments 107-1, 107-2. Similarly, a lower segment 109-1 precedes the raised segment 107-1 associated with the outbound flight, indicating that the customer was at a predefined home location, London, UK in this example. In this embodiment, the raised segments 107 correspond to in-flight segments of the passenger's booked journey and the lower segments 109 correspond to ground segments of the journey. The user can scroll the ribbon interface 101 along the horizontal axis, for example via the user input module 57, to view the passenger's past and future booked journeys. As described later, each raised segment 107 of the ribbon interface 101 may be a user-selectable element of the interface in order to retrieve and view more data relating to the associated flight. Alternatively or additionally, the ribbon interface 101 may be configured to process user input commands to zoom into the travel path at a selected position to retrieve and view more data relating to the segment 107,109 at that position, and to zoom out to return to the previous or initial view.
  • Following from the example illustrated in FIG. 7, the travel path module 81-1 can also process each high level journey segment to determine a respective plurality of lower level journey segments, and to identify one or more defined and/or anticipated geographical locations associated with each lower level journey segment. FIG. 8, which consists of FIGS. 8A and 8B, schematically illustrates an example of a zoomed-in view of the interactive travel path displayed by the travel app 9. In this example, the first high level ground segment 109-1, prior to the passenger's outbound flight segment 107-1 from London to Los Angeles, is broken down into three discrete and sequential lower level segments 111, as illustrated in the first portion 101 a of the ribbon interface in FIG. 8A. The first lower level segment 111-1 is associated with a discrete time period of the travel path when the customer is at a predefined home location, for example the passenger's home city or home address. The second lower level segment 111-2 is associated with the subsequent time period of the travel path when the customer is, or should be, travelling to the departure airport terminal. The third lower level segment 111-3 is associated with the subsequent time period of the travel path when the customer is in the airport terminal.
  • Similarly, the high level ground segment 109-2 after the passenger's outbound flight segment 107-1 from London to Los Angeles is also broken down into three lower level segments 113, as illustrated in the second portion 101 b of the ribbon interface in FIG. 8B. However, in this ground segment, the first lower level segment 113-1 is associated with the time period of the travel path when the customer is in the destination airport terminal, the second lower level segment 113-2 is associated with the subsequent time period when the customer will be travelling to the hotel in the destination city, and the third lower level segment 113-3 is associated with the subsequent time period when the customer arrives at the hotel. Each journey segment is associated with a respective time or time period along the time axis 103, which may be calculated relative to the current time 105, based on the retrieved and processed data.
  • The travel path module 81-1 can also determine one or more events for respective journey segments. Each event is also associated with a time or time period along the time axis 103, which may be calculated relative to the current time 105, based on the retrieved data. For example, as illustrated in FIG. 8, which comprises FIGS. 8A and 8B, the in-flight segment 107-1 includes a sequence of predefined events 121 that are scheduled at respective times during the flight, such as a welcome drink event 121-1 shortly after boarding or take-off, a first dining event 121-2, a sleep event 121-3, a wake event 121-4, an IFE event 121-5 after the passenger has woken up, and a second dining event 121-6.
  • Events 115 can also be determined for the ground segments 109. For example, the first lower level segment 111-1 of the first ground segment 109-1 includes an event 115-1 associated with an earliest possible and/or recommended time for the customer to proceed with online check-in for the outbound flight. The second lower level segment 111-2 of the first ground segment 109-1 includes an event 115-2 associated with a recommended route to the departure airport terminal, for example as determined by the travel path module 81-1 or by the travel app 9 based on the passenger's current geo-location 29. The third lower level segment 111-3 of the first ground segment 109-1 includes one or more events 115-3 associated with respective navigation stages that the customer must progress through in the departure airport terminal, such as bag drop, passport control and security, before arriving at the departure gate assigned to the outbound flight.
  • Similarly, a plurality of events 115 are determined for the second ground segment 109-2. The first lower level segment 113-1 of the second ground segment 109-2 includes one or more events 115-4 associated with respective stages that the customer must progress through the arrival airport terminal, such as the arrival gate assigned to the flight, passport control and the baggage reclaim belt or area assigned to the flight. The second lower level segment 113-2 of the second ground segment 109-2 includes an event 115-5 associated with a recommended route from the arrival airport terminal to the hotel at the destination. The third lower level segment 113-3 of the second ground segment 109-2 includes an event 115-6 associated with an anticipated time of check-in at the hotel, for example as calculated by the travel path module 81-1.
  • Accordingly, referring back to FIG. 6, at step S6-11 the travel path module 81-1 can retrieve data from one or more third-party data sources, such as traffic, public transport, weather and airport terminal data, and process the retrieved data to determine and schedule the plurality of predefined events for the ground segments 109 of the passenger's journey. The determination of events that can be scheduled for the passenger's journey may depend on the availability of data from the third-party data sources for the geographical locations along the travel path.
  • At step S6-13, the wellness planner module 81-3 retrieves passenger and flight details from customer data 83 and flight data 85, such as information relating to the passenger's registered details and personal preferences, travel itinerary, meal schedule, cabin lighting schedule, etc. At step S6-15, the wellness planner module 81-3 receives sensor data from one or more sensors 11 via the sensor interface module 81-5. The received sensor data will vary depending on the availability of sensors associated with the passenger and/or passenger's local environment. At step S6-17, the wellness planner module 81-3 determines and schedules one or more predefined events for the in-flight journey segment 107-1, based on the retrieved passenger details from customer data 83, flight details from flight data 85, and the received sensor data.
  • At step S6-19, the wellness planner module 81-3 identifies events that are associated with one or more predefined actions, and generates auxiliary data for the identified events that can be displayed or transmitted by the travel app 9 in response to a user command to select the respective event 115,121 from the ribbon interface 101 a. For example, auxiliary data can be generated for the sleep event 121-3 illustrated in FIG. 8A, defining control instructions that are transmitted to the seat controller 13 to recline the seat to a sleeping position, as well as control instructions that are transmitted to the environment controller 15 to dim the lights, in response to user input selection of the event 121-3 from the interactive display. As another example, auxiliary data can be generated to suggest and guide the passenger through one or more exercise routines, based on sensor input feedback relating to the passenger's body movements and heart rate.
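  • As a purely illustrative sketch, the auxiliary data generated for a sleep event might bundle the associated action outputs so that they can be dispatched when the event is selected or triggered; the keys, values and helper names below are assumptions and do not define the format actually used by the wellness planner module 81-3.

      """Illustrative auxiliary data for a 'sleep' event; keys, values and helper names are assumptions."""
      sleep_event_auxiliary = {
          "event": "sleep",
          "on_select": [
              {"target": "seat_controller", "command": "recline", "position": "sleeping"},
              {"target": "environment_controller", "command": "set_lighting", "level_percent": 5},
          ],
          "crew_note": "Do not disturb for scheduled meal while asleep",
      }

      def execute(aux: dict, send) -> None:
          """Dispatch the event's action outputs when the passenger selects it in the travel app."""
          for instruction in aux["on_select"]:
              send(instruction)   # e.g. transmit over the in-cabin network

      execute(sleep_event_auxiliary, send=print)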
  • Table 1 sets out a number of exemplary data and sensor input parameters that may be processed by the wellness planner module 81-3 to determine and schedule one or more respective event outputs. It will be appreciated that many other combinations of predefined data inputs, environment sensor data inputs, and passenger sensor data inputs that vary with the passenger's physiological state can be used to determine and trigger system responses and events; an illustrative encoding of such rules is sketched after Table 1 below.
  • TABLE 1
    Predefined Data Input(s) | Environment Sensor Input(s) | Passenger Sensor Input(s) | Event Output(s)
    Biometric data, travel itinerary, personal preferences | (none) | Body movements, heart rate | Sleep/wake event, seat control, lighting control
    Biometric data, personal preferences | (none) | Body movements | Seat control
    Biometric data, meal schedule | Cabin lighting, ambient noise, local temperature | Body movements, heart rate | Wake suggestion event, seat control, lighting control
    Biometric data, meal schedule | Local temperature | Body movements, heart rate | Drink/Meal suggestion event, cabin crew instructions
    Biometric data, personal preferences | Local temperature, humidity, cabin pressure | (none) | Air conditioning control
    Biometric data, travel itinerary, personal preferences | Air pressure, altitude | Body movements, heart rate | Exercise suggestion/routine event
    Biometric data, travel itinerary, personal preferences | Ambient temperature | Body temperature, stomach acidity | Post-flight treatment booking, sleep/meal management suggestion events
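  • Purely by way of illustration, rules of the kind set out in Table 1 might be encoded as a list of condition/output pairs evaluated against the latest sensor readings, as in the following Python sketch; the predicates, thresholds and output names are arbitrary assumptions chosen for the example only.

      """Illustrative encoding of Table 1-style rules as (condition, outputs) pairs; predicates are arbitrary."""
      from typing import Callable, Dict, List, Tuple

      Rule = Tuple[Callable[[Dict[str, float]], bool], List[str]]

      RULES: List[Rule] = [
          # Drink/meal suggestion when the local temperature is high and heart rate elevated.
          (lambda r: r.get("local_temperature_c", 22) > 26 and r.get("heart_rate_bpm", 70) > 80,
           ["drink_suggestion_event", "cabin_crew_instruction"]),
          # Air conditioning control when humidity is low and the cabin is warm.
          (lambda r: r.get("humidity_percent", 40) < 15 and r.get("local_temperature_c", 22) > 24,
           ["air_conditioning_control"]),
          # Exercise suggestion when the passenger has been still for a long period.
          (lambda r: r.get("minutes_without_movement", 0) > 120,
           ["exercise_suggestion_event"]),
      ]

      def evaluate(readings: Dict[str, float]) -> List[str]:
          """Return the event outputs whose conditions are met by the given readings."""
          outputs: List[str] = []
          for condition, event_outputs in RULES:
              if condition(readings):
                  outputs.extend(event_outputs)
          return outputs

      print(evaluate({"local_temperature_c": 27, "heart_rate_bpm": 85, "minutes_without_movement": 130}))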
  • For example, a passenger's mobile device 7, such as a smart watch with a temperature sensor, may transmit data identifying the passenger's recorded body temperature to a processing node, such as the server 3 or a cabin crew's mobile device 19, via the in-cabin network 5. The processing node may then determine a temperature adjustment for that passenger based on the received temperature data, and in response, transmit control instructions to the cabin environment controller 15 to effect a change of environment temperature automatically. Alternatively, the crew app 21 on the crew mobile device 19 may prompt the cabin crew to manually adjust the temperature controls based on the calculated needs of the passenger. The processing node may instead, or additionally, be configured to receive temperature data generated from a plurality of passenger devices and sensors located in a particular cabin, along with input data from the associated passengers indicative of a vote on the desired cabin temperature, and to determine environment temperature adjustments based on the group of passenger votes. The passenger may also be informed of the changes to the travel environment and the aggregate body temperature, for example via data received and displayed by their associated mobile device 7 and/or the IFE unit 17.
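  • As a purely illustrative sketch of such group-based temperature control, passenger votes and body-temperature readings might be aggregated into a bounded adjustment of the cabin set-point as follows; the weighting, thresholds and function names are assumptions and not a description of how the processing node actually decides.

      """Illustrative aggregation of passenger temperature votes into a cabin set-point; weights are arbitrary."""
      from statistics import median

      def cabin_setpoint(votes_c: list, body_temps_c: list, current_setpoint_c: float,
                         max_step_c: float = 1.0) -> float:
          """Move the set-point toward the median voted temperature, nudged by body-temperature data."""
          target = median(votes_c) if votes_c else current_setpoint_c
          if body_temps_c and median(body_temps_c) > 37.2:   # passengers running warm
              target -= 0.5
          step = max(-max_step_c, min(max_step_c, target - current_setpoint_c))
          return round(current_setpoint_c + step, 1)

      print(cabin_setpoint([21.0, 22.5, 23.0], [36.9, 37.4, 37.5], current_setpoint_c=24.0))  # -> 23.0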
  • As another example, where the passenger's mobile device 7 is monitoring his or her biorhythms, the system 1 may use data received from the passenger's mobile device 7 to detect or determine when the passenger is about to sleep or wake and, in response, send control instructions to adjust the lighting around the passenger, as well as inform a crew member to prepare a beverage service for when the passenger awakes, or not to disturb the passenger when asleep. Additionally, the system 1 may be configured to detect the passenger's emotional and physical state, and in response, determine event output(s) and indicate to the cabin crew to attend to a nervous or stressed passenger's needs appropriately.
  • As those skilled in the art will appreciate, the system may be further configured to retrieve and process historical data associated with the passenger, travel environment, and/or journey. For example, the server can be configured to retrieve recorded sensor and/or user input data that has been previously collected, outside of the travel environment, and stored on the passenger's mobile device or accessible from a remote server via an API. The received historical data may then be used along with sensor data collected within the travel environment to make an optimal decision on one or more changes to the environment based on the passenger's detected or determined wellness. For example, historical data associated with the passenger, such as previous geo-locations and health data collected by the passenger's mobile device over a period of time, may be retrieved and processed by the server, together with received real-time sensor inputs, in order to determine or modify a personalised programme or event schedule to aid with combating travel fatigue.
  • The travel path module 81-1 can also be configured to generate auxiliary data for events associated with the ground segments 109, such as information relating to specific navigation, routing and timing, for example to and within the airport terminal. As yet another example, auxiliary data may include a link to a website or an external mobile app, such as a flight online check-in website, a hotel website or app with information relating to the hotel reservation, a public transport website or app with additional route, time and map information, a dedicated map website or app, etc.
  • At step S6-21, the travel app 9 displays the detailed view of the generated travel path for the current journey in the interactive user interface, including the user-selectable events 115,121 associated with the journey segments 111,113 of the travel path. At step S6-23, the travel app 9 receives and processes user interactions with the travel path interface 101 and user-selectable events 115,121 of the interface, for example to handle user commands to scroll and/or zoom the displayed portion of the travel path, and in response, retrieves and executes the one or more actions associated with a user selected event 115,121, at step S6-25 as necessary. At step S6-27, the travel app 9 also monitors the scheduled events 115,121 to identify events that are scheduled for action at the current time, and automatically retrieves and executes the one or more actions associated with any identified scheduled event 115,121, at step S6-29 as necessary.
  • In this embodiment, the system is also configured to dynamically adjust the travel path and scheduled events in response to received sensor data. For example, the wellness planner module 81-3 can receive, at step S6-31, information from sensors 11 relating to an increased heart rate, indicative of the passenger waking up from a sleep cycle before the scheduled wake event. In response, the wellness planner module 81-3 determines and reschedules one or more affected events 121 for the in-flight segment 107 of the passenger's booked journey, at step S6-33 as necessary. Following from the example of the passenger waking up early, the affected events 121 can include the wake event 121-4 and the IFE event 121-5 that are brought forward along the timeline, by adjustment of the respective timing parameters. In this way, the passenger can be prompted to accept the wake event 121-4, and in response, the system can automatically adjust the passenger's seat 31, lighting 43 and air-conditioning 45 based on the passenger's preferences. Additionally, the wellness planner module 81-3 can determine one or more new events 121 to be inserted into the event timeline, such as suggested refreshments that can be ordered by the passenger and automatically transmitted to the crew app 21.
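  • Purely by way of illustration, the rescheduling of affected events on early waking might be expressed as in the following Python sketch, which brings the wake and IFE events forward and inserts a refreshment suggestion; the event names, heart-rate threshold and data structures are assumptions for the example only.

      """Illustrative rescheduling of affected events when the passenger wakes early; names and thresholds assumed."""
      from dataclasses import dataclass
      from typing import List

      @dataclass
      class TimedEvent:
          name: str
          start_minute: int   # minutes after take-off

      def reschedule_on_early_wake(events: List[TimedEvent], now_minute: int,
                                   heart_rate_bpm: float) -> List[TimedEvent]:
          """Bring the wake and IFE events forward, and insert a refreshment suggestion."""
          if heart_rate_bpm <= 80:
              return events
          shift = None
          for ev in events:
              if ev.name == "wake" and ev.start_minute > now_minute:
                  shift = ev.start_minute - now_minute
                  break
          if shift is None:
              return events
          for ev in events:
              if ev.name in ("wake", "ife") and ev.start_minute > now_minute:
                  ev.start_minute -= shift
          return events + [TimedEvent("refreshment_suggestion", now_minute + 10)]

      timeline = [TimedEvent("sleep", 60), TimedEvent("wake", 420), TimedEvent("ife", 440)]
      for ev in reschedule_on_early_wake(timeline, now_minute=390, heart_rate_bpm=95):
          print(ev)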
  • At step S6-35, the wellness planner module 81-3 updates the travel path and associated scheduling data based on the identified and rescheduled events 121. At step S6-37, the wellness planner module 81-3 generates or updates auxiliary data for any new and affected events, for example including options for the passenger to override the automatically rescheduled event. At step S6-39, the travel app 9 displays the updated travel path and the process returns to step S6-23 where the travel app 9 continues to monitor and respond to user interactions, sensor inputs and/or further scheduled events.
  • Computer System
  • The system described herein may comprise a computer system 600 as shown in FIG. 9. Embodiments of the present invention may be implemented as programmable code for execution by the computer system 600. Various embodiments of the invention are described in terms of this example computer system 600. After reading this description, it will become apparent to a person skilled in the art how to implement the invention using other computer systems and/or computer architectures.
  • Computer system 600 includes one or more processors, such as processor 604. Processor 604 may be any type of processor, including but not limited to a special purpose or a general-purpose digital signal processor. Processor 604 is connected to a communication infrastructure 606 (for example, a bus or network). Computer system 600 also includes a main memory 608, preferably random access memory (RAM), and may also include a secondary memory 610. Secondary memory 610 may include, for example, a hard disk drive 612 and/or a removable storage drive 614, representing a floppy disk drive, a magnetic tape drive, an optical disk drive, etc. Removable storage drive 614 reads from and/or writes to a removable storage unit 618 in a well-known manner. Removable storage unit 618 represents a floppy disk, magnetic tape, optical disk, etc., which is read by and written to by removable storage drive 614. As will be appreciated, removable storage unit 618 includes a computer usable storage medium having stored therein computer software and/or data.
  • In alternative implementations, secondary memory 610 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 600. Such means may include, for example, a removable storage unit 622 and an interface 620. Examples of such means may include a program cartridge and cartridge interface (such as that previously found in video game devices), a removable memory chip (such as an EPROM, or PROM, or flash memory) and associated socket, and other removable storage units 622 and interfaces 620 which allow software and data to be transferred from removable storage unit 622 to computer system 600. Alternatively, the program may be executed and/or the data accessed from the removable storage unit 622, using the processor 604 of the computer system 600.
  • Computer system 600 may also include a communication interface 624. Communication interface 624 allows software and data to be transferred between computer system 600 and external devices. Examples of communication interface 624 may include a modem, a network interface (such as an Ethernet card), a communication port, a Personal Computer Memory Card International Association (PCMCIA) slot and card, etc. Software and data transferred via communication interface 624 are in the form of signals 628, which may be electronic, electromagnetic, optical, or other signals capable of being received by communication interface 624. These signals 628 are provided to communication interface 624 via a communication path 626. Communication path 626 carries signals 628 and may be implemented using wire or cable, fibre optics, a phone line, a wireless link, a cellular phone link, a radio frequency link, or any other suitable communication channel. For instance, communication path 626 may be implemented using a combination of channels.
  • The terms “computer program medium” and “computer usable medium” are used generally to refer to media such as removable storage drive 614, a hard disk installed in hard disk drive 612, and signals 628. These computer program products are means for providing software to computer system 600. However, these terms may also include signals (such as electrical, optical or electromagnetic signals) that embody the computer program disclosed herein.
  • Computer programs (also called computer control logic) are stored in main memory 608 and/or secondary memory 610. Computer programs may also be received via communication interface 624. Such computer programs, when executed, enable computer system 600 to implement the present invention as discussed herein. Accordingly, such computer programs represent controllers of computer system 600. Where the invention is implemented using software, the software may be stored in a computer program product and loaded into computer system 600 using removable storage drive 614, hard disk drive 612, or communication interface 624, to provide some examples.
  • In alternative embodiments, the invention can be implemented as control logic in hardware, firmware, or software or any combination thereof. The apparatus may be implemented by dedicated hardware, such as one or more application-specific integrated circuits (ASICs) or appropriately connected discrete logic gates. A suitable hardware description language can be used to implement the method described herein with dedicated hardware.
  • Alternative Embodiments
  • The embodiments described above are illustrative of, rather than limiting to, the present invention. Alternative embodiments apparent on reading the above description may nevertheless fall within the scope of the invention.
  • For example, in the embodiment described above, the server determines the travel path and associated scheduling data for display by the travel app. It will be appreciated that in an alternative embodiment, some of the processing steps performed by the travel path generator module and/or the planning sub-module in the above embodiment can instead or additionally be performed by the processing modules of the passenger's travel app. For example, the dynamic travel path module on the mobile device can be configured to generate data identifying the sequence of journey segments and associated events and to determine scheduling information relating to estimations of timing and duration. In this alternative, the mobile device would not need to communicate with the server in order to monitor the passenger and environment input data, dynamically adjust the travel path, and/or control the local environment. As yet a further alternative, the server may not include a travel path module and wellness planner module.
  • In the embodiment described above, the travel path is displayed by the travel app on the passenger's mobile device. In an alternative embodiment, the travel path and scheduling data may be automatically transmitted to the passenger's IFE unit.
  • In the embodiment described above, a server is configured to determine travel paths and associated scheduled events for the plurality of passengers, based on retrieved customer and travel data and the received sensor data, and to determine, schedule and control predefined events associated with the journey segments. As those skilled in the art will appreciate, one or more nodes of the in-cabin network may instead be configured to carry out the functionalities described in the above embodiment for dynamic travel event scheduling based on determined passenger wellness. For example, the cabin crew mobile devices may be configured as distributed, or peer-to-peer (P2P), computing nodes that communicate and coordinate actions by passing data messages over the in-cabin network. The cabin crew mobile devices may receive environment and passenger sensor input data from sensors and device nodes on the network, and transmit control instructions to seat and environment controller nodes on the network.
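  • As a purely illustrative sketch of such a peer-to-peer arrangement, a crew device node might route sensor messages received over the in-cabin network to controller nodes as follows; the message fields and routing rule are assumptions rather than a description of an actual protocol.

      """Illustrative peer-to-peer message handling on a crew device node; message fields are assumptions."""
      import json

      def handle_message(raw: str, send) -> None:
          """Route a sensor message received over the in-cabin network to a controller node."""
          msg = json.loads(raw)
          if msg.get("sensor_type") == "heart_rate" and msg.get("value", 0) > 90:
              # Forward a control instruction to the environment controller node for that seat.
              send(json.dumps({"target": "environment_controller", "seat": msg.get("seat"),
                               "command": "increase_airflow"}))

      handle_message('{"sensor_type": "heart_rate", "value": 95, "seat": "34K"}', send=print)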

Claims (23)

What is claimed is:
1. A system for dynamic travel event scheduling, comprising one or more processors configured to:
retrieve stored data including information relating to a passenger's itinerary, the itinerary including at least one scheduled journey;
generate data defining a dynamic event schedule based on the retrieved data, the dynamic event schedule including at least one event associated with at least one action output; and
during the at least one scheduled journey:
receive one or more sensor inputs providing information on the physiological state of the passenger and/or environmental conditions in the vicinity of the passenger;
identify one or more affected events of the dynamic event schedule based on the received sensor inputs; and
provide one or more action outputs to control the passenger's travel environment based on the at least one event.
2. The system of claim 1, wherein the outputs to control the passenger's travel environment comprise one or more of signals to: control one or more properties of a passenger seat, and control lighting and/or air conditioning above and/or around the passenger's seat.
3. The system of claim 1, wherein the at least one event is selected from a set of predefined events including: sleep, wake, stretch, exercise, eat, drink, stay awake, and engage in-flight entertainment.
4. The system of claim 3, wherein the sleep and wake events are associated with respective action outputs to automatically control a recline position of the passenger's seat and a lighting level above or around the passenger's seat.
5. The system of claim 1, wherein each scheduled event is associated with a respective timing parameter, and wherein the system is further operable to update the travel path data by adjusting respective timing parameters of the one or more affected events.
6. The system of claim 1, wherein the retrieved data further includes information relating to at least one of the passenger's personal preferences, an in-flight meal schedule, and an automated cabin lighting schedule.
7. The system of claim 1, wherein generating the data defining the dynamic event schedule further comprises generating auxiliary data for an event, the auxiliary data defining the associated action output.
8. The system of claim 1, wherein the one or more processors are further configured to determine a new event for the dynamic event schedule based on the received sensor inputs.
9. The system of claim 1, wherein the one or more processors are further configured to output the travel path data as an interactive interface.
10. The system of claim 1, wherein the sensor inputs providing information on the environmental conditions in the vicinity of the passenger are received from one or more of: temperature sensor(s), lighting sensor(s), humidity sensor(s), noise sensor(s), and altitude sensor(s).
11. The system of claim 1, wherein the sensor inputs providing information on the physiological state of the passenger are received from one or more of: a body movement sensor, a sleep phase sensor, an eye movement sensor, a heart rate sensor, a body temperature sensor and an ingestible sensor.
12. A method of dynamic travel event scheduling, comprising:
retrieving stored data including information relating to a passenger's itinerary, the itinerary including at least one scheduled journey;
generating data defining a dynamic event schedule based on the retrieved data, the dynamic event schedule including at least one event associated with at least one action output; and
during the at least one scheduled journey:
receiving one or more sensor inputs providing information on the physiological state of the passenger and/or environmental conditions in the vicinity of the passenger;
identifying one or more affected events of the dynamic event schedule based on the received sensor inputs; and
providing one or more action outputs to control the passenger's travel environment based on the at least one event.
13. The method of claim 12, wherein the outputs to control the passenger's travel environment comprise one or more of: signals to control one or more properties of a passenger seat, and signals to control lighting and/or air conditioning above and/or around the passenger's seat.
14. The method of claim 12, wherein the at least one event is selected from a set of predefined events including: sleep, wake, stretch, exercise, eat, drink, stay awake, and engage in-flight entertainment.
15. The method of claim 14, wherein the sleep and wake events are associated with respective action outputs to automatically control a recline position of the passenger's seat and a lighting level above or around the passenger's seat.
16. The method of claim 12, wherein each scheduled event is associated with a respective timing parameter and wherein the method further comprises updating the travel path data by adjusting respective timing parameters of the one or more affected events.
17. The method of claim 12, wherein the retrieved data further includes information relating to at least one of the passenger's personal preferences, an in-flight meal schedule, and an automated cabin lighting schedule.
18. The method of claim 12, wherein generating the data defining the dynamic event schedule further comprises generating auxiliary data for an event, the auxiliary data defining the associated action output.
19. The method of claim 12, further comprising determining a new event for the dynamic event schedule based on the received sensor inputs.
20. The method of claim 12, further comprising outputting the travel path data as an interactive interface.
21. The method of claim 12, wherein the sensor inputs providing information on the environmental conditions in the vicinity of the passenger are received from one or more of: temperature sensor(s), lighting sensor(s), humidity sensor(s), noise sensor(s), and altitude sensor(s).
22. The method of claim 12, wherein the sensor inputs providing information on the physiological state of the passenger are received from one or more of: a body movement sensor, a sleep phase sensor, an eye movement sensor, a heart rate sensor, a body temperature sensor and an ingestible sensor.
23. A non-transitory computer-readable medium comprising machine executable instructions stored thereon that, when executed, perform a method in accordance with claim 12.
US16/710,932 2014-03-24 2019-12-11 Travel Environment Control Abandoned US20200118083A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/710,932 US20200118083A1 (en) 2014-03-24 2019-12-11 Travel Environment Control

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
GB1405201.3A GB2524495C (en) 2014-03-24 2014-03-24 Travel environment control
GB1405201.3 2014-03-24
PCT/GB2015/050882 WO2015145142A1 (en) 2014-03-24 2015-03-24 Travel environment control
US201615128946A 2016-09-23 2016-09-23
US16/710,932 US20200118083A1 (en) 2014-03-24 2019-12-11 Travel Environment Control

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/GB2015/050882 Continuation WO2015145142A1 (en) 2014-03-24 2015-03-24 Travel environment control
US15/128,946 Continuation US10546274B2 (en) 2014-03-24 2015-03-24 Travel environment control

Publications (1)

Publication Number Publication Date
US20200118083A1 true US20200118083A1 (en) 2020-04-16

Family

ID=50686773

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/128,946 Active 2036-03-04 US10546274B2 (en) 2014-03-24 2015-03-24 Travel environment control
US16/710,932 Abandoned US20200118083A1 (en) 2014-03-24 2019-12-11 Travel Environment Control

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/128,946 Active 2036-03-04 US10546274B2 (en) 2014-03-24 2015-03-24 Travel environment control

Country Status (5)

Country Link
US (2) US10546274B2 (en)
EP (1) EP3123412A1 (en)
GB (2) GB2524495C (en)
HK (1) HK1215699A1 (en)
WO (1) WO2015145142A1 (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015105617B4 (en) 2015-04-13 2024-07-11 Recaro Aircraft Seating Gmbh & Co. Kg System for controlling an aircraft passenger seat unit
US10275587B2 (en) 2015-05-14 2019-04-30 Alclear, Llc Biometric ticketing
US9721081B2 (en) 2015-05-14 2017-08-01 Alclear, Llc Physical token-less security screening using biometrics
US10760809B2 (en) 2015-09-11 2020-09-01 Johnson Controls Technology Company Thermostat with mode settings for multiple zones
US20170074536A1 (en) 2015-09-11 2017-03-16 Johnson Controls Technology Company Thermostat with near field communication features
US10655881B2 (en) 2015-10-28 2020-05-19 Johnson Controls Technology Company Thermostat with halo light system and emergency directions
US10332039B2 (en) * 2016-08-17 2019-06-25 International Business Machines Corporation Intelligent travel planning
US10757451B2 (en) * 2017-02-16 2020-08-25 Thales Avionics, Inc. User centric service and content curation through in-flight entertainment system
US20180335776A1 (en) * 2017-05-16 2018-11-22 GM Global Technology Operations LLC Systems and methods for selecting driving modes in autonomous vehicles
CN107550128A (en) * 2017-08-28 2018-01-09 京东方科技集团股份有限公司 A kind of Intelligent seat and its control method
US10513339B2 (en) * 2017-08-30 2019-12-24 The Boeing Company Aircraft cabin climate control using data from mobile electronic devices
EP3460731A1 (en) * 2017-09-21 2019-03-27 Wolfgang Hildebrand Method and system for reducing the fuel consumption of passenger planes in an air traffic system
US10991138B2 (en) * 2017-12-22 2021-04-27 The Boeing Company Systems and methods for in-flight virtual reality displays for passenger and crew assistance
US10785340B2 (en) * 2018-01-25 2020-09-22 Operr Technologies, Inc. System and method for a convertible user application
US10765325B2 (en) * 2018-02-06 2020-09-08 The Boeing Company Passenger comfort system
JP7234496B2 (en) * 2018-02-06 2023-03-08 トヨタ自動車株式会社 Passenger support device
US11847548B2 (en) * 2018-02-23 2023-12-19 Rockwell Collins, Inc. Universal passenger seat system and data interface
US11107390B2 (en) 2018-12-21 2021-08-31 Johnson Controls Technology Company Display device with halo
JP7251975B2 (en) * 2018-12-27 2023-04-04 川崎重工業株式会社 Skin cooling system
US10929475B2 (en) 2019-02-25 2021-02-23 International Business Machines Corporation Facilitating a collaboration experience using social networking information
DE102019115203A1 (en) * 2019-06-05 2020-12-10 Recaro Aircraft Seating Gmbh & Co. Kg Seating system
US11203433B2 (en) * 2019-07-30 2021-12-21 B/E Aerospace, Inc. Predictive aircraft passenger cabin service system and method
US11169664B2 (en) 2019-10-25 2021-11-09 Panasonic Avionics Corporation Interactive mapping for passengers in commercial passenger vehicle
GB2590406B (en) * 2019-12-16 2024-04-03 Safran Seats Gb Ltd An aircraft passenger suite
FR3107508B1 (en) * 2020-02-21 2023-09-01 Stelia Aerospace System for managing the sleep and/or well-being of an aircraft passenger and method for implementing such a system
US11505325B2 (en) * 2020-09-29 2022-11-22 Ami Industries, Inc. Automatic ejection seat performance and accommodation optimization based on connector
US11518527B2 (en) 2020-09-29 2022-12-06 Ami Industries, Inc. Automatic ejection seat performance optimization based on detection of aircrew weight
US11891183B2 (en) 2020-09-29 2024-02-06 Ami Industries, Inc. Automatic ejection seat performance and accomodation optimization based on active input of aircrew data
US11518528B2 (en) * 2020-09-29 2022-12-06 Ami Industries, Inc. Automatic ejection seat performance and accommodation optimization based on passive detection of aircrew data
US20220371605A1 (en) * 2021-05-21 2022-11-24 At&T Intellectual Property I, L.P. Carrier transport vehicle personal sensor zone

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9425078D0 (en) * 1994-12-13 1995-02-08 British Airways Plc A seating unit
WO2004075714A2 (en) * 2003-02-28 2004-09-10 Cornel Lustig Device for manipulating the state of alertness
JP3788463B2 (en) * 2004-03-19 2006-06-21 ダイキン工業株式会社 Biological rhythm adjustment device, biological rhythm adjustment system
EP1963132B1 (en) 2005-12-23 2010-07-21 British Airways PLC Aircraft passenger seat
EP2069182A1 (en) * 2006-09-18 2009-06-17 Weber Aircraft LP Passenger seats
US8576064B1 (en) 2007-05-29 2013-11-05 Rockwell Collins, Inc. System and method for monitoring transmitting portable electronic devices
US20090112638A1 (en) 2007-10-29 2009-04-30 The Boeing Company System and Method for Virtual Queuing
US7878586B2 (en) 2007-10-29 2011-02-01 The Boeing Company System and method for an anticipatory passenger cabin
GB2454751B (en) 2007-11-19 2012-12-12 British Airways Plc Aircraft passenger seat
US20090187640A1 (en) 2008-01-23 2009-07-23 International Business Machines Corporation In-flight information system
US8321083B2 (en) 2008-01-30 2012-11-27 The Boeing Company Aircraft maintenance laptop
DE102009001366B4 (en) * 2009-03-06 2014-07-03 Airbus Operations Gmbh System for providing functions to a passenger
US9523985B1 (en) * 2009-08-11 2016-12-20 Rockwell Collins, Inc. Real time and in-flight dynamic personalization of an aircraft
FR2975303A1 (en) * 2011-05-17 2012-11-23 Stephane Bloch DEVICE AND METHOD FOR IN-FLIGHT REDUCTION OF TIME DIFFERENCE SYNDROME
US20130074108A1 (en) * 2011-09-15 2013-03-21 Douglas Cline Seatback Video Display Unit Wireless Access Points for Inflight Entertainment System
WO2013059671A1 (en) * 2011-10-21 2013-04-25 Nest Labs, Inc. Energy efficiency promoting schedule learning algorithms for intelligent thermostat
DE102012004840A1 (en) * 2012-03-13 2013-09-19 Recaro Aircraft Seating Gmbh & Co. Kg Seat control device, in particular for a passenger seat
US20130338857A1 (en) * 2012-06-15 2013-12-19 The Boeing Company Aircraft Passenger Health Management

Also Published As

Publication number Publication date
GB201405201D0 (en) 2014-05-07
GB2524495B (en) 2016-07-20
WO2015145142A1 (en) 2015-10-01
GB2524495A (en) 2015-09-30
GB2524495B8 (en) 2016-10-19
US20180181919A1 (en) 2018-06-28
GB2538339B (en) 2017-05-24
GB2524495C (en) 2017-04-05
GB2538339A (en) 2016-11-16
HK1215699A1 (en) 2016-09-09
GB2524495A8 (en) 2016-10-19
US10546274B2 (en) 2020-01-28
GB201600548D0 (en) 2016-02-24
EP3123412A1 (en) 2017-02-01

Similar Documents

Publication Publication Date Title
US20200118083A1 (en) Travel Environment Control
US10757451B2 (en) User centric service and content curation through in-flight entertainment system
US11532245B2 (en) Technical solutions for customized tours
JP7251989B2 (en) Passenger comfort system
US20200363220A1 (en) Systems and methods for personalized ground transportation
US20130338857A1 (en) Aircraft Passenger Health Management
EP3079090B1 (en) Method for processing data and electronic device thereof
US11494390B2 (en) Crowd-based scores for hotels from measurements of affective response
US11406788B2 (en) Information processing apparatus and method
KR20200097704A (en) Information processing apparatus, information processing method, and program
KR20100053149A (en) Apparatus and method for scheduling considering each attendees' context in mobile communicatiion terminal
EP3042327A2 (en) Wearable device
WO2015145139A1 (en) Dynamic tracking and control of passenger travel progress
US20230016773A1 (en) Dynamic Identity Verification System and Method
US9889934B2 (en) Digital crew assist
KR101624539B1 (en) Method for providing tour route using wireless network and Terminal using the same
JP2019006370A (en) Vehicle attendant task management system and method
US20220415162A1 (en) Flight attendant calling device, system and method for configuring a flight attendant calling device
CN112749872A (en) Dynamic delivery of crowd-sourced travel planning information
JP7447140B2 (en) Organization of points of interest during flight
JP2023553227A (en) Dynamic optimization environment control system
JP6505385B2 (en) INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING DEVICE, AND INFORMATION PROCESSING METHOD
US20240219183A1 (en) Method and system for guiding people to desired locations within an environment
JPWO2022098370A5 (en)
Liu et al. Follow your heart.

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION