GB2592217A - Method and system for providing adaptive display content on a display interface of a vehicle - Google Patents
Method and system for providing adaptive display content on a display interface of a vehicle
- Publication number
- GB2592217A GB2592217A GB2002288.5A GB202002288A GB2592217A GB 2592217 A GB2592217 A GB 2592217A GB 202002288 A GB202002288 A GB 202002288A GB 2592217 A GB2592217 A GB 2592217A
- Authority
- GB
- United Kingdom
- Prior art keywords
- user
- adaptive
- display
- learning module
- display system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 230000003044 adaptive effect Effects 0.000 title claims abstract description 63
- 238000000034 method Methods 0.000 title claims abstract description 31
- 230000006870 function Effects 0.000 claims description 10
- 238000012545 processing Methods 0.000 claims description 7
- 230000000694 effects Effects 0.000 claims description 5
- 230000002123 temporal effect Effects 0.000 claims description 4
- 239000000284 extract Substances 0.000 claims description 3
- 230000009471 action Effects 0.000 abstract description 9
- 230000008859 change Effects 0.000 abstract description 3
- 238000012508 change request Methods 0.000 abstract description 2
- 238000010801 machine learning Methods 0.000 abstract description 2
- 238000013528 artificial neural network Methods 0.000 description 6
- 238000004891 communication Methods 0.000 description 5
- 230000001133 acceleration Effects 0.000 description 3
- 230000008901 benefit Effects 0.000 description 3
- 230000033001 locomotion Effects 0.000 description 3
- 230000004048 modification Effects 0.000 description 3
- 238000012986 modification Methods 0.000 description 3
- 238000012544 monitoring process Methods 0.000 description 3
- 238000012549 training Methods 0.000 description 3
- 238000004378 air conditioning Methods 0.000 description 2
- 238000010276 construction Methods 0.000 description 2
- 239000000203 mixture Substances 0.000 description 2
- 238000001556 precipitation Methods 0.000 description 2
- 235000013361 beverage Nutrition 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000013135 deep learning Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000008451 emotion Effects 0.000 description 1
- 238000010438 heat treatment Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 239000003607 modifier Substances 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000009423 ventilation Methods 0.000 description 1
- 238000005303 weighing Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B13/00—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
- G05B13/02—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/122—Instrument input devices with reconfigurable control functions, e.g. reconfigurable menus
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/151—Instrument output devices for configurable output
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/166—Navigation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/172—Driving mode indication
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/186—Displaying information according to relevancy
- B60K2360/1868—Displaying information according to relevancy according to driving situations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/195—Blocking or enabling display functions
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Chemical & Material Sciences (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Remote Sensing (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
A method and an adaptive display system for providing a customised display of content based on a user's desire in a vehicle. It receives monitored sensor data obtained from on-board sensors of the vehicle. Received sensor data is extracted. Interpreted signals of individual sensors are processed by a probabilistic machine learning algorithm. A learning module weighs different interpreted signals based on the user's repeated patterns and situational usage habits when making information selections via the vehicle's inputs. The learning module automatically configures the layout of the user interface (UI) with content based on historical usage patterns. When the user requests a change in the display, this request is forwarded to a learning algorithm. An automation learning module takes this change request as an error and re-weighs the prediction probabilities for the predicted actions by the learning algorithm, and the decision to automate a prediction is calibrated based on historically estimated errors.
Description
FORM 2 THE PATENTS ACT 1970 [39 OF 1970] & THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10; rule 13)
TITLE OF THE INVENTION
METHOD AND SYSTEM FOR PROVIDING ADAPTIVE DISPLAY CONTENT ON A DISPLAY INTERFACE OF
A VEHICLE
PREAMBLE OF THE DESCRIPTION:
[0001] The following specification particularly describes the invention and the manner in which it is to be performed.
DESCRIPTION OF THE INVENTION:
TECHNICAL FIELD
[0002] The present disclosure generally relates to a display support system. Particularly, but not exclusively, the disclosure relates to a method and system for providing adaptive display content on a display interface of a vehicle.
BACKGROUND OF THE DISCLOSURE
[0003] Modern vehicles are equipped with numerous digital displays for presenting information which is required by users for driving. However, these systems have too much information available to be displayed at one time on the limited screen space, and the display can be potentially distracting even if all the information fits in the same screen space. Therefore, a user would benefit from a vehicle that provides a customizable display showing adaptive and predictive information to the user, based on a subset of all the information available for display.
[0004] Manual selection of content on the display causes inconvenience in some cases and distraction hazards in others, as the user may need different information in different situations and may therefore try to reconfigure the displayed content while driving. Displaying an appropriate choice of content can make a crucial difference in emergency situations and may prevent potential accidents. For example, knowing the current engine RPM (revolutions per minute) and G-meter may help a driver precisely anticipate the real-time throttle response and weight distribution to quickly plan an appropriate emergency manoeuvre for preventing a collision. This is a common issue for drivers using a traditional in-dash instrument cluster (IC), particularly if all displayed information is static or permanent. Similar issues exist in content selection on a head-up display (HUD), head-unit (HU), or steering wheel or passenger displays.
[0005] Modern vehicles have a built-in display unit that allows selected information, such as driving speed, navigation guidance, etc., to be displayed with either a fixed layout or with very limited modularity. However, the display unit has no self-adaptive ability with respect to context or user inputs. Generally, drivers and passengers have preferences over the content to be displayed on an IC, HUD, HU, steering wheel display and passenger display. For example, on the IC, drivers may prefer displaying a transmission gear number to properly anticipate power delivery during a lane take-over on the freeway, while some may prefer to see their gas mileage during weekday rush hour, and others may prefer to receive haptic feedback (e.g. a vibration alert) on the steering wheel to be aware of nearby shopping carts and pedestrians when backing out of a supermarket parking spot.
[0006] US20160321545 provides comprehensive collection and storing of data from at least one vehicle sensor, then assigning likelihoods to at least two possible interface actions. The document further describes determining at least one most likely interface control action and providing that action to the user. Currently, the existing systems do not provide an efficient technique for defining and making content selection adaptive by selecting the appropriate content to be displayed with tailored predictions for multiple users throughout the vehicle, potentially using data from multiple vehicles.
SUMMARY OF THE DISCLOSURE
[0007] In one non-limiting embodiment of the present disclosure, a method is described of providing adaptive content on a display interface of a vehicle. The method comprises receiving, by the adaptive display system, the real-time signal data obtained from one or more sensors associated with the vehicle. The method comprises extracting the real-time signal data associated with each of the one or more sensors. Thereafter, the method comprises processing the extracted real-time signal data to predict a probability of a user selection, where each interpreted real-time signal data is weighed based on a probability learning module associated with the adaptive display system. Any type of probability learning module can be used, such as, for example, a neural network, a Naive Bayes model, a non-parametric Gaussian mixture model combined with a Naive Bayes model at its core, or other models known in the art. Further, the method dynamically provides the user interface with an adaptive layout and adaptive content based on the predicted probability of the user selection and historical usage patterns of the user, along with corresponding context information, by an automation learning module associated with the adaptive display system.
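As an illustration of the probability learning step described above, the following minimal Python sketch implements a count-based Naive Bayes predictor over discrete context features. The class name, feature names, content labels and the smoothing constant are assumptions made for the example only; the disclosure equally covers neural networks, Gaussian mixture models and other probabilistic learners.

```python
from collections import defaultdict

class SelectionPredictor:
    """Toy Naive Bayes over discrete context features -> probability of each content selection."""

    def __init__(self, smoothing=1.0):
        self.smoothing = smoothing
        self.class_counts = defaultdict(int)                          # content label -> count
        self.feature_counts = defaultdict(lambda: defaultdict(int))   # label -> (feature, value) -> count

    def observe(self, context, selected_content):
        """Record one historical user selection together with its context."""
        self.class_counts[selected_content] += 1
        for feature, value in context.items():
            self.feature_counts[selected_content][(feature, value)] += 1

    def predict_proba(self, context):
        """Return a normalised probability for each previously seen content label."""
        total = sum(self.class_counts.values()) or 1
        scores = {}
        for label, count in self.class_counts.items():
            p = count / total
            for feature, value in context.items():
                num = self.feature_counts[label][(feature, value)] + self.smoothing
                den = count + 2 * self.smoothing
                p *= num / den
            scores[label] = p
        norm = sum(scores.values()) or 1.0
        return {label: s / norm for label, s in scores.items()}

predictor = SelectionPredictor()
predictor.observe({"day": "Mon", "mode": "sport"}, "rpm_and_g_meter")
predictor.observe({"day": "Tue", "mode": "comfort"}, "fuel_economy")
print(predictor.predict_proba({"day": "Mon", "mode": "sport"}))
```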
10008] In an embodiment of the disclosure, the contextual information includes one or more of: temporal information, geolocation information, weather information, driving information associated with the user, driving mode, user attention and activities, haptic feedback and acoustic feedback.
[0009] In an embodiment of the disclosure, the probability of the user selection is monitored against a predetermined confidence score to determine an execution possibility of the user selection. The automation learning module estimates a prediction error made during prediction of the user selection based on monitoring the actual selection, and continuously learns based on the selection.
[0010] In an embodiment of the disclosure, the predetermined confidence score is updated adaptively based on the estimated prediction error. In an embodiment, based on the error rate, at least one predetermined weight function is adopted to re-weigh predicted probabilities to calibrate the automated prediction based on historically estimated errors.
[0011] In an embodiment of the disclosure, the user interface displayed with the user preference may be displayed on the IC, HUD, HU, steering wheel displays or passenger display.
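A hedged sketch of how the adaptive confidence score and the weight function of paragraphs [0009] and [0010] might interact is given below. The update rule, learning rate and per-action error bookkeeping are illustrative assumptions; the disclosure only requires that the threshold adapts to the estimated prediction error and that predicted probabilities are re-weighed before automation.

```python
def update_confidence_threshold(threshold, error_rate, learning_rate=0.1,
                                target_error=0.05, lo=0.5, hi=0.99):
    """Raise the confidence score when the system errs too often, relax it when it is reliable."""
    threshold += learning_rate * (error_rate - target_error)
    return min(max(threshold, lo), hi)

def reweigh(probabilities, per_action_error):
    """Example weight function: damp each action's probability by its historical error rate."""
    weighted = {a: p * (1.0 - per_action_error.get(a, 0.0)) for a, p in probabilities.items()}
    norm = sum(weighted.values()) or 1.0
    return {a: w / norm for a, w in weighted.items()}

def should_automate(probabilities, threshold):
    """Automate the display change only if the best prediction clears the confidence score."""
    best_action, best_p = max(probabilities.items(), key=lambda kv: kv[1])
    return (best_action if best_p >= threshold else None), best_p
```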
[0012] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0013] The novel features and characteristics of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures, wherein like reference numerals represent like elements, and in which:
[0014] Fig.1 illustrates an exemplary environment of providing adaptive content on a display interface of a vehicle in accordance with some embodiments of the present disclosure;
[0015] Fig.2 shows a detailed block diagram of providing adaptive content on a display interface of a vehicle in accordance with some embodiments of the present disclosure;
[0016] Fig.3 shows an exemplary representation of a user and a passenger having adaptive displays based on a selection decision in accordance with some embodiments of the present disclosure; and
[0017] Fig.4 shows a flowchart depicting a method of providing an adaptive display during driving of a vehicle in accordance with some embodiments of the present disclosure.
[0018] The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
[0019] While the embodiments in the disclosure are subject to various modifications and alternative forms, a specific embodiment thereof has been shown by way of example in the figures and will be described below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0020] It is to be noted that a person skilled in the art would be motivated from the present disclosure and may modify various configurations of circuits, systems, the architecture and method of updating vehicle functions, which may vary from application to application. However, such modification should be construed to be within the scope and spirit of the present disclosure. Accordingly, the drawings show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
[0021] The terms "comprises", "comprising", or any other variations thereof used in the disclosure, are intended to cover a non-exclusive inclusion, such that a device, system or method that comprises a list of components does not include only those components but may include other components not expressly listed or inherent to such system, or assembly, or device. In other words, one or more elements in a system preceded by "comprises... a" does not, without more constraints, preclude the existence of other elements or additional elements in the system or mechanism.
[0022] The present disclosure provides a method and an adaptive display system for providing adaptive content to be displayed based on a user's desire in a vehicle. Particularly, the present disclosure receives sensor data obtained from on-board sensors of the vehicle. The received sensor data is processed by a probability learning module that extracts and interprets signals from individual sensors. The neural network learning module is used to weigh the different interpreted signals based on the user's repeated patterns and situational usage habits when making selections via the available vehicle user inputs. The learning module automatically configures a layout of the user interface (UI) with available content based on historical usage patterns of the user. When a user requests a change in the UI display, this information is forwarded to a learning algorithm. The automation learning module takes this change request as a prediction error and re-weighs prediction probabilities for the predicted actions by the learning algorithm. The decision to automate a prediction is calibrated based on historically estimated errors. Therefore, the learning algorithm may be used to learn the strategy of weighing different interpreted signals to dynamically and conditionally automate a selection decision based on a user's choice.
[0023] The following paragraphs describe the present disclosure with reference to FIG. 1 through FIG. 4. In the figures, the same element or elements which have the same functions are indicated by the same reference numbers. It is to be noted that the vehicle is not completely illustrated in the figures for the purpose of simplicity. One skilled in the art would appreciate that the method as disclosed in the present disclosure may be used for providing an adaptive display during driving of a vehicle.
[0024] Fig.1 illustrates an exemplary environment of providing adaptive content on a display interface of a vehicle in accordance with some embodiments of the present disclosure.
[0025] As shown in Fig.1, an environment 100 includes an adaptive display system 101. The adaptive display system 101 is connected through a communication network 109 to a vehicle 103 and a display unit 105. The vehicle 103 includes a sensing unit 107. The sensing unit 107 may include one or more sensors configured at different locations in or on the vehicle 103. In an embodiment, the vehicle 103 may be any automobile for example, a car, truck, bus or the like. The one or more sensors may include but are not limited to: a distance radar, active exterior and interior cameras, exterior and interior microphones, a rain sensor, light sensor, motion sensor or the like. A person skilled in the art would understand that any other vehicle sensor not mentioned explicitly may also be used to obtain vehicle information in the present disclosure. Further, the display unit 105 may include an instrument cluster (IC), head-up display (HUD), head-unit (HU), steering wheel display or passenger displays. Vehicles are equipped with digital displays that offer a level of customizable ability for displaying information the user desires. A person skilled in the art would understand that any other display unit not mentioned explicitly may also be used as display unit in the present disclosure. Further, the communication network 109 may include, but is not limited to, a direct interconnection, an e-commerce network, a Peer-to-Peer (P2P) network, local area network (LAN), wide area network (WAN), wireless network (for example, using wireless application protocol), Internet, Wi-Fi, or the like.
[0026] The adaptive display system (ADS) 101 is used to provide a customizable ability for displaying the user desired information on a display interface of a vehicle. The ADS 101 can define and select the appropriate content on display unit 105 to dynamically provide a personalized driving experience.
[0027] Further, the adaptive display system 101 may include at least one central processing unit ("CPU" or "processor") 113 and a memory 111 for storing instructions executable by the at least one processor 113. The processor 113 may include at least one data processor for executing program components for executing user or system-generated requests. The memory 111 is communicatively coupled to the processor 113. In an embodiment, the vehicle 103 may include other units (not shown explicitly in the Fig.1) for operating the vehicle 103.
[0028] In an embodiment, the ADS 101 includes a probability learning module 115. The probability learning module 115 may be deployed in the instrument cluster (IC) of a vehicle, and the choice of information to be displayed on the IC is dynamically and conditionally selected by the learning algorithm. Driving information associated with the user, such as braking, acceleration, steering, and driving mode such as comfort or sport mode, can be considered user-selected inputs, and the algorithm can make a display selection decision depending at least upon the historical user selection of these inputs. The algorithm makes a selection decision based on corresponding context information such as, but not limited to: absolute time, day of the week, location information, and weather information such as humidity, temperature and precipitation type (if any), along with historical user selection inputs and previous user patterns. Further, the probability learning module 115 can extract the repeated patterns and situational usage habits of the user to make a selection decision.
[0029] In an embodiment, different sensor data is recorded from the vehicle's on-board sensing unit 107. For instance, one of the sensors, such as a camera, may be used for emotion interpretation and gaze tracking of the user. Other sensors used may include, but are not limited to: microphones, motion sensors, and distance radar. The probability learning module 115 may be used to extract and interpret real-time signal data from individual sensors to predict the probability of user selections. The neural network probability learning module 115 may be used to weight the different interpreted real-time signal data. The user interface may be displayed based on the predicted probability of the user selection, the historical usage patterns of the user and context information by the automation learning module 117. The contextual information may include geolocation information, temporal information, weather information, user vehicle driving information, driving mode, user attention and activities, haptic feedback, and acoustic feedback.
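By way of illustration only, the extract-and-interpret step for individual sensors could look like the following sketch. The sensor names, thresholds and interpreted categories are assumptions chosen for the example and are not taken from the disclosure.

```python
def interpret_signals(raw):
    """Map raw on-board sensor readings to discrete interpreted signals."""
    interpreted = {}
    if "rain_sensor" in raw:
        interpreted["precipitation"] = "wet" if raw["rain_sensor"] > 0.2 else "dry"
    if "speed_kph" in raw:
        interpreted["speed_band"] = ("highway" if raw["speed_kph"] > 90
                                     else "urban" if raw["speed_kph"] > 30
                                     else "parking")
    if "gaze_on_road" in raw:
        interpreted["attention"] = "attentive" if raw["gaze_on_road"] else "distracted"
    return interpreted

print(interpret_signals({"rain_sensor": 0.4, "speed_kph": 110, "gaze_on_road": True}))
```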
[0030] In an embodiment, the ADS 101 includes the automation learning module 117. The automation learning module 117 may be used to track the probability of the user selection, to assess whether the predictions have been executed or not by the user, and to determine the errors from the prediction. Estimated prediction errors include mistakes made in predicting the probability of a user selection determined by the probability learning module. Quantities such as true positives, false positives, true negatives, and false negatives are used to calculate the error rate, which is used to re-weigh the prediction probability made by the probability learning module 115. Hence, the decision to automate a prediction is made based on historically estimated errors. The automation learning module 117 learns from the errors that occur in predicting the user selection. Thus, continuous learning of the prediction made by the automation learning module 117 helps to display more adaptive and customized contents and layouts based on user preferences.
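A possible bookkeeping scheme for these outcome counts is sketched below; it treats a user reverting an automated change as a false positive and a manual change the system withheld as a false negative, which matches the error notion used in paragraph [0030] but is otherwise an assumption made for illustration.

```python
class PredictionTracker:
    """Tally prediction outcomes for the automation learning module."""

    def __init__(self):
        self.tp = self.fp = self.tn = self.fn = 0

    def record(self, system_changed_display, user_wanted_change):
        if system_changed_display and user_wanted_change:
            self.tp += 1   # automated change accepted by the user
        elif system_changed_display and not user_wanted_change:
            self.fp += 1   # user reverted the automated change (prediction error)
        elif not system_changed_display and not user_wanted_change:
            self.tn += 1   # display correctly left unchanged
        else:
            self.fn += 1   # user had to request the change manually

    def error_rate(self):
        total = self.tp + self.fp + self.tn + self.fn
        return (self.fp + self.fn) / total if total else 0.0
```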
[0031] As shown in Fig. 2, the adaptive display system 101 may include data 200 and one or more modules 209 which are described herein in detail. In an embodiment, data 200 may be stored within the memory 111. The data 200 may include, for example, sensor data 201, weighted data 203, displayed data 205 and other data 207.
[0032] The sensor data 201 may include details regarding driving information of the user such as acceleration, braking, steering and the like. It may also include information regarding driving mode, driver attention and activities monitoring information obtained from various sensors. The sensors may include but are not limited to cameras, microphones, movement sensors, all vehicle sensors and distance sensors.
[0033] The weighted data 203 may include the sensor data 201 extracted from the sensing unit 107. The weighted data may be used to predict the probability of the user selection. The extracted sensor data is interpreted and weighted by neural network probability learning module 213. Weighting of the extracted sensor data 203 is based on the historical usage pattern of the user to automate the selection prediction. Thus, the algorithm dynamically displays the contents based on a predicted probability of the user selection.
[0034] The displayed data 205 may include the weighted data 203 derived from the sensor data 201. It may be configured to automatically display the layout and contents based on historical patterns and situational usage habits of the users. Thus, dynamic adaptive content based on the user selection may be provided to meet user preferences. Hence, the user interface with the user-preferred contents may be displayed on one of the IC, HUD, HU, steering wheel displays or passenger displays.
[0035] The other data 207 may store data, including temporary data and temporary files, generated by modules 209 for performing the various functions of the adaptive display system 101. It includes cloud-streamed information which includes a trained algorithm for processing the obtained contextual information along with the sensor data 201 to result in a user preference-based user interface.
[0036] In an embodiment, the data 200 in the memory 111 are processed by the one or more modules 209 present within the memory 111 of the adaptive display system 101. In an embodiment, the one or more modules 209 may be implemented as dedicated units. As used herein, the term module refers to an application specific integrated circuit (ASIC), an electronic circuit, a field-programmable gate array (FPGA), programmable system-on-chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. In some implementations, the one or more modules 209 may be communicatively coupled to the processor 113 for performing one or more functions of the adaptive display system 101.
[0037] In one implementation, the one or more modules 209 may include, but are not limited to, a receiving module 211, a probability learning module 213, an automation learning module 215, and a display module 217. The one or more modules 209 may also include other modules 219 to perform various miscellaneous functionalities of the adaptive display system 101. In an embodiment, the other modules 219 may include a cloud-streamed training module. The cloud-streamed training module may include an already-trained learning module used for processing the obtained contextual information from the sensor data 201 to obtain a user preference-based UI.
[0038] The receiving module 211 may receive the sensor data 201 from the sensing unit 107. Further, the receiving module 211 may receive the real-time signal data from one or more sensing unit 107 associated with the vehicle 103.
[0039] The probability learning module 213 may initiate the sensing unit 107 to monitor the sensor data 201, processing the extracted sensor data to predict the probability of the user selection. The extracted sensor data 201 is interpreted from the individual sensing unit 107 and weighted based on the probability learning module 213. Weighting the sensor data 201 depends on the historical usage patterns of the user and the contextual information associated with the adaptive display system 101. The probability learning module 213 learns an efficient, mathematically low-dimensional compressed description of the input features. The probability learning module 213 is used to predict the user preference conditions which are to be displayed dynamically on the UI.
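One way such a low-dimensional compressed description of the input features could be obtained is sketched below with a plain PCA projection via numpy. This is purely an assumption for illustration; the disclosure does not prescribe the compression technique, and an autoencoder or similar learned encoder would fit equally well.

```python
import numpy as np

def compress_features(feature_matrix, n_components=2):
    """Project rows of a (samples x features) matrix onto the top principal components."""
    centered = feature_matrix - feature_matrix.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]
    return centered @ components.T, components

# Illustrative rows: [rain level, speed in km/h, cabin temperature]
features = np.array([[0.1, 90.0, 21.0], [0.0, 35.0, 19.5], [0.8, 110.0, 18.0]])
compressed, components = compress_features(features)
print(compressed.shape)  # (3, 2)
```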
[0040] The automation learning module 215 may monitor the probability of the user selection by keeping track of the prediction quality. The prediction quality may include the probabilities maintained by the probability learning module 213 that are used to predict whether predictions will be executed or not. Particularly, it maintains the prediction scores or probabilities from the probability learning module 213 to check whether the predictions made by the probability learning module 213 were executed or not and whether or not mistakes were made in the predictions. By monitoring and predicting the user preferences, the module 215 continuously learns to display adaptive content on the display module 217. If the confidence score or prediction score determined for the user selection is high, the module automatically changes the contents on the display module 217. In an embodiment, for a threshold Tc, the historical usage pattern of the user may be split into two sets whose action score Sa is less than or equal to, or greater than, the threshold Tc, as shown in the equations below.
AHl = {(Sa, Pa) for (Sa, Pa) ∈ AH | Sa <= Tc}  (1)
AHg = {(Sa, Pa) for (Sa, Pa) ∈ AH | Sa > Tc}  (2)
where (Sa, Pa) is a usage tuple in the historical usage pattern of the user, AH is the action history, the action score Sa is the probability of that usage as stored in the historical usage pattern, and Pa is either True or False, stored according to whether the predicted action was actually correct in the past.
If the user changes the display back to the previous state, it is considered a negative or an error in prediction, and the probability learning module 213 is adapted to reduce the score of the wrongly predicted action. Based on the error rate, a predetermined weight function is adopted to re-weigh predicted probabilities to automate prediction based on historically estimated errors. Thus, the automation learning module 215 learns to automate the UI layout change based on user preferences.
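A literal, assumption-laden reading of equations (1) and (2) and of the revert handling described above could be coded as follows; the penalty factor and the tuple layout of the action history are illustrative choices only.

```python
def split_action_history(action_history, tc):
    """action_history: list of (score_sa, was_correct_pa) tuples; tc: threshold Tc."""
    ah_low = [(sa, pa) for sa, pa in action_history if sa <= tc]   # equation (1)
    ah_high = [(sa, pa) for sa, pa in action_history if sa > tc]   # equation (2)
    return ah_low, ah_high

def penalise_reverted_action(action_scores, action, penalty=0.8):
    """Reduce the stored score of an action whose automated prediction the user undid."""
    action_scores[action] = action_scores.get(action, 1.0) * penalty
    return action_scores

low, high = split_action_history([(0.4, True), (0.9, False), (0.7, True)], tc=0.6)
print(len(low), len(high))  # 1 2
```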
[0041] The display module 217 may display the adaptive content on the display interface of the vehicle 103. The user interface with the adaptive content may be displayed on at least one of the IC, HUD, HU, steering wheel display(s) and passenger display(s). For instance, more advanced steering units may have extended displays built into the entire ring or face of the steering wheel, which may be used to display graphical, colored indications for emphasizing directional information, nearby possible obstacles, and current engine/motor/wheel rotation count. For instance, arrows could be displayed on the steering wheel to alert the driver to perform emergency evasive steering and the like. In a passenger display, the user can customize the zone-specific comfort settings such as air-conditioning direction, temperature, fan speed, massage settings, seat heating and ventilation, ambient lighting color and patterns, seating position, active noise canceling, powered sunshades, and surround sound settings. Users can customize multi-media such as video and music playback, online video streaming, and video games. Users can also customize assistance features including daily schedule reminders, place-of-interest recommendations, flight or airport terminal information, beverage or food mobile ordering services, and remote commands for smart home integration. In one embodiment, the layout of the UI may be automated to configure display content based on historical usage patterns of the user. Audio and haptic feedback can be combined to provide extra channels of communication to the user. For instance, a selection of haptic vibration on the steering wheel and acoustic read-out from speakers, along with IC display information, can be used to alert the driver or user(s) of nearby pedestrians or detected obstacles.
[0042] The other modules 219 may include a cloud-streamed training module which includes a trained algorithm for processing the obtained contextual information along with the sensor data 201 to obtain a user preference-based UI.
[0043] Fig.3 shows an exemplary representation of a user and a passenger viewing respective adaptive displays based on a selection decision in accordance with some embodiments of the present disclosure.
[0044] Referring now to Fig.3, an exemplary representation 300 for providing adaptive content on the display interface of a vehicle is illustrated. The exemplary representation 300 includes an environment in which a user 302 and a passenger 307 are experiencing the adaptive content on two different display interfaces (304 and 309) of the vehicle.
[0045] The user 302 of car 301 may be provided with one or more of the user interfaces described above. In a usage scenario, the user 302 wants to experience the steering wheel display, which is configured to display detailed vehicle performance information on small LCD displays, where physical display changing user input devices can include buttons and rotating dials. The probability learning module 213 may be configured to situationally choose or predict the user selection based on the present user's historical habits using those physical display changing user input devices. Similarly, passenger 307 may want to play a particular video game. Based on historical usage preferences such as the time of day and day of the week, the probability learning module 213 may be configured to again situationally choose or predict the passenger's 307 preferred video game and volume setting for the rear display.
[0046] An exemplary embodiment of the present disclosure provides a selection of adaptive content for the IC and HUD. In order to provide adaptive content selection, a probability-based algorithm such as a deep learning algorithm, kernel density estimator, Gaussian mixture model or similar is provided within the IC software, so that the choice of information to display on the IC screen(s) is dynamically and conditionally selected by the learning-based algorithm itself automatically. The algorithm makes such selection decisions primarily based on historical user selection inputs. In one embodiment, the user selection input may be received explicitly or implicitly. Further, the algorithm may also receive context information corresponding to the input. For example, the context information may include absolute time, day of the week, location, weather information such as precipitation, humidity, temperature, raining/hailing/storm/snowing and the like, a user's vehicle driving inputs such as acceleration, braking, steering, and the like, and driving mode such as comfort, sport and the like. Since the IC screen space may act as a different information source as compared to the HUD or HU, the specifications of the HUD or HU-controlling algorithm may be different from those of the IC.
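As one of the probabilistic options mentioned above, a per-content Gaussian kernel density estimate over a numeric context vector could drive the IC selection. The context features, bandwidth and example data below are assumptions for illustration rather than a prescribed implementation.

```python
import math

def kde_score(samples, query, bandwidth=1.0):
    """Average Gaussian kernel between a query context vector and the stored context samples."""
    if not samples:
        return 0.0
    total = 0.0
    for s in samples:
        dist2 = sum((a - b) ** 2 for a, b in zip(s, query))
        total += math.exp(-dist2 / (2 * bandwidth ** 2))
    return total / len(samples)

def choose_ic_content(history, context, bandwidth=1.0):
    """history: content label -> list of context vectors recorded when that content was chosen."""
    scores = {label: kde_score(samples, context, bandwidth)
              for label, samples in history.items()}
    norm = sum(scores.values()) or 1.0
    best = max(scores, key=scores.get)
    return best, {label: s / norm for label, s in scores.items()}

# Illustrative context = [hour of day, speed in km/h]
history = {"navigation": [[8.0, 60.0], [8.5, 55.0]], "rpm_gauge": [[18.0, 120.0]]}
print(choose_ic_content(history, [8.2, 58.0]))
```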
[0047] Further, the user 302 may also be provided with the adaptive content displayed on the user interface of the display units jointly to manage the content selection of any or all the displays as a whole. The learning modules maximise the utility over the combined screen space of the displays with the context information along with the historical user selection inputs. Thus, it assists in maximizing the selection variety on the combined screen space, not excluding the content duplication between the displays.
[0048] In an embodiment, the passenger 307 is provided with the passenger display configured to provide entertainment, convenience and comfort. The passenger can further customize the zone-specific settings such as seat-specific, comfort settings such as air-conditioning direction, temperature and fan speed, seating position, active noise cancelling, powered sunshades, surround sound settings, and multi-media such as video and music playback and online video streaming. The learning modules automatically configure layout and content based on historical usage patterns of the user to maintain customized quality of the passengers.
[0049] Additionally, haptic and audio feedback are jointly added to the above-mentioned display units for providing extra channels of communication to the user. These include a selection of haptic vibrations on the steering wheel, acoustic output from the speakers, and alerts displayed based on detected obstacles. The learning modules are employed for Ul layout and content changes to include haptic feedback and acoustic feedback. Thus, the learning of the module happens together as a whole, though it can be an ensemble of probabilistic neural networks.
[0050] Fig.4 shows a flowchart depicting a method of providing an adaptive display while driving a vehicle in accordance with some embodiments of the present disclosure.
[0051] As illustrated in FIG. 4, the method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
[0052] The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0053] At block 401, the real-time signal data is received, by the receiving module 211, from one or more sensors associated with the vehicle 103.
[0054] At block 403, the real-time signal data is extracted from one or more sensors associated with the vehicle 103.
[0055] At block 405, the real-time signal data is processed by extracting and interpreting it from the sensor(s), and each interpreted signal data is weighted using, for example, the neural network probability learning module or another machine learning algorithm.
[0056] At block 407, the adaptive display system 101 provides a user interface with a layout and content based on the predicted probability of the user selection, the historical usage patterns of the user and context information, by the automation learning module.
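Putting blocks 401 to 407 together, a single update cycle might compose the helper functions sketched earlier, all of whose names are illustrative assumptions rather than part of the disclosure.

```python
def render_layout(content_label):
    """Stub for the display module 217; a real system would update the IC/HUD layout here."""
    print(f"displaying: {content_label}")

def adaptive_display_step(raw_signals, predictor, threshold):
    context = interpret_signals(raw_signals)            # blocks 401 and 403: receive and extract
    probabilities = predictor.predict_proba(context)    # block 405: weigh interpreted signals
    label, confidence = max(probabilities.items(), key=lambda kv: kv[1])
    if confidence >= threshold:                         # block 407: adapt the UI only when confident
        render_layout(label)
        return label
    return None
```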
[0057] EQUIVALENTS
[0058] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
[0059] It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.).
[0060] It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations.
[0061] However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, typically means at least two recitations, or two or more recitations).
[0062] Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" will be understood to include the possibilities of "A" or "B" or "A and B".
[0063] In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.
[0064] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
REFERRAL NUMERALS
Reference Number | Description |
---|---|
100 | Environment |
101 | Adaptive display system |
103 | Vehicle |
105 | Display unit |
107 | Sensing unit |
109 | Communication network |
111 | Memory |
113 | Processor |
115 | Probability learning module |
117 | Automation learning module |
200 | Data |
201 | Sensor data |
203 | Weighed data |
205 | Displayed data |
207 | Other data |
209 | Modules |
211 | Receiving module |
213 | Probability learning module |
215 | Automation learning module |
217 | Display module |
219 | Other modules |
301 | Car |
302 | User |
307 | Passenger |
304 | User display |
309 | Passenger display |
Claims (10)
- Claims: We claim: 1. A method of providing adaptive content on a display interface of a vehicle, the method comprising: receiving, by the adaptive display system, real-time signal data obtained from one or more sensors associated with the vehicle; extracting, by the adaptive display system, real-time signal data associated with each of the one or more sensors; processing, by the adaptive display system, the extracted real-time signal data to predict a probability of a user selection, wherein each interpreted real-time signal data is weighted using a probability learning module associated with the adaptive display system; and providing dynamically, by the adaptive display system, the user interface with an adaptive layout, wherein the adaptive content is based on the predicted probability of the user selection and historical usage pattern of the user and corresponding context information by an automation learning module associated with the adaptive display system.
- 2. The method as claimed in claim 1, wherein the contextual information includes one or more of: temporal information, geolocation information, weather information, driving information associated with the user, driving mode, user attention and activities, haptic feedback and acoustic feedback.
- 3. The method as claimed in claim 1, wherein the probability of user selection is monitored, by the automation learning module, against a predetermined confidence score to determine an execution possibility of the user selection, wherein the automation learning module estimates prediction error made during prediction of the user selection and continuously learns based on the execution possibility determination.
- 4. The method as claimed in claim 3, wherein the predetermined confidence score is updated adaptively based on the estimated prediction error, further wherein based on the error rate at least one predetermined weight function is adopted to re-weigh predicted probabilities based on historically estimated errors.
- 5. The method as claimed in claim 1, wherein the user interface displayed with the user preference may be used to display on at least one of: an instrument cluster (IC), head-up display (HUD), head unit (HU), steering wheel displays and passenger display.
- 6. An adaptive display system for providing adaptive content on a display interface of a vehicle, comprising: a processor; a memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, causes the processor to: receive a real-time signal data obtained from one or more sensors associated with the vehicle; extract the real-time signal data associated with each of the one or more sensors; process the extracted real-time signal data to predict a probability of a user selection, wherein each interpreted real-time signal data is weighted based on a probability learning module associated with the adaptive display system; provide dynamically the user interface with an adaptive layout where the adaptive content is based on the predicted probability of the user selection and historical usage pattern of the user and corresponding context information that is predicted by an automation learning module associated with the adaptive display system.
- 7. The adaptive display system as claimed in claim 6, wherein the contextual information includes one or more of temporal information, geolocation information, weather information, driving information associated with the user, driving mode, user attention and activities, haptic feedback and acoustic feedback.
- 8. The adaptive display system as claimed in claim 6, wherein the probability of the user selection is monitored by the automation learning module against a predetermined confidence score to determine an execution possibility of the user selection, wherein the automation learning module estimates a prediction error made during prediction of the user selection and continuously learns based on the execution possibility determination.
- 9. The adaptive display system as claimed in claim 8, wherein the predetermined confidence score is updated adaptively based on the estimated prediction error, further wherein based on an error rate at least one predetermined weight function is adopted to re-weigh predicted probabilities to calibrate a decision to automate the prediction based on historically estimated errors.
- 10. The adaptive display system as claimed in claim 6, wherein the user interface displayed with the user preference may be used to display on at least one of: an instrument cluster (IC), head-up display (HUD), head unit (HU), steering wheel displays and passenger display.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2002288.5A GB2592217A (en) | 2020-02-19 | 2020-02-19 | Method and system for providing adaptive display content on a display interface of a vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2002288.5A GB2592217A (en) | 2020-02-19 | 2020-02-19 | Method and system for providing adaptive display content on a display interface of a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202002288D0 GB202002288D0 (en) | 2020-04-01 |
GB2592217A true GB2592217A (en) | 2021-08-25 |
Family
ID=69956488
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2002288.5A Withdrawn GB2592217A (en) | 2020-02-19 | 2020-02-19 | Method and system for providing adaptive display content on a display interface of a vehicle |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2592217A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220261136A1 (en) * | 2021-02-09 | 2022-08-18 | Ford Global Technologies, Llc | Vehicle display |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015072255A1 (en) * | 2013-11-18 | 2015-05-21 | Mitsubishi Electric Corporation | Information system and method for using prediction engine of vehicle |
US20160321545A1 (en) | 2013-12-19 | 2016-11-03 | Daimler Ag | Predicting an interface control action of a user with an in-vehicle user interface |
WO2017038166A1 (en) * | 2015-08-28 | 2017-03-09 | ソニー株式会社 | Information processing device, information processing method, and program |
-
2020
- 2020-02-19 GB GB2002288.5A patent/GB2592217A/en not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015072255A1 (en) * | 2013-11-18 | 2015-05-21 | Mitsubishi Electric Corporation | Information system and method for using prediction engine of vehicle |
US20160321545A1 (en) | 2013-12-19 | 2016-11-03 | Daimler Ag | Predicting an interface control action of a user with an in-vehicle user interface |
WO2017038166A1 (en) * | 2015-08-28 | 2017-03-09 | ソニー株式会社 | Information processing device, information processing method, and program |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220261136A1 (en) * | 2021-02-09 | 2022-08-18 | Ford Global Technologies, Llc | Vehicle display |
US11977715B2 (en) * | 2021-02-09 | 2024-05-07 | Ford Global Technologies, Llc | Vehicle display |
Also Published As
Publication number | Publication date |
---|---|
GB202002288D0 (en) | 2020-04-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2914023B1 (en) | Data aggregation and delivery | |
US20190394097A1 (en) | Vehicle application store for console | |
US9720680B2 (en) | Methods and apparatus for wirelessly updating vehicle systems | |
CN106335513B (en) | Method and system for intelligent use of on-board time with advanced driver assistance and autonomous driving | |
EP3482344B1 (en) | Portable personalization | |
EP2891589B1 (en) | Automatic driver identification | |
CN108284840B (en) | Autonomous vehicle control system and method incorporating occupant preferences | |
CN104768130B (en) | In-vehicle notification presentation scheduling | |
EP2985985A1 (en) | Driver status indicator | |
CN107628033B (en) | Navigation based on occupant alertness | |
CN110023981B (en) | Device for assisting a user of a walking tool, walking tool and method | |
US20150160019A1 (en) | Controlling in-vehicle computing system based on contextual data | |
CN105824494B (en) | Personalized display system for integrally changing vehicle content and vehicle content management method | |
US9098367B2 (en) | Self-configuring vehicle console application store | |
JP6348357B2 (en) | Information providing apparatus, communication system, and information providing method | |
KR102099328B1 (en) | Apparatus, vehicle, method and computer program for calculating at least one video signal or control signal | |
US20160025497A1 (en) | Pre-caching of navigation content based on cellular network coverage | |
CN110321043B (en) | Method of adapting the operation of a vehicle control system and device for use in the method | |
EP1999736A2 (en) | In-vehicle conditional multi-media center | |
US9376117B1 (en) | Driver familiarity adapted explanations for proactive automated vehicle operations | |
JP5621681B2 (en) | In-vehicle information presentation device | |
WO2018101978A1 (en) | Vehicle tutorial system and method for sending vehicle tutorial to tutorial manager device | |
US10106173B2 (en) | Systems and methods of an adaptive interface to improve user experience within a vehicle | |
GB2592217A (en) | Method and system for providing adaptive display content on a display interface of a vehicle | |
CN112109645B (en) | Method and system for providing assistance to a vehicle user |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |