GB2519936A - Human-machine interface - Google Patents


Info

Publication number
GB2519936A
Authority
GB
United Kingdom
Prior art keywords
human
user
interface
changing
machine interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1315618.7A
Other versions
GB201315618D0 (en)
Inventor
Simon Thompson
Jean-Jacques Loeillet
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1315618.7A
Publication of GB201315618D0
Publication of GB2519936A
Status: Withdrawn

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/10
    • B60K35/29
    • B60K35/65
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00Dashboards
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M1/72472User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons wherein the items are sorted according to specific criteria, e.g. frequency of use
    • B60K2360/1442
    • B60K2360/184
    • B60K2360/186
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/60Substation equipment, e.g. for use by subscribers including speech amplifiers
    • H04M1/6033Substation equipment, e.g. for use by subscribers including speech amplifiers for providing handsfree use or a loudspeaker mode in telephone sets
    • H04M1/6041Portable telephones adapted for handsfree use
    • H04M1/6075Portable telephones adapted for handsfree use adapted for handsfree use in a vehicle
    • H04M1/6083Portable telephones adapted for handsfree use adapted for handsfree use in a vehicle by interfacing with the vehicle audio system

Abstract

A method of adapting a human-machine interface within a vehicle, the method comprising: monitoring user interaction with the human-machine interface within the vehicle; determining the frequency with which the user selects available user-selectable interface options within the human-machine interface; and adapting the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface. In aspects of the invention the human-machine interface comprises a display screen and adapting the interface comprises changing the appearance of certain on-screen elements in comparison to other on-screen elements. Changing the appearance of on-screen elements could comprise: changing the font colour of text; changing the colour of the element; increasing the brightness of high use interface options relative to low use interface options; decreasing the brightness of low use interface options relative to the high use interface options; or removing certain on-screen elements from a given menu level of a menu hierarchy. The method thereby adapts a human-machine interface within a vehicle based on the user's usage pattern: the frequency with which a user interacts with available user-selectable interface options is monitored and the HMI adapted accordingly.

Description

HUMAN-MACHINE INTERFACE
TECHNICAL FIELD
The present disclosure relates to a human-machine interface. In particular, the present disclosure relates to methods and systems for configuring a human-machine interface in a vehicle to improve usability. Aspects of the invention relate to a system, to a method and to a vehicle.
BACKGROUND
Within a vehicle cabin there will usually be a range of controls that a user may interact with to select/alter various on-board vehicle sub-systems. Such "human-machine interfaces" (HMIs) may allow control of air conditioning systems, selection of a vehicle drive mode, access to vehicle entertainment systems and may also allow a suitable mobile telecommunications device to pair with the vehicle systems (e.g. via a Bluetooth® or physical connection).
HMIs within the cabin may take the form of physical controls (which term is taken to include switches, buttons, control knobs etc.) or display screens that may be touch-enabled.
Modern vehicles may contain a range of HMIs and the operation of such HMIs may increase workload on the vehicle user. There may also be a training burden for new users of a vehicle or where additional functions are provided via an upgrade (e.g. to operating software for a display screen).
It is therefore an object of the present invention to provide a method and system for improving user interaction with the human-machine interfaces within a vehicle.
SUMMARY OF THE INVENTION
Aspects of the invention provide a system, a method and a vehicle as claimed in the appended claims.
According to another aspect of the present invention there is provided a method of adapting a human-machine interface within a vehicle, the method comprising: monitoring user interaction with the human-machine interface within the vehicle; determining the frequency with which the user selects available user-selectable interface options within the human-machine interface; adapting the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.
Embodiments of the present invention provide a method for adapting a human-machine interface (HMI) within a vehicle based on the user's usage pattern. The frequency with which a user interacts with available user-selectable interface options is monitored and the HMI adapted accordingly.
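The monitoring-and-adaptation loop described above might be sketched as follows. `UsageMonitor`, its method names and the threshold value are illustrative assumptions, not anything specified in the patent:

```python
from collections import Counter

class UsageMonitor:
    """Counts how often each user-selectable interface option is chosen."""

    def __init__(self):
        self.counts = Counter()

    def record_selection(self, option_id):
        # Called once per touch event / button press on a selectable option.
        self.counts[option_id] += 1

    def high_use_options(self, threshold):
        # Options selected at least `threshold` times are treated as high use.
        return {opt for opt, n in self.counts.items() if n >= threshold}

monitor = UsageMonitor()
for opt in ["phonebook", "phonebook", "dial", "settings", "phonebook"]:
    monitor.record_selection(opt)
print(monitor.high_use_options(threshold=2))  # {'phonebook'}
```

The HMI could then raise the prominence of the returned set and lower the prominence of everything else.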
Conveniently, the human-machine interface may be adapted between ignition key events. In this way the HMI presents a consistent interface to a user during a drive cycle.
The human-machine interface may comprise a display screen and adapting the interface may comprise changing the appearance of certain on-screen elements in comparison to other on-screen elements. Changing the appearance of on-screen elements may comprise changing the font colour of text within the element or changing the colour of the element.
Conveniently, changing the appearance of on-screen elements may comprise changing the appearance of high use interface options relative to low use interface options. For example, the font colour of high use elements may be made more visible while low use interface elements may be changed to a less readable font colour.
Changing the appearance of on-screen elements may comprise increasing the brightness of high use interface options relative to low use interface options. Alternatively or additionally, changing the appearance of on-screen elements comprises decreasing the brightness of low use interface options relative to the high use interface options.
Changing the appearance of on-screen elements may comprise removing certain on-screen elements from a given menu level of a menu hierarchy (e.g. low use elements may be removed such that high use elements remain on a given menu level). Conveniently, the method may further comprise placing the removed on-screen elements in a lower menu level than the given menu level and providing a single access button on the given menu level in order to access the removed elements.
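The removal of low-use elements into a lower menu level behind a single access button could be sketched as below; the function name, the dictionary layout and the threshold are assumptions made for illustration:

```python
def restructure_menu(options, usage_counts, keep_threshold):
    """Split one menu level into retained high-use options and a lower
    level holding the removed low-use options, reachable via a single
    'More' access button."""
    retained = [o for o in options if usage_counts.get(o, 0) >= keep_threshold]
    removed = [o for o in options if usage_counts.get(o, 0) < keep_threshold]
    if removed:
        retained.append("More")  # single access button to the removed elements
    return {"top_level": retained, "more_submenu": removed}

layout = restructure_menu(
    ["dial", "phonebook", "mute", "settings"],
    {"dial": 40, "phonebook": 25, "mute": 1},
    keep_threshold=5,
)
print(layout["top_level"])  # ['dial', 'phonebook', 'More']
```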
Changing the appearance of on-screen elements may also comprise moving certain on-screen elements within a menu level of a menu hierarchy or reordering certain on-screen elements within a menu level of a menu hierarchy.
In the event that the human-machine interface comprises a physical button array, adapting the interface may comprise changing the illumination levels of at least one button (or other physical control element such as a switch, dial, lever etc.) in the array.
According to a further aspect of the present invention there is provided a system for adapting a human-machine interface within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interface within the vehicle; and processing means arranged to monitor user interaction with the human-machine interface within the vehicle, to determine the frequency with which the user selects available user-selectable interface options within the human-machine interface and to adapt the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.
According to a first example useful for understanding the present invention there is provided a method of providing help prompts relating to the use of human-machine interfaces within a vehicle, the method comprising: monitoring user interaction with the human-machine interfaces within the vehicle; assessing a monitored user interaction against pre-determined available user interaction options; outputting a help prompt to the user in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.
The present example useful for understanding the invention provides a method for dynamically assessing whether a user of a human-machine interface (HMI) requires help with their interaction. By monitoring the user's interaction against a known set of available interaction options the present invention is capable of outputting a help prompt. For example, if the system assesses that a user is repeatedly omitting to select certain functionality that is available then a help prompt may be sent highlighting the missed functionality. Additionally, if a user's action is deemed to fall below a certain efficiency threshold (e.g. a user's path between two menu points is not the shortest available) then a help prompt may be output highlighting mechanisms by which their interaction with the HMI can be improved.
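The efficiency check described above, comparing a monitored interaction against the shortest available route, might reduce to something like this sketch (the function name and `tolerance` parameter are illustrative):

```python
def should_prompt_help(user_path, optimal_path, tolerance=0):
    """True when the user's route through the menu hierarchy is longer
    than the optimal route by more than `tolerance` steps."""
    return len(user_path) - len(optimal_path) > tolerance

# A detour through "media" (corrected with "Back") exceeds the optimum.
detour = ["home", "media", "home", "phone", "contacts"]
optimum = ["home", "phone", "contacts"]
print(should_prompt_help(detour, optimum))  # True
```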
Conveniently, the human-machine interface may comprise a touch-enabled display screen.
Touch-enabled display screens may comprise screens which incorporate a touch sensitive element (e.g. a screen similar to those found in smart phones/tablet devices) and may also comprise screens where touch sensitive elements are disposed around the display screen (e.g. a screen arrangement similar to the Samsung® Galaxy S3 smart phone which incorporates touch sensitive buttons beneath the main screen).
The monitoring the user interaction may comprise monitoring when the user touches the display screen. For example, each time the user interacts with the display device (each "touch event") may be monitored.
Conveniently, monitoring user interaction may comprise monitoring a user as they move from a start interface on the display screen to an end interface, the start and end interfaces being different menu levels within a menu hierarchy. For example, the user may be monitored moving from a "welcome" screen to a telephone contacts screen.
In instances where the user moves between menu levels the action of monitoring user interaction may comprise recording instances where the user moves up a menu level during a menu interaction where the end interface is at a lower menu level than the start interface.
Furthermore, monitoring user interaction may also or additionally comprise recording the time taken for the user to move from the start to the end interface.
The pre-determined available user interaction options may comprise optimal routes through a menu hierarchy between two interface menu levels. Pre-determined available user interaction options may also comprise shortcut buttons to jump between interface menu levels. The interface levels may be non-adjacent levels within the menu hierarchy.
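One way to pre-determine the optimal route between two menu levels is a breadth-first search over the menu hierarchy. This sketch assumes the hierarchy can be represented as a simple adjacency mapping, which the patent does not prescribe:

```python
from collections import deque

def optimal_route(menu_graph, start, end):
    """Shortest path between two menu screens by breadth-first search;
    `menu_graph` maps each screen to the screens reachable from it."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == end:
            return path
        for nxt in menu_graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # end screen unreachable from the start screen

menus = {
    "home": ["phone", "media", "settings"],
    "phone": ["contacts", "dial"],
    "contacts": ["contact_detail"],
}
print(optimal_route(menus, "home", "contact_detail"))
# ['home', 'phone', 'contacts', 'contact_detail']
```

A monitored traversal longer than the returned path would indicate a candidate for a help prompt; a shortcut button would appear as an extra edge in the mapping.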
Pre-determined available user interaction options may also comprise alternative user input interaction methods (e.g. a "swipe" gesture and "up"/"down" buttons may comprise alternative methods of navigating through a list). The alternative user input interaction methods may comprise swipe gestures versus on-screen scroll buttons.
The pre-determined available user interaction options may comprise available user-selectable menu options at a given menu level within a menu hierarchy (e.g. for any given menu level the present invention may monitor with respect to all the available selectable menu/UI controls).
The help prompt may comprise a visual dialog box displayed on the display screen. In one embodiment the dialog box may comprise an overlay dialogue box. In an alternative arrangement a notification message may be displayed on the display screen. Such a notification message may be displayed in a dedicated part of the display screen or may be incorporated within the information already displayed on the screen.
The dialog box may comprise a text-based help message or icon/graphics-based help.
The help prompt may comprise an audio message.
Preferably, the help prompt relates to available user input interaction methods, unused user-selectable menu options or unused shortcut functions.
The human-machine interface may comprise a physical control array (e.g. control buttons, control switches, dials, levers etc.) and monitoring user interaction may comprise monitoring user interaction with physical controls within the array. Furthermore, pre-determined available user interaction options may comprise available user-selectable control settings for a given vehicle settings configuration. For a physical control array the help prompt may comprise increasing illumination of certain controls relative to other buttons in the array.
Illumination of controls within a physical control array may be controlled in a number of ways.
For example, available user input interaction controls may have their illumination levels increased compared to other controls within the array. Alternatively, controls that will not have any effect in a given scenario may have their default illumination levels decreased. In one embodiment, the illumination of a single button within an instrument panel may be altered so as to guide the user to that button.
Monitoring user interaction may comprise monitoring user workload. User workload may be determined via one or more of: vehicle speed; vehicle acceleration; steering inputs to a vehicle steering wheel; traffic density; road situation; driver eye tracking. Monitoring user interaction may comprise monitoring user stress, e.g. by monitoring galvanic skin response or changes in skin temperature.
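A workload estimate of the kind described might be a weighted combination of the listed signals; the weights and threshold below are purely illustrative assumptions, not values from the patent:

```python
def workload_score(speed_kph, accel_ms2, steering_rate_deg_s,
                   weights=(0.01, 0.2, 0.05)):
    """Weighted combination of driving signals as a crude workload proxy.
    The weights are illustrative assumptions."""
    w_v, w_a, w_s = weights
    return w_v * speed_kph + w_a * abs(accel_ms2) + w_s * abs(steering_rate_deg_s)

def prompts_allowed(score, threshold=2.0):
    # Suppress help prompts when the estimated workload is high.
    return score < threshold

print(prompts_allowed(workload_score(100.0, 0.0, 0.0)))  # True (steady cruise)
```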
According to a second example useful for understanding the present invention, there is provided a system for providing help prompts relating to the use of human-machine interfaces within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interfaces within the vehicle; processing means arranged to monitor user interaction with the human-machine interfaces within the vehicle and to assess a monitored user interaction against pre-determined available user interaction options; and an output arranged to output a help prompt to the user, the help prompt being generated in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a representation of a vehicle system including human-machine interfaces, control components and electronic control unit;
Figures 2 to 9 show a human-machine interface within a vehicle operating in accordance with an example useful for understanding the present invention;
Figure 10 is a flow chart of a control method in accordance with human-machine interactions depicted in Figures 2 to 9;
Figures 11 to 15 show a human-machine interface within a vehicle operating in accordance with an embodiment of the present invention;
Figure 16 is a flow chart of a control method in accordance with human-machine interactions depicted in Figures 11 to 15.
DETAILED DESCRIPTION
Figure 1 shows a representation of a number of vehicle sub-systems within a vehicle. Shown in Figure 1 are an electronic control unit (ECU) 2 which is in communication with two main types of user interface: a touch-enabled display screen 4 and a number of hard user-operable controls 6 (e.g. buttons, switches, dials etc.). The display screen 4 and control switches 6 are part of an instrument cluster 8 within the vehicle and represent human-machine interfaces (HMIs) that enable the user to interact with the vehicle.
The display screen 4 may be used to display various information types to the user (e.g. time information, radio information, GPS mapping data etc.) as well as enabling the user to interact with the vehicle via the use of on-screen buttons on the touch-enabled screen and via user gestures that may be captured by the touch-enabled screen. The display screen may also display interfaces in which a user may select contact names from an address book, enter a telephone number into a carphone system, operate a radio or GPS mapping system etc. The control switches 6 may represent more traditional mechanisms for the user to operate various vehicle systems such as an air conditioning system, transmission mode selector etc. Information for display on the display means may be communicated to the display screen from the electronic control unit (ECU) 2. Control inputs entered either via the touch-enabled display screen or the control switches may be communicated to the electronic control unit for controlling vehicle sub-systems.
In addition to the human-machine interfaces it is noted that the electronic control unit is in communication with sensors (10, 12, 14, 16) providing a variety of sensor data about vehicle sub-systems. As shown in Figure 1 the ECU receives data regarding the steering wheel position 10, the vehicle speed 12 and the position of the accelerator and brake pedals 14.
The ECU may receive further sensor data from other on-vehicle sensors, e.g. passenger status data via passenger seat sensors 16. Potentially the ECU may also receive further sensor data from sensors that are not part of the vehicle, e.g. smartphone sensor data.
The ECU is further in communication with a data store 18 for storing historical data relating to user interaction with the various vehicle sub-systems.
The ECU may monitor user interaction with the HMIs (4, 6) and store the monitored interactions in the data store 18. The ECU 2 may monitor a number of different aspects of the user's interaction with the interfaces. For example, as far as the user's interaction with the display screen 4 is concerned, the ECU may, from a given starting interface, monitor the time taken for the user to reach and select a given function (the "end" interface point). The time spent on any given screen (in particular intermediate screens between start and end interface points) may be monitored. Any "Back" button presses between the start and end interface points may also be monitored (this may provide an indication of "incorrect" user selections made as the user attempts to navigate from the start interface point to the end interface point).
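The per-traversal monitoring described above (elapsed time, intermediate screens, "Back" presses) could be captured in a small session log. The class below and its clock-injection scheme are illustrative, not taken from the patent:

```python
import time

class InteractionSession:
    """Logs one traversal from a start interface to an end interface:
    screens visited, 'Back' presses and elapsed time. The clock is
    injectable so the class can be exercised deterministically."""

    def __init__(self, start_screen, clock=time.monotonic):
        self.clock = clock
        self.t0 = clock()
        self.screens = [start_screen]
        self.back_presses = 0

    def visit(self, screen, via_back=False):
        self.screens.append(screen)
        if via_back:
            self.back_presses += 1  # hints at an earlier incorrect selection

    def summary(self):
        return {"duration": self.clock() - self.t0,
                "screens": list(self.screens),
                "back_presses": self.back_presses}

ticks = iter([0.0, 4.5])  # fake clock: start time, then summary time
session = InteractionSession("home", clock=lambda: next(ticks))
session.visit("settings")
session.visit("home", via_back=True)  # wrong turn corrected via "Back"
session.visit("phone")
print(session.summary()["back_presses"])  # 1
```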
A user's stress level and workload may also be monitored during their interaction with the HMI. Workload monitoring may comprise monitoring vehicle speed and acceleration and steering wheel inputs. Stress levels may be measured by monitoring galvanic skin response or changes in skin temperature.
The ECU 2 may monitor user interaction with the vehicle's physical controls over time (e.g. over a period of a few days), over a number of vehicle journeys or over a number of ignition key cycles.
After having monitored the user's interaction with the HMI and stored the results of this monitoring in the data store 18, the results of the monitored interactions may subsequently be used to provide help to the user on the basis of their past behaviour.
For example, as a result of the monitored user interactions the ECU 2 may determine that the user is not using certain functionality that is available to them. As a consequence of this determination the ECU 2 may provide help prompts to the user to inform them of the unused functionality.
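Detecting unused functionality of this kind reduces to a set difference between the options available and the options the user has actually selected; the helper names and the prompt wording below are hypothetical:

```python
def unused_options(available, used):
    """Options offered by the interface that the user has never selected;
    each is a candidate for a help prompt."""
    return sorted(set(available) - set(used))

def help_prompt(option):
    # Hypothetical wording; the patent leaves the message format open.
    return f'Tip: the "{option}" option is also available on this screen.'

for option in unused_options(["swipe", "scroll buttons", "shortcut"],
                             ["scroll buttons"]):
    print(help_prompt(option))
```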
Figures 2 to 10 show an example useful for understanding the present invention.
Figure 2 shows a standard interface 20 for a phonebook on a display screen 4 in a vehicle.
The phonebook may be stored within a data store on the vehicle or conveniently the vehicle may be arranged to dock (either via a physical connector or via a wireless connection, such as Bluetooth®) with a user's phone and display the contacts list from the user's phone on the vehicle display screen.
The phonebook shown in Figure 2 comprises five contact entries. On-screen scroll buttons 22 are provided to the left of the interface and a "Back" button 24 is provided to the top left portion of the interface.
The interface 20 of Figure 2 may represent an intermediate interface between a "Home" screen (i.e. a starting interface) and a screen in which a call is initiated (e.g. selecting one of the contact entries in the interface shown in Figure 2 may bring up a user prompt asking if a call should be placed to the contact in question. Such a screen could represent an end interface). Alternatively a call may be initiated directly from the interface shown in Figure 2 in which case the interface shown would represent the end interface in the user interaction with the HMI.
In order to reach the interface shown in Figure 2 the user may be required to select from the Home screen one or more on-screen buttons (e.g. select "Phone", select "Contact list" etc.).
In the example of the interface shown in Figure 2, as noted above, "up" and "down" scroll buttons 22 have been provided in order to allow the user to move up and down the list of contacts in the Phonebook. However, further interaction gestures may be supported such as a "swipe up" or "swipe down" motion to allow the user to move through the list of contacts.
If the ECU 2 determines that the user is not using the available "swipe" gestures it may prompt the user via a number of different methods.
For example, as shown in Figure 3 an overlay dialog box 26 has been provided that provides a text based help prompt to the user to inform them of the additional functionality. In the example of Figure 3 the user is required to select a further on-screen button (the "Done" button 28) to dismiss the dialog box 26.
A further prompt method is shown in Figure 4 in which an overlay dialog box 30 has been provided with a mixture of text based help and a visual prompt (in Figure 4 the visual prompt is provided by the arrow to the right of the dialog box).
A yet further prompt method is shown in Figure 5 in which an overlay dialog box 32 that obscures the entire Phonebook interface is displayed. This prompt method represents a variation to that shown in Figure 3.
A still further prompt method is shown in Figure 6 in which a dialog box 34 has been overlaid in the bottom right corner of the interface 20 of Figure 2. Rather than providing the help instructions (as in Figures 3-5) the help prompt dialog box 34 of Figure 6 just indicates that help is available if the user wishes to access it.
The prompt methods of Figures 3-6 show on-screen prompt options that are related to the interface of Figure 2. It is noted that alternative prompt methods may be available, e.g. the user may be provided with an audio warning or a pre-recorded audio message.
As well as highlighting unused gestures (as in the example of Figures 2 to 6) the ECU may also highlight shortcut options that are available to the user. Figure 7 shows an on-screen dialog box 36 highlighting an alternative option to returning to a Home screen.
As well as help relating to the user's interaction with the HMI, the ECU may also provide help prompts that relate to unused vehicle functions. Figure 8 shows an on-screen dialog box 38 that has highlighted a "vehicle tip" relating to the use of the vehicle's traction control functions.
As with the help prompts highlighted with respect to Figures 2 to 6, the on-screen help prompts of Figures 7 and 8 may also be provided via alternative means, such as an audio prompt.
Figure 9 shows a physical button station 40 that has been adapted to display a help prompt.
In the Figure a number of buttons 42 have been highlighted by means of increased illumination in order to prompt the user as to user interaction options that are available to them. Alternatively, the user options that are highlighted may correspond to a suggestion to the user (e.g. if the outside temperature is low, the "Winter" driving mode may be illuminated more than the other options on the button station).
Figure 10 is a flow chart of a control method according to an example useful for understanding the present invention for providing help prompts to a vehicle user.
In step 100, the ECU 2 (or other suitable processor within the vehicle) monitors a user's interaction with the various HMIs available to them (e.g. the physical buttons 6 or touch-enabled display screen 4 shown in Figure 1). The monitoring of the user's interaction with the HMIs may comprise monitoring the user between two menu points including time taken to traverse the menu options, any "Back" button events (indicating an incorrect menu selection by the user), user workload, user stress.
In step 102, the user's interaction with the HMIs may be checked against the interactions available to the user. For example, the ECU may have the optimal route between any two menu points stored in the data store and the user's progression between two menu points may be compared to the optimal route.
In step 104, the ECU may provide a help prompt to the user based on the comparison of the user's interaction with the vehicle compared to pre-determined available/optimal user interactions. For example, the ECU may determine that there are unused interaction options available to the user and the help prompt may highlight such unused options. Such unused interaction options may comprise alternative menu interaction options (e.g. "swipe" gesture versus "up"/"down" buttons) or hidden functionality (e.g. screen shortcuts) or unused vehicle related functionality (e.g. the user has selected function mode A but has not selected a related mode B). The ECU may also determine that a user is selecting on-screen buttons on the display screen in error.
The prompt may comprise an on-screen prompt (which may be text based, icon based or a mixture of the two) or an audio prompt. Additional prompt types may comprise altering the appearance of an HMI (e.g. illumination of a physical button or illuminating an on-screen button or changing the size of an on-screen button if a user consistently misses an available on-screen button).
It is noted that a user may be presented with the help prompt at the end of a given user interaction or the next time the ECU determines that the user is performing the given user interaction again.
In an embodiment of the present invention, as shown in Figures 11 to 16, a human-machine interface may be configured such that user-selectable options with a high utilisation frequency are presented with a greater level of prominence to a user.
Figure 11 shows a typical interface 52 for a phone interface on a display screen 4 as shown in Figure 1. A 0-9 numeric dial pad 54 is provided along with a number of other user-selectable buttons such as a "settings" button 56, a "mute" button 58, a "voicemail" button 60, a button 62 to switch from a hands-free mode of operation to a handset mode of operation, a "delete" button 64, a "phonebook" button 66 to display a contacts list similar to that shown in Figure 2 and a "Back" button 68 to return to a higher menu level. A display area 70 is also provided to show the dialled number. The last 10 dialled numbers may be accessed via button 72.
As discussed above the ECU may monitor a user's interaction with the human-machine interfaces within a vehicle. As well as using the monitored interactions to determine when to display a help prompt, the monitored interactions may also be analysed to determine the user-selectable options that the user most often uses. In this manner, high use user-selectable options may be identified. If the ECU subsequently determines that the user's current usage pattern includes one or more high use user-selectable options then the display of such options may be modified such that they are more prominently visible to the user. The interface may therefore be adapted such that unwanted or unneeded options are given a lower prominence relative to the high use options thereby reducing the information load on a user of the vehicle.
Figure 12 shows the interface 52 of Figure 11 that has been modified to highlight the most commonly used user-selectable options and it can be seen that the numeric 0-9 buttons 54, the "Delete" button 64, the "Phonebook" button 66 and the "Last 10" dialled number buttons 72 have been highlighted by changing the appearance of the buttons. In Figure 12 the change of appearance comprises a different background colour for the buttons and the removal of the text on the buttons (e.g. the letters "ABC" have been removed from button "2" etc.).
Figure 13 shows the interface of Figure 11 that has also been modified to highlight the most commonly used user-selectable options. In this Figure the high use buttons (54, 64, 66, 68, 72) have had the text shown in Figure 11 removed and the low use buttons (including buttons 56, 58, 60 and 62) have had their background colour changed to reduce their prominence.
Figure 14 shows a further modification of the user interface of Figure 11 in which the font colour of the low use buttons 56, 58, 60, 62 has been changed to reduce their prominence.
In Figure 15 the low use buttons have been removed (see area 74) from the interface 52 entirely. A "More" button 76 has been added to indicate a further level of menu options where the removed low use buttons may be located.
Figures 11 to 15 relate to an interface that is shown on a display screen within the vehicle. In the context of a physical button array (e.g. an array as shown in Figure 9) a regularly used feature may be illuminated with a greater level of intensity.
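The illumination behaviour for a physical button array could, as a hypothetical sketch, map each button's usage count to a backlight intensity. The patent only states that a regularly used feature may be lit with greater intensity; the mapping, function name and intensity range below are assumptions:

```python
def illumination_level(count, max_count, base=0.3, peak=1.0):
    """Map a button's usage count to a backlight intensity in [base, peak].

    Hypothetical linear mapping; `base` keeps rarely used buttons
    dimly visible rather than dark.
    """
    if max_count == 0:
        return base  # no usage data yet: uniform baseline illumination
    return base + (peak - base) * min(count, max_count) / max_count
```

The most-used button in the array would then receive full intensity, with other buttons scaled down proportionally.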
Figure 16 is a flow chart of a control method according to an example useful for understanding the present invention for highlighting high use user-selectable options to a vehicle user.
In Step 106, the ECU (or other suitable processor within the vehicle) monitors a user's interaction with the various HMIs available to them (e.g. the physical buttons or touch-enabled display screen shown in Figure 1). The monitoring of the user's interaction with the HMIs may comprise monitoring the user's progress between two menu points, including the time taken to traverse the menu options, any "Back" button events (indicating an incorrect menu selection by the user), user workload and user stress.
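The per-traversal metrics named in Step 106 could be gathered by a small recorder like the sketch below. This is illustrative only and under assumed names; the patent does not specify how the metrics are captured:

```python
import time

class InteractionMonitor:
    """Record traversal time and 'Back' events between two menu points.

    Illustrative sketch; the patent only lists these metrics as things
    that may be monitored, not how.
    """

    def __init__(self):
        self.traversals = []  # list of (seconds_taken, back_presses)

    def start(self):
        # User begins navigating from one menu point towards another.
        self._t0 = time.monotonic()
        self._backs = 0

    def back_pressed(self):
        # A 'Back' event suggests an incorrect menu selection.
        self._backs += 1

    def finish(self):
        # User has reached the target menu point.
        self.traversals.append((time.monotonic() - self._t0, self._backs))
```

A long traversal time or repeated "Back" presses could then feed both the help-prompt logic discussed earlier and the usage analysis of Step 108.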
In step 108, the ECU 2 may determine user-selectable options that are frequently selected by a user. Such high use user-selectable options may be stored within the data store.
In Step 110, the ECU 2 adapts the display of a human-machine interface to highlight the high use user-selectable options to the user. It is noted that the prominence of user-selectable options may conveniently be re-configured at a key cycle to avoid the user interfaces changing during a journey. The re-configuring of the user interfaces may comprise changing the appearance of buttons or options within an interface. Additionally, lower use options may be removed from an interface screen and placed one menu level down beneath a "More" button 76.
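The re-configuration in Step 110 amounts to partitioning the options on a menu level. A minimal sketch, assuming the option names and the function signature (none of which appear in the patent):

```python
def reconfigure_menu(options, high_use):
    """Partition menu options at a key cycle: high-use options stay on
    the top level; low-use options move beneath a single 'More' entry
    (cf. button 76 in Figure 15). Illustrative sketch only.
    """
    top = [o for o in options if o in high_use]
    more = [o for o in options if o not in high_use]
    if more:
        top.append("More")  # single access point to the demoted options
    return top, more
```

Running this once per ignition key event, rather than mid-journey, matches the note above about avoiding interface changes during a journey.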
Aspects of the present invention extend to the method and system described in the following numbered clauses.
Clause 1. A method of adapting a human-machine interface within a vehicle, the method comprising: monitoring user interaction with the human-machine interface within the vehicle; determining the frequency with which the user selects available user-selectable interface options within the human-machine interface; adapting the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.
Clause 2. A method according to Clause 1, wherein the human-machine interface is adapted between ignition key events.
Clause 3. A method according to Clause 1, wherein the human-machine interface comprises a display screen and adapting the interface comprises changing the appearance of certain on-screen elements in comparison to other on-screen elements.
Clause 4. A method according to Clause 3, wherein changing the appearance of on-screen elements comprises changing font colour of text within the element.
Clause 5. A method according to Clause 3, wherein changing the appearance of on-screen elements comprises changing the colour of the element.
Clause 6. A method according to Clause 3, wherein changing the appearance of on-screen elements comprises changing the appearance of high use interface options relative to low use interface options.
Clause 7. A method according to Clause 6, wherein changing the appearance of on-screen elements comprises increasing the brightness of high use interface options relative to low use interface options.
Clause 8. A method according to Clause 6, wherein changing the appearance of on-screen elements comprises decreasing the brightness of low use interface options relative to the high use interface options.
Clause 9. A method according to Clause 3, wherein changing the appearance of on-screen elements comprises removing certain on-screen elements from a given menu level of a menu hierarchy.
Clause 10. A method according to Clause 9, comprising placing the removed on-screen elements in a lower menu level than the given menu level and providing a single access button on the given menu level in order to access the removed elements.
Clause 11. A method according to Clause 3, wherein changing the appearance of on-screen elements comprises moving certain on-screen elements within a menu level of a menu hierarchy.
Clause 12. A method according to Clause 3, wherein changing the appearance of on-screen elements comprises reordering certain on-screen elements within a menu level of a menu hierarchy.
Clause 13. A method according to Clause 1, where the human-machine interface comprises a physical button array and adapting the interface comprises changing the illumination levels of at least one button in the array.
Clause 14. A system for adapting a human-machine interface within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interface within the vehicle; processing means arranged to monitor user interaction with the human-machine interface within the vehicle, to determine the frequency with which the user selects available user-selectable interface options within the human-machine interface and to adapt the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.

Claims (16)

CLAIMS:
1. A method of operating a human-machine interface within a vehicle, the method comprising: monitoring user interaction with the human-machine interface within the vehicle; determining the frequency with which the user selects available user-selectable interface options within the human-machine interface; adapting the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.
2. A method as claimed in Claim 1, wherein the human-machine interface is adapted between ignition key events.
3. A method as claimed in any preceding claim, wherein the human-machine interface comprises a display screen and adapting the interface comprises changing the appearance of certain on-screen elements in comparison to other on-screen elements.
4. A method as claimed in Claim 3, wherein changing the appearance of on-screen elements comprises changing font colour of text within the element.
5. A method as claimed in Claim 3 or Claim 4, wherein changing the appearance of on-screen elements comprises changing the colour of the element.
6. A method as claimed in any one of Claims 3 to 5, wherein changing the appearance of on-screen elements comprises changing the appearance of high use interface options relative to low use interface options.
7. A method as claimed in Claim 6, wherein changing the appearance of on-screen elements comprises increasing the brightness of high use interface options relative to low use interface options.
8. A method as claimed in Claim 6, wherein changing the appearance of on-screen elements comprises decreasing the brightness of low use interface options relative to the high use interface options.
9. A method as claimed in any one of Claims 3 to 8, wherein changing the appearance of on-screen elements comprises removing certain on-screen elements from a given menu level of a menu hierarchy.
10. A method as claimed in Claim 9, comprising placing the removed on-screen elements in a lower menu level than the given menu level and providing a single access button on the given menu level in order to access the removed elements.
11. A method as claimed in any one of Claims 3 to 10, wherein changing the appearance of on-screen elements comprises moving certain on-screen elements within a menu level of a menu hierarchy.
12. A method as claimed in any one of Claims 3 to 11, wherein changing the appearance of on-screen elements comprises reordering certain on-screen elements within a menu level of a menu hierarchy.
13. A method as claimed in any preceding claim, where the human-machine interface comprises a physical button array and adapting the interface comprises changing the illumination levels of at least one button in the array.
14. A system for adapting a human-machine interface within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interface within the vehicle; processing means arranged to monitor user interaction with the human-machine interface within the vehicle, to determine the frequency with which the user selects available user-selectable interface options within the human-machine interface and to adapt the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.
15. A vehicle having a system, or adapted to perform a method, as claimed in any preceding claim.
16. A method, system or vehicle substantially as herein described in relation to Figures 11 to 16.
GB1315618.7A 2013-09-03 2013-09-03 Human-machine interface Withdrawn GB2519936A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1315618.7A GB2519936A (en) 2013-09-03 2013-09-03 Human-machine interface


Publications (2)

Publication Number Publication Date
GB201315618D0 GB201315618D0 (en) 2013-10-16
GB2519936A true GB2519936A (en) 2015-05-13

Family

ID=49397186

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1315618.7A Withdrawn GB2519936A (en) 2013-09-03 2013-09-03 Human-machine interface

Country Status (1)

Country Link
GB (1) GB2519936A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002362188A (en) * 2001-06-11 2002-12-18 Denso Corp Cabin operation device
JP2004178363A (en) * 2002-11-28 2004-06-24 Sony Electronics Inc Terminal device
JP2005062978A (en) * 2003-08-20 2005-03-10 Seiko Epson Corp Information processor, display control method, and program executing it on computer
WO2007063714A1 (en) * 2005-11-29 2007-06-07 Matsushita Electric Industrial Co., Ltd. I/o device, i/o method, and program thereof
US20110258581A1 (en) * 2010-04-14 2011-10-20 Wei-Han Hu Method for adjusting size of an icon and related handheld device
JP2011232913A (en) * 2010-04-27 2011-11-17 Sharp Corp Information terminal device
WO2013085856A1 (en) * 2011-12-09 2013-06-13 Microsoft Corporation Adjusting user interface elements




Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)