GB2517792A - Human-machine interface - Google Patents

Human-machine interface

Info

Publication number
GB2517792A
Authority
GB
United Kingdom
Prior art keywords
user
user interaction
vehicle
options
menu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1315619.5A
Other versions
GB201315619D0 (en)
GB2517792B (en)
Inventor
Simon Thompson
Lee Skrypchuk
Jean-Jacques Loeillet
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1315619.5A
Publication of GB201315619D0
Publication of GB2517792A
Application granted
Publication of GB2517792B
Legal status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/60Substation equipment, e.g. for use by subscribers including speech amplifiers
    • H04M1/6033Substation equipment, e.g. for use by subscribers including speech amplifiers for providing handsfree use or a loudspeaker mode in telephone sets
    • H04M1/6041Portable telephones adapted for handsfree use
    • H04M1/6075Portable telephones adapted for handsfree use adapted for handsfree use in a vehicle
    • H04M1/6083Portable telephones adapted for handsfree use adapted for handsfree use in a vehicle by interfacing with the vehicle audio system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/29Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/3003Monitoring arrangements specially adapted to the computing system or computing system component being monitored
    • G06F11/3041Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is an input/output interface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/453Help systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/18Information management
    • B60K2360/186Displaying information according to relevancy
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/22Psychological state; Stress level or workload
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/26Devices for calling a subscriber
    • H04M1/27Devices whereby a plurality of signals may be stored simultaneously
    • H04M1/274Devices whereby a plurality of signals may be stored simultaneously with provision for storing more than one subscriber number at a time, e.g. using toothed disc
    • H04M1/2745Devices whereby a plurality of signals may be stored simultaneously with provision for storing more than one subscriber number at a time, e.g. using toothed disc using static electronic memories, e.g. chips
    • H04M1/27467Methods of retrieving data
    • H04M1/2747Scrolling on a display


Abstract

A method of providing help prompts relating to the use of human-machine interfaces within a vehicle, the method comprising: monitoring user interaction with the human-machine interfaces within the vehicle; assessing a monitored user interaction against pre-determined available user interaction options; and outputting a help prompt to the user in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options. Aspects of the invention include the pre-determined available user interaction options comprising: optimal routes through a menu hierarchy between interface menu levels; shortcut buttons to jump between interface menu levels; alternative user input interaction methods (e.g. swipe gestures versus on-screen scroll buttons); and available user-selectable menu options at a given menu level within a menu hierarchy. The help prompt may comprise a visual dialog box on the display screen, a text-based help message, an audio message or icon/graphical help, and may relate to available user input interaction methods, unused user-selectable menu options or unused shortcut functions. Monitoring user interaction may comprise monitoring user workload, wherein the workload is determined via one or more of vehicle speed, acceleration, steering inputs to the steering wheel, traffic density, road situation etc.

Description

HUMAN-MACHINE INTERFACE
TECHNICAL FIELD
The present disclosure relates to a human-machine interface and particularly, but not exclusively, to methods and systems for configuring a human-machine interface in a vehicle to improve usability. Aspects of the invention relate to a system, to a method and to a vehicle.
BACKGROUND
Within a vehicle cabin there will usually be a range of controls that a user may interact with to select/alter various on-board vehicle sub-systems. Such "human-machine interfaces" (HMIs) may allow control of air conditioning systems, selection of a vehicle drive mode, access to vehicle entertainment systems and may also allow a suitable mobile telecommunications device to pair with the vehicle systems (e.g. via a Bluetooth® or physical connection).
HMIs within the cabin may take the form of physical controls (which term is taken to include switches, buttons, control knobs etc.) or display screens that may be touch-enabled.
Modern vehicles may contain a range of HMIs and the operation of such HMIs may increase workload on the vehicle user. There may also be a training burden for new users of a vehicle or where additional functions are provided via an upgrade (e.g. to operating software for a display screen).
It is therefore an object of the present invention to provide a method and system for improving user interaction with the human-machine interfaces within a vehicle.
SUMMARY OF THE INVENTION
Aspects of the invention provide a method, a system and a vehicle as claimed in the appended claims.
According to another aspect of the present invention there is provided a method of providing help prompts relating to the use of human-machine interfaces within a vehicle, the method comprising: monitoring user interaction with the human-machine interfaces within the vehicle; assessing a monitored user interaction against pre-determined available user interaction options; outputting a help prompt to the user in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.
Embodiments of the present invention provide a method for dynamically assessing whether a user of a human-machine interface (HMI) requires help with their interaction. By monitoring the user's interaction against a known set of available interaction options the present invention is capable of outputting a help prompt. For example, if the system assesses that a user is repeatedly omitting to select certain functionality that is available then a help prompt may be sent highlighting the missed functionality. Additionally, if a user's action is deemed to fall below a certain efficiency threshold (e.g. a user's path between two menu points is not the shortest available) then a help prompt may be output highlighting mechanisms by which their interaction with the HMI can be improved.
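By way of illustration, the efficiency check described above may be sketched as follows. The menu graph, the breadth-first shortest-path measure and the efficiency threshold are illustrative assumptions rather than details taken from the disclosure:

```python
from collections import deque

def shortest_path_length(menu_graph, start, end):
    """Fewest screen transitions between two menu points (breadth-first search)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == end:
            return dist
        for nxt in menu_graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # end interface unreachable from start interface

def needs_help_prompt(menu_graph, observed_path, efficiency_threshold=1.5):
    """True when the user's route is substantially longer than the optimal route."""
    optimal = shortest_path_length(menu_graph, observed_path[0], observed_path[-1])
    if not optimal:
        return False
    return (len(observed_path) - 1) / optimal > efficiency_threshold
```

Under these assumptions a journey such as Home, Audio, Home, Phone, Contacts (four touches against an optimal two) would trigger a prompt, while the direct route would not.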
Conveniently, the human-machine interface may comprise a touch-enabled display screen.
Touch-enabled display screens may comprise screens which incorporate a touch sensitive element (e.g. a screen similar to those found in smart phones/tablet devices) and may also comprise screens where touch sensitive elements are disposed around the display screen (e.g. a screen arrangement similar to the Samsung® Galaxy S3 smart phone which incorporates touch sensitive buttons beneath the main screen).
Monitoring the user interaction may comprise monitoring when the user touches the display screen. For example, each interaction with the display device (each "touch event") may be monitored.
Conveniently, monitoring user interaction may comprise monitoring a user as they move from a start interface on the display screen to an end interface, the start and end interfaces being different menu levels within a menu hierarchy. For example, the user may be monitored moving from a "welcome" screen to a telephone contacts screen.
In instances where the user moves between menu levels the action of monitoring user interaction may comprise recording instances where the user moves up a menu level during a menu interaction where the end interface is at a lower menu level than the start interface.
Furthermore, monitoring user interaction may also or additionally comprise recording the time taken for the user to move from the start to the end interface.
The pre-determined available user interaction options may comprise optimal routes through a menu hierarchy between two interface menu levels. Pre-determined available user interaction options may also comprise shortcut buttons to jump between interface menu levels. The interface levels may be non-adjacent levels within the menu hierarchy.
Pre-determined available user interaction options may also comprise alternative user input interaction methods (e.g. a "swipe" gesture and "up"/"down" buttons may comprise alternative methods of navigating through a list). The alternative user input interaction methods may comprise swipe gestures versus on-screen scroll buttons.
The pre-determined available user interaction options may comprise available user-selectable menu options at a given menu level within a menu hierarchy (e.g. for any given menu level the present invention may monitor with respect to all the available selectable menu/UI controls).
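The per-menu-level monitoring might, for example, reduce to a set comparison between the options available at a level and the options the user has actually selected. The option names and the message wording below are invented for illustration:

```python
def unused_options(available_options, selection_log):
    """User-selectable options at a menu level that were never chosen."""
    return sorted(set(available_options) - set(selection_log))

def help_prompt_text(available_options, selection_log):
    """Text for a help prompt highlighting missed functionality, or None."""
    missed = unused_options(available_options, selection_log)
    if not missed:
        return None
    return "Did you know these options are also available: " + ", ".join(missed)
```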
The help prompt may comprise a visual dialog box displayed on the display screen. In one embodiment the dialog box may comprise an overlay dialogue box. In an alternative arrangement a notification message may be displayed on the display screen. Such a notification message may be displayed in a dedicated part of the display screen or may be incorporated within the information already displayed on the screen.
The dialog box may comprise text based help message or icon/graphical based help.
The help prompt may comprise an audio message.
Preferably, the help prompt relates to available user input interaction methods, unused user-selectable menu options, unused shortcut functions.
The human-machine interface may comprise a physical control array (e.g. control buttons, control switches, dials, levers etc.) and monitoring user interaction may comprise monitoring user interaction with physical controls within the array. Furthermore, pre-determined available user interaction options may comprise available user-selectable control settings for a given vehicle settings configuration. For a physical control array the help prompt may comprise increasing illumination of certain controls relative to other buttons in the array.
Illumination of controls within a physical control array may be controlled in a number of ways.
For example, available user input interaction controls may have their illumination levels increased compared to other controls within the array. Alternatively, controls that will not have any effect in a given scenario may have their default illumination levels decreased. In one embodiment, the illumination of a single button within an instrument panel may be altered so as to guide the user to that button.
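One possible realisation of this illumination steering, sketched with invented control names and brightness levels:

```python
def button_brightness(all_buttons, active_options, high=1.0, dim=0.2):
    """Relative illumination per physical control: brighten valid options, dim the rest."""
    return {button: (high if button in active_options else dim)
            for button in all_buttons}
```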
Monitoring user interaction may comprise monitoring user workload. User workload may be determined via one or more of: vehicle speed; vehicle acceleration; steering inputs to a vehicle steering wheel; traffic density; road situation; driver eye tracking. Monitoring user interaction may comprise monitoring user stress, e.g. by monitoring galvanic skin response or changes in skin temperature.
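The workload signals listed above could be combined into a single score used to gate help prompts. The normalisation scales, weights and threshold below are assumptions made for the sketch, not values from the disclosure:

```python
def workload_score(speed_kph, accel_ms2, steering_rate_dps,
                   w_speed=0.4, w_accel=0.3, w_steer=0.3):
    """Weighted sum of vehicle signals, each clamped to the range [0, 1]."""
    def norm(value, full_scale):
        return max(0.0, min(1.0, abs(value) / full_scale))
    return (w_speed * norm(speed_kph, 130.0)
            + w_accel * norm(accel_ms2, 5.0)
            + w_steer * norm(steering_rate_dps, 180.0))

def may_show_prompt(score, threshold=0.6):
    """Hold back help prompts while the estimated driver workload is high."""
    return score < threshold
```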
According to a further aspect of the present invention, there is provided a system for providing help prompts relating to the use of human-machine interfaces within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interfaces within the vehicle; processing means arranged to monitor user interaction with the human-machine interfaces within the vehicle and to assess a monitored user interaction against pre-determined available user interaction options; and an output arranged to output a help prompt to the user, the help prompt being generated in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.
According to a first example useful for understanding the present invention there is provided a method of adapting a human-machine interface within a vehicle, the method comprising: monitoring user interaction with the human-machine interface within the vehicle; determining the frequency with which the user selects available user-selectable interface options within the human-machine interface; adapting the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.
The present example useful for understanding the present invention provides a method for adapting a human-machine interface (HMI) within a vehicle based on the user's usage pattern. The frequency with which a user interacts with available user-selectable interface options is monitored and the HMI adapted accordingly.
Conveniently, the human-machine interface may be adapted between ignition key events. In this way the HMI presents a consistent interface to a user during a drive cycle.
The human-machine interface may comprise a display screen and adapting the interface may comprise changing the appearance of certain on-screen elements in comparison to other on-screen elements. Changing the appearance of on-screen elements may comprise changing the font colour of text within the element or changing the colour of the element.
Conveniently, changing the appearance of on-screen elements may comprise changing the appearance of high use interface options relative to low use interface options. For example, the font colour of high use elements may be made more visible while low use interface elements may be changed to a less readable font colour.
Changing the appearance of on-screen elements may comprise increasing the brightness of high use interface options relative to low use interface options. Alternatively or additionally, changing the appearance of on-screen elements may comprise decreasing the brightness of low use interface options relative to the high use interface options.
Changing the appearance of on-screen elements may comprise removing certain on-screen elements from a given menu level of a menu hierarchy (e.g. low use elements may be removed such that high use elements remain on a given menu level). Conveniently, the method may further comprise placing the removed on-screen elements in a lower menu level than the given menu level and providing a single access button on the given menu level in order to access the removed elements.
Changing the appearance of on-screen elements may also comprise moving certain on-screen elements within a menu level of a menu hierarchy or reordering certain on-screen elements within a menu level of a menu hierarchy.
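Taken together, the adaptations above amount to reordering a menu level by usage frequency and demoting rarely used options behind a single access button. A sketch, with the usage-share threshold and the "More..." label chosen purely for illustration:

```python
from collections import Counter

def adapt_menu(options, selection_log, min_share=0.05):
    """Reorder a menu level by usage; demote low-use options behind one access button."""
    counts = Counter(selection_log)
    total = max(1, len(selection_log))
    high_use = [o for o in options if counts[o] / total >= min_share]
    low_use = [o for o in options if counts[o] / total < min_share]
    high_use.sort(key=lambda o: counts[o], reverse=True)  # most used first
    top_level = high_use + (["More..."] if low_use else [])
    return top_level, low_use
```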
In the event that the human-machine interface comprises a physical button array, adapting the interface may comprise changing the illumination levels of at least one button (or other physical control element such as a switch, dial, lever etc.) in the array.
According to a second example useful for understanding the present invention there is provided a system for adapting a human-machine interface within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interface within the vehicle; processing means arranged to monitor user interaction with the human-machine interface within the vehicle and to determine the frequency with which the user selects available user-selectable interface options within the human-machine interface and to adapt the human-machine interface in dependence on the frequency with which the user selects the available user-selectable interface options within the human-machine interface.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a representation of a vehicle system including human-machine interfaces, control components and an electronic control unit;
Figures 2 to 9 show a human-machine interface within a vehicle operating in accordance with an embodiment of the present invention;
Figure 10 is a flow chart of a control method in accordance with the human-machine interactions depicted in Figures 2 to 9;
Figures 11 to 15 show a human-machine interface within a vehicle operating in accordance with an example useful for understanding the present invention; and
Figure 16 is a flow chart of a control method in accordance with the human-machine interactions depicted in Figures 11 to 15.
DETAILED DESCRIPTION
Figure 1 shows a representation of a number of vehicle sub-systems within a vehicle. Shown in Figure 1 is an electronic control unit (ECU) 2, which is in communication with two main types of user interface: a touch-enabled display screen 4 and a number of hard user-operable controls 6 (e.g. buttons, switches, dials etc.). The display screen 4 and control switches 6 are part of an instrument cluster 8 within the vehicle and represent human-machine interfaces (HMIs) that enable the user to interact with the vehicle.
The display screen 4 may be used to display various information types to the user (e.g. time information, radio information, GPS mapping data etc.) as well as enabling the user to interact with the vehicle via the use of on-screen buttons on the touch-enabled screen and via user gestures that may be captured by the touch-enabled screen. The display screen may also display interfaces in which a user may select contact names from an address book, enter a telephone number into a car-phone system, operate a radio or GPS mapping system etc. The control switches 6 may represent more traditional mechanisms for the user to operate various vehicle systems such as an air conditioning system, transmission mode selector etc. Information for display on the display means may be communicated to the display screen from the electronic control unit (ECU) 2. Control inputs entered either via the touch-enabled display screen or the control switches may be communicated to the electronic control unit for controlling vehicle sub-systems.
In addition to the human-machine interfaces it is noted that the electronic control unit is in communication with sensors (10, 12, 14, 16) providing a variety of sensor data about vehicle sub-systems. As shown in Figure 1 the ECU receives data regarding the steering wheel position 10, the vehicle speed 12 and the position of the accelerator and brake pedals 14.
The ECU may receive further sensor data from other on-vehicle sensors, e.g. passenger status data via passenger seat sensors 16. Potentially the ECU may also receive further sensor data from sensors that are not part of the vehicle, e.g. smartphone sensor data.
The ECU is further in communication with a data store 18 for storing historical data relating to user interaction with the various vehicle sub-systems.
The ECU may monitor user interaction with the HMIs (4, 6) and store the monitored interactions in the data store 18. The ECU 2 may monitor a number of different aspects of the user's interaction with the interfaces. For example, as far as the user's interaction with the display screen 4 is concerned, the ECU may, from a given starting interface, monitor the time taken for the user to reach and select a given function ("end" interface point). The time spent on any given screen (in particular intermediate screens between start and end interface points) may be monitored. Any "Back" button presses between the start and end interface points may also be monitored (this may provide an indication of "incorrect" user selections made as the user attempts to navigate from the start interface point to the end interface point).
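The per-journey record described above might be modelled as follows; the class and field names are assumptions made for the sketch, not taken from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class InteractionSession:
    """One monitored journey through the menu hierarchy."""
    start_screen: str
    touch_events: list = field(default_factory=list)  # (screen, timestamp) pairs

    def record_touch(self, screen, timestamp):
        self.touch_events.append((screen, timestamp))

    @property
    def back_presses(self):
        """Count of 'Back' selections, a proxy for incorrect menu selections."""
        return sum(1 for screen, _ in self.touch_events if screen == "Back")

    def elapsed(self):
        """Time from the first to the last recorded touch event."""
        if len(self.touch_events) < 2:
            return 0.0
        return self.touch_events[-1][1] - self.touch_events[0][1]
```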
A user's stress level and workload may also be monitored during their interaction with the HMI. Workload monitoring may comprise monitoring vehicle speed, acceleration and steering wheel inputs. Stress levels may be measured by monitoring galvanic skin response and changes in skin temperature.
The ECU 2 may monitor user interaction with the vehicle's physical controls over time (e.g. over a period of a few days), over a number of vehicle journeys or over a number of ignition key cycles.
After having monitored the user's interaction with the HMI and stored the results of this monitoring in the data store 18, the results of the monitored interactions may subsequently be used to provide help to the user on the basis of their past behaviour.
For example, as a result of the monitored user interactions the ECU 2 may determine that the user is not using certain functionality that is available to them. As a consequence of this determination the ECU 2 may provide help prompts to the user to inform them of the unused functionality.
Figure 2 shows a standard interface 20 for a phonebook on a display screen 4 in a vehicle.
The phonebook may be stored within a data store on the vehicle or conveniently the vehicle may be arranged to dock (either via a physical connector or via a wireless connection, such as Bluetooth®) with a user's phone and display the contacts list from the user's phone on the vehicle display screen.
The phonebook shown in Figure 2 comprises five contact entries. On-screen scroll buttons 22 are provided to the left of the interface and a "Back" button 24 is provided in the top left portion of the interface.
The interface 20 of Figure 2 may represent an intermediate interface between a "Home" screen (i.e. a starting interface) and a screen in which a call is initiated (e.g. selecting one of the contact entries in the interface shown in Figure 2 may bring up a user prompt asking if a call should be placed to the user in question. Such a screen could represent an end interface). Alternatively a call may be initiated directly from the interface shown in Figure 2 in which case the interface shown would represent the end interface in the user interaction with the HMI.
In order to reach the interface shown in Figure 2 the user may be required to select from the Home screen one or more on-screen buttons (e.g. select "Phone", select "Contact list" etc.).
In the example of the interface shown in Figure 2, as noted above, "up" and "down" scroll buttons 22 have been provided in order to allow the user to move up and down the list of contacts in the Phonebook. However, further interaction gestures may be supported such as a "swipe up" or "swipe down" motion to allow the user to move through the list of contacts.
If the ECU 2 determines that the user is not using the available "swipe" gestures it may prompt the user via a number of different methods.
For example, as shown in Figure 3 an overlay dialog box 26 has been provided that provides a text based help prompt to the user to inform them of the additional functionality. In the example of Figure 3 the user is required to select a further on-screen button (the "Done" button 28) to dismiss the dialog box 26.
A further prompt method is shown in Figure 4 in which an overlay dialog box 30 has been provided with a mixture of text based help and a visual prompt (in Figure 4 the visual prompt is provided by the arrow to the right of the dialog box).
A yet further prompt method is shown in Figure 5 in which an overlay dialog box 32 that obscures the entire Phonebook interface is displayed. This prompt method represents a variation to that shown in Figure 3.
A still further prompt method is shown in Figure 6 in which a dialog box 34 has been overlaid in the bottom right corner of the interface 20 of Figure 2. Rather than providing the help instructions (as in Figures 3-5) the help prompt dialog box 34 of Figure 6 just indicates that help is available if the user wishes to access it.
The prompt methods of Figures 3-6 show on-screen prompt options that are related to the interface of Figure 2. It is noted that alternative prompt methods may be available, e.g. the user may be provided with an audio warning or a pre-recorded audio message.
As well as highlighting unused gestures (as in the example of Figures 2 to 6) the ECU may also highlight shortcut options that are available to the user. Figure 7 shows an on-screen dialog box 36 highlighting an alternative option to returning to a Home screen.
As well as help relating to the user's interaction with the HMI, the ECU may also provide help prompts that relate to unused vehicle functions. Figure 8 shows an on-screen dialog box 38 that has highlighted a "vehicle tip" relating to the use of the vehicle's traction control functions.
As with the help prompts highlighted with respect to Figures 2 to 6, the on-screen help prompts of Figures 7 and 8 may also be provided via alternative means, such as an audio prompt.
Figure 9 shows a physical button station 40 that has been adapted to display a help prompt.
In the Figure a number of buttons 42 have been highlighted by means of increased illumination in order to prompt the user as to user interaction options that are available to them. Alternatively, the user options that are highlighted may correspond to a suggestion to the user (e.g. if the outside temperature is low, the "Winter" driving mode may be illuminated more than the other options on the button station).
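The temperature-based suggestion described above can be sketched in a few lines; the function name, brightness scale and temperature threshold below are illustrative assumptions, not values specified in the patent.

```python
# Hypothetical sketch of context-based button-station highlighting (Figure 9).
# Brightness scale (0-100) and the 3 degC threshold are illustrative only.

def button_brightness(mode: str, outside_temp_c: float,
                      base: int = 40, highlight: int = 100) -> int:
    """Return an illumination level for a driving-mode button.

    When the outside temperature is low, the "Winter" mode button is lit
    more brightly than the other options to suggest it to the user.
    """
    if mode == "Winter" and outside_temp_c <= 3.0:
        return highlight
    return base

# Illumination levels for a cold day (1.5 degC outside)
levels = {m: button_brightness(m, outside_temp_c=1.5)
          for m in ("Normal", "Sport", "Winter", "Eco")}
```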
Figure 10 is a flow chart of a control method according to the present invention for providing help prompts to a vehicle user.
In step 100, the ECU 2 (or other suitable processor within the vehicle) monitors a user's interaction with the various HMIs available to them (e.g. the physical buttons 6 or touch-enabled display screen 4 shown in Figure 1). The monitoring of the user's interaction with the HMIs may comprise monitoring the user between two menu points, including the time taken to traverse the menu options, any "Back" button events (indicating an incorrect menu selection by the user), user workload and user stress.
In step 102, the user's interaction with the HMIs may be checked against the interactions available to the user. For example, the ECU may have the optimal route between any two menu points stored in the data store and the user's progression between two menu points may be compared to the optimal route.
In step 104, the ECU may provide a help prompt to the user based on the comparison of the user's interaction with the vehicle against pre-determined available/optimal user interactions. For example, the ECU may determine that there are unused interaction options available to the user and the help prompt may highlight such unused options. Such unused interaction options may comprise alternative menu interaction options (e.g. "swipe" gesture versus "up"/"down" gesture), hidden functionality (e.g. screen shortcuts) or unused vehicle related functionality (e.g. the user has selected function mode A but has not selected a related mode B). The ECU may also determine that a user is selecting on-screen buttons on the display screen in error.
The prompt may comprise an on-screen prompt (which may be text based, icon based or a mixture of the two) or an audio prompt. Additional prompt types may comprise altering the appearance of an HMI (e.g. illumination of a physical button, illuminating an on-screen button or changing the size of an on-screen button if a user consistently misses an available on-screen button).
It is noted that a user may be presented with the help prompt at the end of a given user interaction or the next time the ECU determines that the user is performing the given user interaction again.
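Steps 100 to 104 can be illustrated with a minimal sketch that compares a monitored menu traversal against a stored optimal route and decides whether a prompt is warranted. The route store, message text and thresholds are illustrative assumptions rather than anything specified in the patent.

```python
# Illustrative sketch of the Figure 10 method: monitor a traversal between
# two menu points, assess it against a stored optimal route, and output a
# help prompt when the interaction deviates. All names/values hypothetical.

OPTIMAL_ROUTES = {
    # (start interface, end interface) -> optimal sequence of screens
    ("Home", "Phonebook"): ["Home", "Phone", "Phonebook"],
}

def assess_interaction(start, end, observed_path, back_events, elapsed_s,
                       max_extra_steps=1, max_time_s=10.0):
    """Return a help-prompt message, or None if no prompt is needed."""
    optimal = OPTIMAL_ROUTES.get((start, end))
    if optimal is None:
        return None  # no stored route to compare against
    if back_events > 0:
        # A "Back" event indicates an incorrect menu selection
        return "Tip: a more direct route to this screen is available."
    extra_steps = len(observed_path) - len(optimal)
    if extra_steps > max_extra_steps or elapsed_s > max_time_s:
        return "Tip: a quicker way to reach this screen is available."
    return None
```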
In an example useful for understanding the invention a human-machine interface may be configured such that user-selectable options with a high utilisation frequency are presented with a greater level of prominence to a user.
Figure 11 shows a typical interface 52 for a phone interface on a display screen 4 as shown in Figure 1. A 0-9 numeric dial pad 54 is provided along with a number of other user-selectable buttons such as a "settings" button 56, a "mute" button 58, a "voicemail" button 60, a button 62 to switch from a hands-free mode of operation to a handset mode of operation, a "delete" button 64, a "phonebook" button 66 to display a contacts list similar to that shown in Figure 2 and a "Back" button 68 to return to a higher menu level. A display area 70 is also provided to show the dialled number. The last 10 numbers may be accessed via button 72.
As discussed above the ECU may monitor a user's interaction with the human-machine interfaces within a vehicle. As well as using the monitored interactions to determine when to display a help prompt, the monitored interactions may also be analysed to determine the user-selectable options that the user most often uses. In this manner, high use user-selectable options may be identified. If the ECU subsequently determines that the user's current usage pattern includes one or more high use user-selectable options then the display of such options may be modified such that they are more prominently visible to the user. The interface may therefore be adapted such that unwanted or unneeded options are given a lower prominence relative to the high use options, thereby reducing the information load on a user of the vehicle.
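The identification of high use options from the monitored interactions amounts to frequency counting over a selection log, as in the short sketch below; the option identifiers and the cut-off are hypothetical.

```python
from collections import Counter

# Minimal sketch of identifying "high use" user-selectable options from a
# log of monitored selections. Option names and top_n are illustrative.

def high_use_options(selection_log, top_n=4):
    """Return the top_n most frequently selected option identifiers."""
    counts = Counter(selection_log)
    return [opt for opt, _ in counts.most_common(top_n)]

# Example log of monitored button selections
log = ["dial", "phonebook", "dial", "last10", "mute", "dial", "phonebook"]
```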
Figure 12 shows the interface 52 of Figure 11 that has been modified to highlight the most commonly used user-selectable options and it can be seen that the numeric 0-9 buttons 54, the "Delete" button 64, the "Phonebook" button 66 and the "Last 10" dialled number buttons 72 have been highlighted by changing the appearance of the buttons. In Figure 12 the change of appearance comprises a different background colour for the buttons and the removal of the text on the buttons (e.g. the letters "ABC" have been removed from button "2" etc.).
Figure 13 shows the interface of Figure 11 that has also been modified to highlight the most commonly used user-selectable options. In this Figure the high use buttons (54, 64, 66, 68, 72) have had the text shown in Figure 11 removed and the low use buttons (including buttons 56, 58, 60 and 62) have had their background colour changed to reduce their prominence.
Figure 14 shows a further modification of the user interface of Figure 11 in which the font colour of the low use buttons 56, 58, 60, 62 has been changed to reduce their prominence.
In Figure 15 the low use buttons have been removed (see area 74) from the interface 52 entirely. A "More" button 76 has been added to indicate a further level of menu options where the removed low use buttons may be located.
Figures 11 to 15 relate to an interface that is shown on a display screen within the vehicle. In the context of a physical button array (e.g. an array as shown in Figure 9) a regularly used feature may be illuminated with a greater level of intensity.
Figure 16 is a flow chart of a control method according to an example useful for understanding the present invention for highlighting high use user-selectable options to a vehicle user.
In Step 106, the ECU (or other suitable processor within the vehicle) monitors a user's interaction with the various HMIs available to them (e.g. the physical buttons or touch-enabled display screen shown in Figure 1). The monitoring of the user's interaction with the HMIs may comprise monitoring the user between two menu points, including the time taken to traverse the menu options, any "Back" button events (indicating an incorrect menu selection by the user), user workload and user stress.
In step 108, the ECU 2 may determine user-selectable options that are frequently selected by a user. Such high use user-selectable options may be stored within the data store.
In Step 110, the ECU 2 adapts the display of a human-machine interface to highlight the high use user-selectable options to the user. It is noted that the prominence of user-selectable options may conveniently be re-configured at a key cycle to avoid the user interfaces changing during a journey. The re-configuring of the user interfaces may comprise changing the appearance of buttons or options within an interface. Additionally, lower use options may be removed from an interface screen and placed a further menu level down beneath a "More" button 76.
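The key-cycle re-configuration of Step 110 can be sketched as a ranking of buttons by recorded use, with the lowest-ranked options moved beneath a "More" entry. The data structures and the number of prominent slots are illustrative assumptions.

```python
# Illustrative sketch of the Figure 16 re-configuration step: at a key
# cycle, rank an interface's buttons by recorded use, keep the high use
# options prominent, and move the remainder under a "More" sub-menu.

def reconfigure_layout(buttons, use_counts, keep=6):
    """Split buttons into (prominent, under_more) lists by recorded use.

    sorted() is stable, so buttons with equal use counts retain their
    original on-screen order.
    """
    ranked = sorted(buttons, key=lambda b: use_counts.get(b, 0), reverse=True)
    prominent, hidden = ranked[:keep], ranked[keep:]
    if hidden:
        prominent = prominent + ["More"]  # entry point for demoted options
    return prominent, hidden
```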
Aspects of the invention extend to the features described in the following numbered clauses: Clause 1. A method of providing help prompts relating to the use of human-machine interfaces within a vehicle, the method comprising: monitoring user interaction with the human-machine interfaces within the vehicle; assessing a monitored user interaction against pre-determined available user interaction options; and outputting a help prompt to the user in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.
Clause 2. A method according to Clause 1, wherein the human-machine interface comprises a touch-enabled display screen.
Clause 3. A method according to Clause 2, wherein monitoring the user interaction comprises monitoring when the user touches the display screen.
Clause 4. A method according to Clause 2, wherein monitoring user interaction comprises monitoring a user as they move from a start interface on the display screen to an end interface, the start and end interfaces being different menu levels within a menu hierarchy.
Clause 5. A method according to Clause 4, wherein monitoring user interaction comprises recording instances where the user moves up a menu level during a menu interaction where the end interface is at a lower menu level than the start interface.
Clause 6. A method according to Clause 4, wherein monitoring user interaction comprises recording the time taken for the user to move from the start to the end interface.
Clause 7. A method according to Clause 2, wherein pre-determined available user interaction options comprise optimal routes through a menu hierarchy between interface menu levels.
Clause 8. A method according to Clause 2, wherein pre-determined available user interaction options comprise shortcut buttons to jump between interface menu levels.
Clause 9. A method according to Clause 2, wherein pre-determined available user interaction options comprise alternative user input interaction methods.
Clause 10. A method according to Clause 9, wherein alternative user input interaction methods comprise swipe gestures versus on-screen scroll buttons.
Clause 11. A method according to Clause 2, wherein pre-determined available user interaction options comprise available user-selectable menu options at a given menu level within a menu hierarchy.
Clause 12. A method according to Clause 2, wherein the help prompt comprises a visual dialog box displayed on the display screen.
Clause 13. A method according to Clause 12, wherein the dialog box comprises a text based help message.
Clause 14. A method according to Clause 12, wherein the dialog box comprises icon/graphical based help.
Clause 15. A method according to Clause 2, wherein the help prompt comprises an audio message.
Clause 16. A method according to Clause 2, wherein the help prompt relates to available user input interaction methods, unused user-selectable menu options, unused shortcut functions.
Clause 17. A method according to Clause 1, wherein the human-machine interface comprises a physical control array.
Clause 18. A method according to Clause 17, wherein monitoring user interaction comprises monitoring user interaction with physical controls within the array.
Clause 19. A method according to Clause 17, wherein the pre-determined available user interaction options comprise available user-selectable control settings for a given vehicle settings configuration.
Clause 20. A method according to Clause 17, wherein the help prompt comprises increasing illumination of certain controls relative to other controls in the array.
Clause 21. A method according to Clause 1, wherein monitoring user interaction comprises monitoring user workload.
Clause 22. A method according to Clause 21, wherein user workload is determined via one or more of: vehicle speed; vehicle acceleration; steering inputs to a vehicle steering wheel; traffic density; road situation; driver eye tracking.
Clause 23. A method according to Clause 1, wherein monitoring user interaction comprises monitoring user stress.
Clause 24. A system for providing help prompts relating to the use of human-machine interfaces within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interfaces within the vehicle; a processor arranged to monitor user interaction with the human-machine interfaces within the vehicle and to assess a monitored user interaction against pre-determined available user interaction options; and an output arranged to output a help prompt to the user, the help prompt being generated in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.

Claims (26)

  1. A method for operating a human-machine interface within a vehicle, the method comprising: monitoring user interaction with the human-machine interfaces within the vehicle; assessing a monitored user interaction against pre-determined available user interaction options; and outputting a help prompt to the user in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.
  2. A method as claimed in Claim 1, wherein the human-machine interface comprises a touch-enabled display screen.
  3. A method as claimed in Claim 2, wherein monitoring the user interaction comprises monitoring when the user touches the display screen.
  4. A method as claimed in either Claim 2 or Claim 3, wherein monitoring user interaction comprises monitoring a user as they move from a start interface on the display screen to an end interface, the start and end interfaces being different menu levels within a menu hierarchy.
  5. A method as claimed in Claim 4, wherein monitoring user interaction comprises recording instances where the user moves up a menu level during a menu interaction where the end interface is at a lower menu level than the start interface.
  6. A method as claimed in either Claim 4 or Claim 5, wherein monitoring user interaction comprises recording the time taken for the user to move from the start to the end interface.
  7. A method as claimed in any one of Claims 2 to 6, wherein pre-determined available user interaction options comprise optimal routes through a menu hierarchy between interface menu levels.
  8. A method as claimed in any one of Claims 2 to 7, wherein pre-determined available user interaction options comprise shortcut buttons to jump between interface menu levels.
  9. A method as claimed in any one of Claims 2 to 8, wherein pre-determined available user interaction options comprise alternative user input interaction methods.
  10. A method as claimed in Claim 9, wherein alternative user input interaction methods comprise swipe gestures versus on-screen scroll buttons.
  11. A method as claimed in any one of Claims 2 to 10, wherein pre-determined available user interaction options comprise available user-selectable menu options at a given menu level within a menu hierarchy.
  12. A method as claimed in any one of Claims 2 to 11, wherein the help prompt comprises a visual dialog box displayed on the display screen.
  13. A method as claimed in Claim 12, wherein the dialog box comprises a text based help message.
  14. A method as claimed in either one of Claims 12 or 13, wherein the dialog box comprises icon/graphical based help.
  15. A method as claimed in any one of Claims 2 to 14, wherein the help prompt comprises an audio message.
  16. A method as claimed in any one of Claims 2 to 15, wherein the help prompt relates to available user input interaction methods, unused user-selectable menu options, unused shortcut functions.
  17. A method as claimed in any preceding claim, wherein the human-machine interface comprises a physical control array.
  18. A method as claimed in Claim 17, wherein monitoring user interaction comprises monitoring user interaction with physical controls within the array.
  19. A method as claimed in either Claim 17 or Claim 18, wherein the pre-determined available user interaction options comprise available user-selectable control settings for a given vehicle settings configuration.
  20. A method as claimed in any one of Claims 17 to 19, wherein the help prompt comprises increasing illumination of certain controls relative to other controls in the array.
  21. A method as claimed in any preceding claim, wherein monitoring user interaction comprises monitoring user workload.
  22. A method as claimed in Claim 21, wherein user workload is determined via one or more of: vehicle speed; vehicle acceleration; steering inputs to a vehicle steering wheel; traffic density; road situation; driver eye tracking.
  23. A method as claimed in any preceding claim, wherein monitoring user interaction comprises monitoring user stress.
  24. A system for controlling the operation of one or more human-machine interfaces within a vehicle, the system comprising: an input arranged to receive data relating to user interaction with the human-machine interfaces within the vehicle; processing means arranged to monitor user interaction with the human-machine interfaces within the vehicle and to assess a monitored user interaction against pre-determined available user interaction options; and an output arranged to output a help prompt to the user, the help prompt being generated in dependence on deviations between the monitored user interaction and the pre-determined available user interaction options.
  25. A vehicle comprising a system, or adapted to perform a method, as claimed in any preceding claim.
  26. A system, a method or a vehicle substantially as herein described in relation to Figures 1 to 10.
GB1315619.5A 2013-09-03 2013-09-03 Human-machine interface Active GB2517792B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1315619.5A GB2517792B (en) 2013-09-03 2013-09-03 Human-machine interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1315619.5A GB2517792B (en) 2013-09-03 2013-09-03 Human-machine interface

Publications (3)

Publication Number Publication Date
GB201315619D0 GB201315619D0 (en) 2013-10-16
GB2517792A true GB2517792A (en) 2015-03-04
GB2517792B GB2517792B (en) 2018-02-07

Family

ID=49397187

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1315619.5A Active GB2517792B (en) 2013-09-03 2013-09-03 Human-machine interface

Country Status (1)

Country Link
GB (1) GB2517792B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2559123A (en) * 2017-01-24 2018-08-01 Sony Interactive Entertainment Inc Interaction apparatus and method
CN114091817A (en) * 2021-10-15 2022-02-25 岚图汽车科技有限公司 Vehicle human-computer interaction intelligent degree evaluation method and related equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002007019A (en) * 2000-06-22 2002-01-11 Hitachi Ltd System for automatically displaying/erasing help guidance
WO2007108839A2 (en) * 2006-03-17 2007-09-27 Matsushita Electric Industrial Co., Ltd. Human machine interface method and device for automotive entertainment systems
CN101674375A (en) * 2009-09-02 2010-03-17 优视动景(北京)技术服务有限公司 Display method of helping prompt for mobile communication terminal and system thereof

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6998972B2 (en) * 2002-10-31 2006-02-14 General Motors Corporation Driving workload estimation
US7292152B2 (en) * 2003-06-12 2007-11-06 Temic Automotive Of North America, Inc. Method and apparatus for classifying vehicle operator activity state
US7428449B2 (en) * 2006-03-14 2008-09-23 Temic Automotive Of North America, Inc. System and method for determining a workload level of a driver
JP2013539572A (en) * 2010-07-29 2013-10-24 フォード グローバル テクノロジーズ、リミテッド ライアビリティ カンパニー Method for managing driver interface tasks and vehicle
GB2500581B (en) * 2012-03-23 2014-08-20 Jaguar Land Rover Ltd Method and system for controlling the output of information to a driver based on an estimated driver workload


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2559123A (en) * 2017-01-24 2018-08-01 Sony Interactive Entertainment Inc Interaction apparatus and method
US11188358B2 (en) 2017-01-24 2021-11-30 Sony Interactive Entertainment Inc. Interaction apparatus and method
CN114091817A (en) * 2021-10-15 2022-02-25 岚图汽车科技有限公司 Vehicle human-computer interaction intelligent degree evaluation method and related equipment

Also Published As

Publication number Publication date
GB201315619D0 (en) 2013-10-16
GB2517792B (en) 2018-02-07

Similar Documents

Publication Publication Date Title
KR102176305B1 (en) Method and device for representing recommended operating actions of a proposal system and interaction with the proposal system
US8910086B2 (en) Method for controlling a graphical user interface and operating device for a graphical user interface
US9361000B2 (en) Information display device for vehicle
JP5565421B2 (en) In-vehicle operation device
US9238409B2 (en) Steering wheel and integrated touchpads for inputting commands
US8527900B2 (en) Motor vehicle
JP6747835B2 (en) Image display
US9933885B2 (en) Motor vehicle operating device controlling motor vehicle applications
KR102216299B1 (en) Methods and assemblies for interaction with proposed systems with automated manipulation actions
US10139988B2 (en) Method and device for displaying information arranged in lists
US20180307405A1 (en) Contextual vehicle user interface
CN107704184B (en) Method for operating a device and operating device
US20140181749A1 (en) User interface device and program for the same
JP2016097928A (en) Vehicular display control unit
KR101558354B1 (en) Blind control system for vehicle
JP2015007841A (en) Information display device for vehicle
GB2517792A (en) Human-machine interface
JP2018010472A (en) In-vehicle electronic equipment operation device and in-vehicle electronic equipment operation method
JP2014119917A (en) On-vehicle information processing device
US20240109418A1 (en) Method for operating an operating device for a motor vehicle, and motor vehicle having an operating device
KR20190134978A (en) Method for dynamically adapting an operating device in an motor vehicle and operating device and motor vehicle
GB2519936A (en) Human-machine interface
US10168858B2 (en) Method for displaying information in a vehicle, and a device for controlling the display
CN203766531U (en) Device for displaying and operating functional groups and/or functions
KR102349649B1 (en) The method for controlling display and operation of the graphic user interface