GB2528083A - System and method for automated device control for vehicles using driver emotion - Google Patents


Info

Publication number
GB2528083A
GB2528083A (application GB1412166.9A)
Authority
GB
United Kingdom
Prior art keywords
user
vehicle
emotion state
vehicle device
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1412166.9A
Other versions
GB201412166D0 (en)
GB2528083B (en)
Inventor
Thomas Popham
Linh Nguyen
Joan Barcelo Llado
Adam Grzywaczewski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1412166.9A priority Critical patent/GB2528083B/en
Publication of GB201412166D0 publication Critical patent/GB201412166D0/en
Priority to EP15733760.1A priority patent/EP3166833A1/en
Priority to PCT/EP2015/065240 priority patent/WO2016005289A1/en
Publication of GB2528083A publication Critical patent/GB2528083A/en
Application granted granted Critical
Publication of GB2528083B publication Critical patent/GB2528083B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60K28/02 Safety devices for propulsion-unit control responsive to conditions relating to the driver
    • B60K28/04 Safety devices responsive to presence or absence of the driver, e.g. to weight or lack thereof
    • B60W2040/0872 Driver physiology
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0083 Setting, resetting, calibration
    • B60W2050/0088 Adaptive recalibration
    • B60W2540/043 Identity of occupants
    • B60W2540/22 Psychological state; Stress level or workload
    • B60W2540/221 Physiology, e.g. weight, heartbeat, health or special needs
    • B60W2555/20 Ambient conditions, e.g. wind or rain
    • B60W2556/10 Historical data
    • B60W2556/50 External transmission to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

An automated device control system, for use in a vehicle, comprises a user identification module 102 which identifies a vehicle user, and a monitor module 106 that obtains a value of a parameter indicative of the user emotion state, e.g. alertness, tiredness etc. The system has an input 104 for obtaining control instructions from the user for control of a vehicle device 114, such as a cabin heater, air conditioning, radio, HMI feedback etc. A training module 110 uses the user's vehicle device control instructions, together with the parameter values indicative of user emotion states, to train an algorithm which, based on that training, issues automated control instructions to a vehicle device 114 in response to a measured parameter indicative of the user emotion state. For example, if the training algorithm determines that the user turns on the radio in 90% of cases during one emotion state, the system assumes that the radio should be turned on when that emotion state is detected. Reference is also made to a method and a computer/processor.

Description

SYSTEM AND METHOD FOR AUTOMATED DEVICE CONTROL FOR
VEHICLES USING DRIVER EMOTION
TECHNICAL FIELD
The present invention relates to systems and methods for automated device control for vehicles using driver emotion, and vehicles using the same.
BACKGROUND OF THE INVENTION
Device control systems for vehicles are well known. Various vehicle devices and systems are accessible to vehicle users, for example electronically actuated devices, and devices managed by on-board computer systems. For example, a vehicle may provide a user with manually or electronically operated climate control, media selection options via a human-machine interface (HMI), seat adjustment and massage devices, navigation device settings, and the like.
Previous systems allow a car driver to choose settings for such features on a console or HMI, in response to which a control system actuates the relevant vehicle devices.
These systems demand user input for most instances where any of the devices are operated, and typically require complex user input in order to address each device in turn, and to obtain settings appropriate for the user. In addition, the number of devices operable by a user can be large, increasing complexity; typically lengthy and unwieldy HMI menu systems are required. Many actions are repeated frequently; for example, users typically press the same sequence of buttons in their daily commute.
Furthermore, previous systems are unresponsive to changes in the emotional state of the driver. For example, the amount of information presented and requested from the driver (e.g. via a console or HMI) remains fixed in all circumstances. This means that the control system may give too much unnecessary information to the user in certain situations, or insufficient information in others.
The present invention has been devised to mitigate or overcome at least some of the above-mentioned problems.
SUMMARY OF INVENTION
Aspects and embodiments of the invention are set out in the accompanying claims.
In general terms, one embodiment of an aspect of the invention can provide an automated device control system for use in a vehicle, comprising: a user identification module configured to identify the vehicle user; a user monitor module configured to obtain a value of a parameter indicative of a user emotion state; an input for obtaining an instruction from a user for control of a vehicle device; and a training module configured to use user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm configured to provide, based on the training, an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state.
This allows operation of vehicle devices to be automated, removing complexity and the need for repetitive user control instructions. In addition, vehicle device activity, instruction capability and response can be controlled according to an indicated emotion state of the user. This allows vehicle devices to be more responsive, or more appropriately responsive, in particular circumstances.
The system may obtain values for a plurality of user emotion state parameters, and obtain a plurality of user vehicle device control instructions, at each training step of the training module. The system may be integrated into a vehicle, or provided in a separate module or device, usable within or integratable into the vehicle. The emotion state parameter value measured may be associated with a particular or instant user instruction, or the parameter values may be for the respective user vehicle device instructions.
Preferably, the monitor module and the training module are configured to: obtain the value of a parameter from an input device; and retrieve a user emotion flag associated in a stored record with that value of the parameter. These features may be carried out by one or other of the monitor or training modules, or shared between the two.
More preferably, the monitor module and the training module are configured to retrieve for a plurality of parameter values obtained an associated user emotion flag; and process the returned user emotion flags in order to determine a representation of a user emotion state.
Suitably, the monitor module is configured to record the parameter value indicative of the user emotion state at the time of input of the user vehicle device control instruction.
In an embodiment, the input is configured to obtain a plurality of types of user vehicle device control instruction, and a type of instruction addresses one of: a user seat activity device; a climate control device; a media selection device; a navigation device; and an HMI device.
Suitably, the user monitor module is configured to obtain values of a plurality of types of parameter indicative of a user emotion state. The parameter indicative of a user emotion state may simply be a time or location.
In an embodiment, the parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record a condition parameter. For example, the devices may record the time at instruction, a time elapsed (since a previous record or event), or a climate parameter, such as a temperature or humidity measurement, inside or outside the vehicle.
In an alternative, the parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record user activity. Preferably, the parameter indicative of a user emotion state is obtained from one (or more) of: a camera; a smartphone; a vehicle management device; a network connection; a navigation device; and a location device.
The system may include a control module or processor configured to receive the automated vehicle device control instruction and to control the vehicle device in accordance with the automated vehicle device control instruction.
One embodiment of another aspect of the invention can provide a method of automating device control for a vehicle, comprising the steps of: obtaining a value of a parameter indicative of a user emotion state, obtaining an instruction from a user for control of a vehicle device, using user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm, and providing an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state based on the training.
One embodiment of another aspect of the invention can provide a vehicle comprising a system as described in any of the above described embodiments.
Further aspects of the invention comprise computer programs or applications which, when loaded into or run on a computer, cause the computer to become a system, or to carry out methods, according to the aspects described above.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible.
The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described by way of example with reference to the accompanying drawings, in which: Figure 1 is a diagram illustrating an automated vehicle device control system according to an embodiment of the invention; Figure 2 is a diagram illustrating steps associated with an automated vehicle device control system according to an embodiment of the invention; and Figure 3 is a diagram illustrating a number of options for inputs and outputs to and from an automated vehicle device control system according to an embodiment of the invention.
DETAILED DESCRIPTION
Embodiments of the invention automate the operation of devices around a vehicle in order to reduce user or driver workload. Some devices or features may be automated using simple rules; however, knowing whether to automate a particular feature may depend upon the driver state (e.g. tired, busy, stressed).
In general terms, the system is trained or learns the preferences of the vehicle user, by recording inputs and interactions whilst monitoring parameters and factors which may indicate a mood or emotion of the user.
An advantage of these embodiments is that, because the system is trained based on user inputs, and on factors indicating the emotion state of the user, the automation will be more accurate, and the user is more likely to accept that the vehicle automatically does things for the user.
In embodiments, the solution involves three main processing steps: 1) determining the driver emotion state from sensor and other inputs; 2) learning the driver preferences for each vehicle feature given the driver emotion; 3) automating and adjusting vehicle features given the learned patterns and currently detected emotions.
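As a purely illustrative sketch of the three processing steps above (function names, data shapes and the 90% threshold are invented, not taken from the patent), the pipeline might be wired together as:

```python
# Hypothetical sketch of the three processing steps; not the patented
# implementation, just one way the described flow could be structured.

def determine_emotion_state(sensor_readings):
    """Step 1: map sensor readings to a coarse emotion label (stub).
    Each reading is a dict with a 'suggests' hint for illustration."""
    tired_votes = sum(1 for r in sensor_readings if r.get("suggests") == "tired")
    return "tired" if tired_votes > len(sensor_readings) / 2 else "neutral"

def learn_preferences(log):
    """Step 2: count how often each device action follows each emotion state.
    log is a list of (emotion, action) pairs gathered during driving."""
    counts = {}
    for emotion, action in log:
        counts.setdefault(emotion, {}).setdefault(action, 0)
        counts[emotion][action] += 1
    return counts

def automate(counts, emotion, threshold=0.9):
    """Step 3: emit the actions whose observed frequency under this
    emotion state exceeds the (illustrative) threshold."""
    actions = counts.get(emotion, {})
    total = sum(actions.values())
    return [a for a, n in actions.items() if total and n / total >= threshold]
```

In this sketch the learned preferences are simple frequency counts; the patent leaves the learning technique open (neural networks, SVMs, k-nearest neighbour and clustering are all mentioned later).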
If a pattern of user inputs against monitored user states is consistent enough, the system can ask the user to confirm if they wish automation of vehicle features to begin. The user can select automation of all or some of the features, or postpone automation until further learning/training has occurred. Once the user has approved a "self-learning" mode, the car device management system will, whenever that user uses the vehicle, detect the parameters to assess the emotion state of that user, and execute the sequence of device actuations according to the patterns established for an identified emotion state. For example, at the beginning of a journey if a certain emotion state is inferred, the system may set a cabin temperature, and select a particular radio station.
The monitored or determined emotion state indicators may be as simple as a time or location; for example a direct link could simply be established that at certain times or locations, certain device instructions will be expected. For example, the system may determine directly that the radio is always turned on in the early morning, and therefore on detecting such a time, instruct the radio to be activated.
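A direct time-to-device link of this kind could be sketched as follows; the function and the learned time windows are hypothetical, standing in for whatever association the system has established:

```python
from datetime import time

def radio_rule(now, learned_windows):
    """Return True if the current time falls inside a window in which the
    user was observed to consistently turn the radio on (hypothetical
    direct time -> device-instruction link)."""
    return any(start <= now <= end for start, end in learned_windows)
```

For example, a window of 06:00 to 09:00 would cause the radio to be activated on any early-morning journey start.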
Figure 1 illustrates an automated vehicle device control system according to an embodiment of the invention. The system in this embodiment is wholly contained within a vehicle 100; however, in other embodiments parts of the system (particularly those indicated inside system 112) may be embodied in peripheral devices, on server systems providing cloud data and processing, or in a mobile device.
The system includes an array of monitoring devices (102) to identify the user of the vehicle (i.e. the vehicle driver) and to estimate a mood or emotion state of the user. In this embodiment, this can include sensors, a camera (still or video), monitors, and connections to such devices, and connections to (or communication devices to connect to) user devices such as smartphones, and to a network. The system also includes the usual input systems (104) for a user to interact with vehicle devices. These can include console buttons, dials, levers and the like, the input from which can be measured by the monitoring system. The system also has a human-machine interface (HMI) device, with which the user interacts. Here the user can enter various vehicle device control instructions, such as setting a cabin temperature, turning on a massage seat, or setting a route via a navigation device. The monitoring system can also be updated by electronic feedback from manually adjusted devices, such as electric windows or a sunroof; these devices can be connected to the management system to feed back their use.
Before and/or whilst instructions are made, the monitoring devices 102 measure factors contributing to or influencing the user's mood. For example, the weather may be monitored (by on-board climate sensors, or by connection to a networked weather data resource), along with the cabin temperature, a duration of journey, and current driving characteristics of the user. The monitor module 106 receives measurements from the monitoring devices. Alternatively or additionally, the monitor module may record data from the input systems 104, such as the time of instruction, duration of the use of the device, or particular instructions given for example via the HMI.
The system is also configured to generate an assessment of the user emotion state from the data collected from the monitoring inputs 102. In this embodiment this processing is done in the monitoring module 106 (in other embodiments, this process may be shared with the training module, or undertaken solely in a module providing the training function). Systems for assessment of user emotion states from such data have been previously considered. In a simple embodiment, each value of parameters assessed by the monitoring devices (such as temperature, driving statistics, images captured and recognised as facial expressions) will correspond to a user emotion flag in stored records. On input of a parameter, the flag will be returned, and processing can be done on the plurality of flags returned, to assess a mood. For example, if most of the parameters input are associated with a "tired" mood flag, the system may infer that the user emotion state is "tired". Flags may of course be less specific, for example a record may give a number on a scale of a possible mood, such as 2/10 for tiredness, 8/10 for perceived happiness, or probabilities of certain moods, such as 20% tiredness. Combinations of scaled factors and probabilities can be more accurately used to assess mood.
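The flag lookup and aggregation described above might look like the following sketch; the flag table contents, parameter names and scores are invented for illustration only:

```python
# Illustrative flag table: each (parameter, value) pair returns scaled
# mood scores (0-1), mirroring the "2/10 tiredness"-style records in the
# text. The entries below are assumptions, not from the patent.

FLAG_TABLE = {
    ("cabin_temp", "high"): {"tired": 0.6},
    ("eyelid_level", "low"): {"tired": 0.9},
    ("driving_style", "smooth"): {"relaxed": 0.7},
}

def assess_mood(observations):
    """Look up the mood scores for each observed (parameter, value) pair,
    sum them per mood, and return the mood with the highest total."""
    totals = {}
    for obs in observations:
        for mood, score in FLAG_TABLE.get(obs, {}).items():
            totals[mood] = totals.get(mood, 0.0) + score
    return max(totals, key=totals.get) if totals else "unknown"
```

This corresponds to the simple "most flags win" inference in the text, generalised to scaled scores rather than binary flags.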
The user vehicle device instructions, monitored factors and parameters, and emotion state assessment from the monitoring module are passed to the training module 110. The training module gathers data during use of the vehicle, logging various vehicle device control instructions and other input data, and associated monitoring parameters and emotion states at the time. These are used to train an algorithm which is later used to predict the desired vehicle device statuses of a user when a given user emotion state arises during use of the vehicle.
The monitor module, training module and a processor 108 are in this embodiment contained in a computer or logic system 112.
Once the algorithm has been trained and activated by the user agreeing to its use, the system assesses a current user emotion state, using the monitoring devices and the monitoring module. Via the processor or controller 108, control instructions are issued to the vehicle devices 114 (cabin heater/air conditioning, music and speaker systems, HMI feedback, navigation devices and the like) to replicate the conditions that the trained algorithm predicts the user would wish to have for the measured emotion state.
For example, if a user is assessed as more active or less fatigued at the start of a journey early in the day, perhaps by facial recognition by the camera, or by recorded movement of the user's smartphone, the system will have been trained to instruct the vehicle devices appropriately, for example to maximise information feedback via HMI or other user feedback devices, or to set a more demanding route via a navigation device, which might not be chosen for a fatigued driver.
Determining driver emotion state
There are several methods for determining the driver emotion state. Examples of the monitoring devices used to assess mood are: a) camera - a still or video camera can be used to obtain images of the vehicle user, and previously considered methods can be used to recognise facial expressions and gestures of the user to determine emotion or mood data. For example, lower than usual eyelids may be recorded as a fatigued emotion state.
b) smartphone calendar - data from the user's smartphone or other device calendar can be assessed. For example, if the driver has been busy during a day, the mood may be assessed as tired; if a driver's meetings with a certain group or individual usually give rise to a stress response, this may be noted if the calendar has such a meeting that day. If the user is on a holiday, this may indicate a happier emotion state.
c) smartphone movement - motion sensors in the user's smartphone or other device may indicate how active the user has been (e.g. stationary for long periods, walking a lot), and indicate, for instance, a tired emotion state. Alternatively, other sensors or records in the device may record the number of calls answered, or the number of times the device was picked up and put down during the day - these may indicate a tired or stressed response.
d) driver style and workload - feedback from various devices in the vehicle (steering wheel monitor, accelerator and brake feedback, gyroscopes or motion detectors monitoring handling and cornering) may be assessed as indicating mood such as happiness/anger, alertness/tiredness or stress/relaxation.
e) social media posts - a network connection, or data log on a user device, can be used to monitor the user's recent posting habits. Previously considered methods have been used to assess mood from such postings, such as analysing the frequency of certain words and the lengths of sentences, or analysing the number of posts in specific periods.
f) weather - climate devices, such as the temperature, humidity and pressure gauges common in vehicles, can be used to provide possible mood indicators. For example, overcast weather may usually be recorded and returned as a greater likelihood of lower mood of the user.
g) time - the time of day may indicate mood; for example, it may be that the system has recorded tiredness indicators after 10pm regularly, so that once this time appears, the monitoring system may assess a higher likelihood of tiredness in the driver emotion state.
h) GPS location - certain locations may give rise to measurable responses. For example, it may be that the user at home is usually less fatigued, and happier.
i) driving duration - long journeys usually result in fatigue, and this can be recorded as an indicator, while journey time is monitored by the system.
As noted above, methods of determining emotion states from such inputs have been previously considered. Certain of these factors are direct indicators of possible mood and may be assessed as such. For example, processing images from a camera may give reliable indications of mood. Others may be more indirect - smartphone activity or calendar input may give indications backed up by other factors assessed at other times, or input directly by a user. For example, it may be that certain calendar events are stressful for a user, but this link might only be established by direct entry by the user confirming this, or by assessment of other behavioural factors (such as (a) to (i) above). In addition, these indirect factors may be given less weighting in a monitoring system's calculation of mood probability than, for example, video image assessment.
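One possible way to weight direct indicators above indirect ones is a weighted average of per-source mood probabilities; the weights and source names below are assumptions chosen for illustration, not values from the patent:

```python
# Illustrative weighting of mood indicators: direct sources (camera,
# driving style) count more than indirect ones (calendar, weather).
# Weight values are invented; unknown sources default to 0.5.

WEIGHTS = {"camera": 1.0, "driving_style": 0.8, "calendar": 0.3, "weather": 0.2}

def weighted_mood_probability(indicators):
    """indicators: list of (source, probability_of_mood) pairs.
    Returns the weight-normalised probability of that mood."""
    num = sum(WEIGHTS.get(src, 0.5) * p for src, p in indicators)
    den = sum(WEIGHTS.get(src, 0.5) for src, _ in indicators)
    return num / den if den else 0.0
```

With these weights, a strong camera-based tiredness cue dominates a contradictory calendar cue, matching the intuition that video assessment is more reliable.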
Combinations of factors may of course also be used to indicate mood. For example, journey duration and location measurements may indicate heavy traffic, which may be associated with a flag for that user of higher probability of stress.
Driving style may be used in combination with, for example a weather assessment or calendar function to distinguish between types of higher level mood. For example, if a driver is handling the vehicle more robustly, this may be an indication either of stress or excitement; a second indicator could be used to determine which of these is appropriate.
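The two-indicator disambiguation suggested above could be sketched as follows; the threshold, scores and indicator names are invented:

```python
# Illustrative disambiguation: vigorous handling alone is ambiguous
# (stress or excitement), so a second indicator breaks the tie.

def classify_arousal(handling_score, calendar_stressful):
    """handling_score: 0-1 measure of how vigorously the vehicle is being
    driven; calendar_stressful: bool from a (hypothetical) calendar check."""
    if handling_score < 0.5:
        return "calm"
    # High arousal: use the second indicator to separate stress from excitement.
    return "stressed" if calendar_stressful else "excited"
```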
Learning driver preferences
The training algorithm is configured to learn actions for each vehicle feature given the driver emotion. When trying to determine patterns of when the user operates certain controls, the algorithm checks whether the feature operation is correlated with a particular emotion state. For example, User A tends to operate the massage seats when they are tired, whereas User B operates the massage seats if they are stressed and have been driving for more than 30 minutes.
Determining the driver pattern also takes account of other inputs such as time of day, current location and set destination. The patterns are determined by a machine learning algorithm using standard or previously considered techniques.
The algorithm may be run on a neural network, or by support vector machine, or by other machine learning techniques, for example using k-nearest neighbour algorithms. The pattern determination may for example be by mathematical clustering. For example, if there are five different mood inputs, a five dimensional plot of the device instructions detected for these mood inputs can be generated.
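As one concrete reading of the k-nearest-neighbour option mentioned above, each journey can be treated as a point of five mood-input levels labelled with the device instruction observed, and a new journey classified by its nearest neighbours. This is an illustrative sketch only; the mood vectors, labels, and choice of k are invented, not taken from the patent.

```python
# Hypothetical k-NN sketch: each record is a five-dimensional vector
# of mood-input levels plus the device instruction the user gave.
from collections import Counter
import math

def knn_predict(history, query, k=3):
    """Return the majority instruction among the k mood vectors in
    `history` nearest (Euclidean distance) to `query`."""
    by_dist = sorted(history, key=lambda rec: math.dist(rec[0], query))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

history = [
    ((0.1, 0.2, 0.1, 0.3, 0.2), "radio_on"),    # low levels observed
    ((0.2, 0.1, 0.2, 0.2, 0.1), "radio_on"),
    ((0.9, 0.8, 0.7, 0.9, 0.8), "massage_on"),  # high levels observed
    ((0.8, 0.9, 0.8, 0.7, 0.9), "massage_on"),
]
result = knn_predict(history, (0.15, 0.15, 0.2, 0.25, 0.15))
print(result)  # radio_on
```

Mathematical clustering over the same five-dimensional plot would group such points without labels; k-NN is simply the labelled variant of the same geometric idea.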
It may therefore be that if three of the five mood inputs are noted at a low level, one type of emotion state may be determined, differing from the determination made if all five mood inputs are at a low level.
In an embodiment, the training algorithm can use the following steps to determine the automatic settings for each of the components.
1) The algorithm begins by logging the monitoring device inputs, suggested emotion states, and user inputs for every journey of the vehicle.
2) After around two weeks of data has been collected, histograms can be plotted for inputs and emotion states against vehicle device usage. For example, it may be that on early starts a user usually (but not always) displays tiredness indicators, which are assessed by the monitoring module, and turns on the radio.
3) Once the histogram is constructed, the probability of applying particular settings can be calculated and used as a basis for a decision function when applying automatic settings. For example, if the user turns on the radio in 90% of cases under that emotion state, the system may assume that this device instruction should be applied when this emotion state is detected. If the observed histograms contain enough data points, the system can propose to the driver to automate certain, or all, device features.
4) In the next drive cycle the user emotion state is assessed at the beginning of or during the journey and compared to the recorded probability distributions, and if observed probabilities are above a certain threshold the automatic device instructions are applied.
5) Once in operation, the model is constantly updated based on user interactions, in particular when previously unobserved emotion states are apparent.
6) If a user interaction contradicts the learnt model consistently over a number of consecutive journeys, the model is updated.
7) The usage patterns can be linked with personalisation, linking separate models to different users - for example, unique users can be identified by their unique key fobs, which transmit a signal to the vehicle instructing it to use that user's device control profile.
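Steps 1) to 4) above can be sketched as a simple log-and-threshold model. This is an assumption-laden illustration, not the patented implementation: the class and method names are invented, the 90% threshold follows the worked example in step 3), and `min_count` stands in for the unspecified "enough data points".

```python
# Sketch of the histogram-based decision function: log
# (emotion state, instruction) pairs per journey, then automate an
# instruction once its observed probability under a state clears a
# threshold. Names and constants are illustrative.
from collections import Counter, defaultdict

class PreferenceModel:
    def __init__(self, threshold=0.9, min_count=10):
        self.log = defaultdict(Counter)  # state -> instruction counts
        self.threshold = threshold       # e.g. the 90% example above
        self.min_count = min_count       # stand-in for "enough data"

    def record(self, state, instruction):
        self.log[state][instruction] += 1

    def automatic_instruction(self, state):
        counts = self.log[state]
        total = sum(counts.values())
        if total < self.min_count:
            return None                  # not enough data yet
        instr, n = counts.most_common(1)[0]
        return instr if n / total >= self.threshold else None

model = PreferenceModel()
for _ in range(9):
    model.record("tired_early_start", "radio_on")
model.record("tired_early_start", "radio_off")
print(model.automatic_instruction("tired_early_start"))  # radio_on
```

Steps 5) to 7) then amount to continuing to call `record()` in operation, and discounting or rebuilding the counts when user interactions consistently contradict the learnt preference.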
Figure 2 illustrates steps associated with an automated vehicle device control system according to an embodiment of the invention. Initially, in training mode, monitoring device and monitoring module inputs (202) assessing the user emotion state are input to the training module (206); for example, the user may be assessed as alert. The user device control instruction (204), such as tuning the radio to a certain station, is also passed to the training module. The algorithm is then trained on the basis of these inputs. If not enough data has yet been collected (208) to establish any patterns in emotion states correlated with user instructions, training continues with more user inputs. If enough data has been collected (208), the model is ready for implementation, typically on next use of the vehicle (although on a journey long enough for conditions to vary, the model could be used as soon as a new emotion state is detected). The monitoring devices assess the current emotion state of the user (210). If an emotion state can be determined sufficiently by the monitoring module, the algorithm then determines a vehicle device control instruction for the emotion state determined (212), to instruct the relevant vehicle device(s) (214). For example, the system may have determined that the user is alert, and the algorithm may have noted that this mood usually correlates with tuning to a certain radio station, so tuning the radio to that station is instructed.
Figure 3 is a diagram illustrating a number of options for inputs and outputs to and from an automated vehicle device control system according to an embodiment of the invention.
As listed above, there are various possible monitoring inputs and devices (302) which are available to contribute to the emotion state processing. In addition to those listed in Figure 3 and above, other monitoring devices may include a light sensor, which may help to indicate mood - this can be used with other weather sensors, or directly, for example to indicate a lower mood in overcast conditions or darkness. A microphone may record audio, and a processor may detect speech patterns of the user to other passengers or on an in-vehicle telephone or radio, which may indicate mood. For example, slow speech patterns may indicate tiredness.
One or more of the device inputs are used by the monitoring system to determine a driver emotion state (304). In the meantime, vehicle device instructions are noted (306). Examples of devices and features actuable by the user are climate settings, media selection, route selection, seat adjustment (including massage on/off), adjustment of other in-vehicle devices such as windows, HMI and console inputs, and message services. Any device operable by the user may be assessed in this way. The instructions are noted alongside the emotion state determined, and in this way the device preferences of the driver in certain moods are learned (308). These preferences can then be used to automatically generate device instructions for actuating the devices (310).
One example of an output device is the messaging or information presented to the user via an HMI or by audio feedback. The user may usually select minimal information or feedback in certain detected emotion states, for example in a stressed mood. The algorithm will learn this, and if such an emotion state is detected, restrict information presented to the user to a basic level, or to a minimum required for safety.
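The HMI-feedback example above can be sketched as a learnt mapping from emotion state to an information level that is never allowed to drop below a safety minimum. All names and level values here are hypothetical illustrations, not part of the patent.

```python
# Hypothetical sketch: a learnt preference maps a detected emotion
# state to an HMI verbosity level, clamped to a safety minimum so
# that safety-critical information is always presented.
SAFETY_MINIMUM = 1                       # safety messages always shown
DEFAULT_LEVEL = 2                        # used for unseen states

learnt_preference = {"stressed": 0,      # user prefers minimal feedback
                     "alert": 3}         # user accepts full feedback

def hmi_level(emotion_state):
    level = learnt_preference.get(emotion_state, DEFAULT_LEVEL)
    return max(level, SAFETY_MINIMUM)

print(hmi_level("stressed"))  # 1 (clamped up to the safety minimum)
print(hmi_level("alert"))     # 3
```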
The initialisation of the learning system will typically be prompted either by an HMI message suggesting that it be used or indicating that a pattern has been detected, or by the user activating the system manually. As noted above, the system will take notice of consistent (frequent and self-confirming) and contradictory (not in agreement with system predictions) settings made by the user, and re-train the algorithms accordingly. Alternatively, the user can reset the entire training data set to start from scratch.
The data collected by the system in the vehicle may be stored locally, or alternatively may be transmitted to and stored in a remote location, for example on a remote server via a cloud computing interface. The data can then easily be ported from vehicle to vehicle, for example between multiple owned vehicles, or from an old vehicle to a new one. Similar porting can be done if the profile is stored on a user device, such as a mobile device or tablet.
The HMI in one embodiment is provided by a software application on a user's personal or mobile device, rather than or in addition to the HMI inside the vehicle.
The user's preferences may also be edited on the application on the device.
Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.
Further aspects of the invention are set out in the following set of numbered paragraphs:
1. An automated device control system for use in a vehicle, comprising: a user identification module to determine the identity of the vehicle user; a monitor module configured to obtain a value of a parameter indicative of a user emotion state; an input configured to obtain an instruction from a user for control of a vehicle device; and a training module configured to use user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm, the algorithm being configured to provide, based on the training, an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state.
2. A system according to paragraph 1, wherein the monitor module and the training module are configured to: obtain the value of a parameter from an input device; and retrieve a user emotion flag associated in a stored record with that value of the parameter.
3. A system according to paragraph 2, wherein the monitor module and the training module are configured to retrieve for a plurality of parameter values obtained an associated user emotion flag; and process the returned user emotion flags in order to determine a representation of a user emotion state.
4. A system according to paragraph 1, wherein the monitor module is configured to record the parameter value indicative of the user emotion state at the time of input of the user vehicle device control instruction.
5. A system according to paragraph 1, wherein the input is configured to obtain a plurality of types of user vehicle device control instruction, and wherein a type of instruction addresses one of: a user seat activity device; a climate control device; a media selection device; a navigation device; and an HMI device.
6. A system according to paragraph 1, wherein the user monitor module is configured to obtain values of a plurality of types of parameter indicative of a user emotion state.
7. A system according to paragraph 1, wherein the parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record a condition parameter.
8. A system according to paragraph 1, wherein the parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record user activity.
9. A system according to paragraph 8, wherein the parameter indicative of a user emotion state is obtained from one of: a camera; a smartphone; a vehicle management device; a network connection; a navigation device; and a location device.
10. A system according to paragraph 9, comprising a control module configured to receive the automated vehicle device control instruction and to control the vehicle device in accordance with the automated vehicle device control instruction.
11. A method of automating device control for a vehicle, comprising the steps of: determining an indication of the user of the vehicle; obtaining a value of a parameter indicative of a user emotion state; obtaining an instruction from a user for control of a vehicle device; using user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm; and providing an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state based on the training.
12. A media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to become a system according to paragraph 1.
13. A media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to carry out a method according to paragraph 11.
14. A vehicle comprising a system as claimed in paragraph 1.

Claims (15)

CLAIMS
1. An automated device control system for use in a vehicle, comprising: a user identification module configured to determine the identity of the user of the vehicle; a monitor module configured to obtain a value of a parameter indicative of a user emotion state; an input for obtaining an instruction from a user for control of a vehicle device; and a training module configured to use user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm configured to provide, based on the training, an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state.
2. A system according to Claim 1, wherein the monitor module and the training module are configured to: obtain the value of a parameter from an input device; and retrieve a user emotion flag associated in a stored record with that value of the parameter.
3. A system according to Claim 2, wherein the monitor module and the training module are configured to retrieve, for a plurality of parameter values obtained, an associated user emotion flag; and process the returned user emotion flags in order to determine a representation of a user emotion state.
4. A system according to any preceding claim, wherein the monitor module is configured to record the parameter value indicative of the user emotion state at the time of input of the user vehicle device control instruction.
5. A system according to any preceding claim, wherein the input is configured to obtain a plurality of types of user vehicle device control instruction, and wherein a type of instruction addresses one of: a user seat activity device; a climate control device; a media selection device; a navigation device; and an HMI device.
6. A system according to any preceding claim, wherein the user monitor module is configured to obtain values of a plurality of types of parameter indicative of a user emotion state.
7. A system according to any preceding claim, wherein the parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record a condition parameter.
8. A system according to any of Claims 1 to 6, wherein the parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record user activity.
9. A system according to Claim 8, wherein the parameter indicative of a user emotion state is obtained from one of: a camera; a smartphone; a vehicle management device; a network connection; a navigation device; and a location device.
10. A system according to any preceding claim, comprising a control module configured to receive the automated vehicle device control instruction and to control the vehicle device in accordance with the automated vehicle device control instruction.
11. A method of automating device control for a vehicle, comprising the steps of: determining an identity of the vehicle user; obtaining a value of a parameter indicative of a user emotion state; obtaining an instruction from a user for control of a vehicle device; using user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm; and providing an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state based on the training.
12. The method according to Claim 11, comprising controlling a vehicle device in accordance with the automated vehicle device control instruction.
13. A media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to become a system, or to carry out a method, according to any preceding claim.
14. A vehicle comprising a system as claimed in any of Claims 1 to 10.
15. A method or apparatus substantially as herein described with reference to the accompanying figures.
GB1412166.9A 2014-07-08 2014-07-08 System and method for automated device control for vehicles using driver emotion Active GB2528083B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1412166.9A GB2528083B (en) 2014-07-08 2014-07-08 System and method for automated device control for vehicles using driver emotion
EP15733760.1A EP3166833A1 (en) 2014-07-08 2015-07-03 System and method for automated device control for vehicles using driver emotion
PCT/EP2015/065240 WO2016005289A1 (en) 2014-07-08 2015-07-03 System and method for automated device control for vehicles using driver emotion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1412166.9A GB2528083B (en) 2014-07-08 2014-07-08 System and method for automated device control for vehicles using driver emotion

Publications (3)

Publication Number Publication Date
GB201412166D0 GB201412166D0 (en) 2014-08-20
GB2528083A true GB2528083A (en) 2016-01-13
GB2528083B GB2528083B (en) 2017-11-01

Family

ID=51410830

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1412166.9A Active GB2528083B (en) 2014-07-08 2014-07-08 System and method for automated device control for vehicles using driver emotion

Country Status (3)

Country Link
EP (1) EP3166833A1 (en)
GB (1) GB2528083B (en)
WO (1) WO2016005289A1 (en)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9688281B2 (en) * 2015-03-23 2017-06-27 Toyota Jidosha Kabushiki Kaisha Proactive autocomplete of a user's in-vehicle operations
CN106445349B (en) * 2016-10-18 2019-06-25 珠海格力电器股份有限公司 A kind of method, apparatus and electronic equipment adjusting mobile terminal system parameter
KR102625396B1 (en) * 2018-08-30 2024-01-17 현대자동차주식회사 Vehicle and controlling method thereof
DE102018127105A1 (en) * 2018-10-30 2020-04-30 Bayerische Motoren Werke Aktiengesellschaft Method and device for influencing a state of mind of a user of a vehicle
US11613217B2 (en) 2019-05-08 2023-03-28 Ford Global Technologies, Llc Vehicle identity access management
CN112568904B (en) * 2019-09-30 2022-05-13 比亚迪股份有限公司 Vehicle interaction method and device, computer equipment and storage medium
US20240104474A1 (en) * 2021-07-30 2024-03-28 Maxis Broadband Sdn. Bhd. Methods, systems, and devices for managing user vehicular operation, activity, and safety
DE102021131040B4 (en) * 2021-11-26 2024-01-11 Audi Aktiengesellschaft Method for at least partially automated driving of a motor vehicle and motor vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060149428A1 (en) * 2005-01-05 2006-07-06 Kim Jong H Emotion-based software robot for automobiles
US20080291032A1 (en) * 2007-05-23 2008-11-27 Toyota Engineering & Manufacturing North America, Inc. System and method for reducing boredom while driving
US20110224875A1 (en) * 2010-03-10 2011-09-15 Cuddihy Mark A Biometric Application of a Polymer-based Pressure Sensor
US20140118131A1 (en) * 2012-10-30 2014-05-01 Pixart Imaging Inc. Monitoring and warning system for vehicles

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009045511A1 (en) * 2009-10-09 2011-04-14 Robert Bosch Gmbh Device for learning function of operating assistance of motor vehicle, is formed for receiving current location position of motor vehicle from positioning system, where device activates or deactivates operating assistance
DE102010003251A1 (en) * 2010-03-25 2011-09-29 Bayerische Motoren Werke Aktiengesellschaft Method for adjusting e.g. driver assistance function, of motor car, involves setting regulation for function to adjust manipulated variable depending on input variable based on received values, and controlling function based on regulation
DE102011106357A1 (en) * 2011-07-02 2012-08-30 Daimler Ag Motor car operating method, involves adjusting operating condition of motor car based on received motor car data and by operation device, where motor car data depends on data of personal calendar of user of motor car
DE102011109564B4 (en) * 2011-08-05 2024-05-02 Mercedes-Benz Group AG Method and device for monitoring at least one vehicle occupant and method for operating at least one assistance device
DE102012218842A1 (en) * 2012-10-16 2014-04-17 Bayerische Motoren Werke Aktiengesellschaft Method for adapting driver assistance system of vehicle such as car and truck, involves adapting driver assistance system based on certain property such as biometric property of driver of vehicle
EP2942012A1 (en) * 2014-05-08 2015-11-11 Continental Automotive GmbH Driver assistance system


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018059995A1 (en) * 2016-09-29 2018-04-05 Audi Ag Method for operating a motor vehicle with the help of physiological vital data, motor vehicle and mobile terminal device
US11485366B2 (en) 2016-09-29 2022-11-01 Audi Ag Method for operating a motor vehicle with the help of vital physiological data, motor vehicle and mobile terminal device
EP3525141A4 (en) * 2016-11-16 2019-11-20 Honda Motor Co., Ltd. Emotion inference device and emotion inference system
US11186290B2 (en) 2016-11-16 2021-11-30 Honda Motor Co., Ltd. Emotion inference device and emotion inference system
US11919531B2 (en) * 2018-01-31 2024-03-05 Direct Current Capital LLC Method for customizing motion characteristics of an autonomous vehicle for a user
GB2588969A (en) * 2019-11-18 2021-05-19 Jaguar Land Rover Ltd Apparatus and method for determining a cognitive state of a user of a vehicle
WO2021099302A1 (en) * 2019-11-18 2021-05-27 Jaguar Land Rover Limited Apparatus and method for determining a cognitive state of a user of a vehicle
GB2588969B (en) * 2019-11-18 2022-04-20 Jaguar Land Rover Ltd Apparatus and method for determining a cognitive state of a user of a vehicle
GB2606018A (en) * 2021-04-23 2022-10-26 Daimler Ag Emotion recognition for artificially-intelligent system

Also Published As

Publication number Publication date
WO2016005289A1 (en) 2016-01-14
GB201412166D0 (en) 2014-08-20
EP3166833A1 (en) 2017-05-17
GB2528083B (en) 2017-11-01

Similar Documents

Publication Publication Date Title
GB2528083A (en) System and method for automated device control for vehicles using driver emotion
US11498388B2 (en) Intelligent climate control in vehicles
US11042350B2 (en) Intelligent audio control in vehicles
KR102215547B1 (en) Machine monitoring
US9878663B1 (en) Cognitive dialog system for driving safety
US10053113B2 (en) Dynamic output notification management for vehicle occupant
KR20190069421A (en) System and method for predictive failure detection in a vehicle
JP2020524632A (en) System and method for obtaining occupant feedback in response to an autonomous vehicle driving event
US10832148B2 (en) Cognitive dialog system for driving safety
DE102018200244B4 (en) Driver assistance system and procedure for individual determination of boredom
US20210349433A1 (en) System and method for modifying an initial policy of an input/output device
CN113119981B (en) Vehicle active safety control method, system and storage medium
US10198287B2 (en) System and method for improving motor vehicle safety
JP2022095768A (en) Method, device, apparatus, and medium for dialogues for intelligent cabin
CN109716411A (en) Method and apparatus to monitor the activity level of driver
US20240126409A1 (en) Vehicle having an intelligent user interface
CN112904852A (en) Automatic driving control method and device and electronic equipment
US20220371610A1 (en) Method for operating an assistance system depending on a personalised configuration set, assistance system, computer program and computer-readable medium
EP3472742A1 (en) Machine monitoring
US11485368B2 (en) System and method for real-time customization of presentation features of a vehicle
US11442874B2 (en) System and method for generating a modified input/output device policy for multiple users
US11907298B2 (en) System and method thereof for automatically updating a decision-making model of an electronic social agent by actively collecting at least a user response
US20240059303A1 (en) Hybrid rule engine for vehicle automation
US20200406467A1 (en) Method for adaptively adjusting a user experience interacting with an electronic device
WO2021124914A1 (en) State output system