EP3166833A1 - System and method for automated device control for vehicles using driver emotions - Google Patents

System and method for automated device control for vehicles using driver emotions

Info

Publication number
EP3166833A1
EP3166833A1 (application EP15733760.1A)
Authority
EP
European Patent Office
Prior art keywords
user
vehicle
emotion state
device control
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15733760.1A
Other languages
English (en)
French (fr)
Inventor
Thomas Popham
Linh Nguyen
Joan BARCELO LLADO
Adam GRZYWACZEWSKI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Publication of EP3166833A1
Current legal status: Withdrawn

Classifications

    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60K28/02 Safety devices for propulsion-unit control responsive to conditions relating to the driver
    • B60K28/04 Safety devices responsive to presence or absence of the driver, e.g. to weight or lack thereof
    • B60W2040/0872 Driver physiology
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0083 Setting, resetting, calibration
    • B60W2050/0088 Adaptive recalibration
    • B60W2540/043 Identity of occupants
    • B60W2540/22 Psychological state; Stress level or workload
    • B60W2540/221 Physiology, e.g. weight, heartbeat, health or special needs
    • B60W2555/20 Ambient conditions, e.g. wind or rain
    • B60W2556/10 Historical data
    • B60W2556/50 External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data

Definitions

  • The present invention relates to systems and methods for automated device control for vehicles using driver emotion, and vehicles using the same.
  • Device control systems for vehicles are well known.
  • Various vehicle devices and systems are accessible to vehicle users, for example electronically actuated devices, and devices managed by on-board computer systems.
  • A vehicle may provide a user with manually or electronically operated climate control, media selection options via a human-machine interface (HMI), seat adjustment and massage devices, navigation device settings, and the like.
  • Previous systems allow a car driver to choose settings for such features on a console or HMI, in response to which a control system actuates the relevant vehicle devices.
  • One embodiment of an aspect of the invention can provide an automated device control system for use in a vehicle, comprising: a user identification module configured to identify the vehicle user; a user monitor module configured to obtain a value of a parameter indicative of a user emotion state; an input for obtaining an instruction from a user for control of a vehicle device; and a training module configured to use user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm configured to provide, based on the training, an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state.
  • The system may obtain values for a plurality of user emotion state parameters, and obtain a plurality of user vehicle device control instructions, at each training step of the training module.
  • The system may be integrated into a vehicle, or provided in a separate module or device, usable within or integratable into the vehicle.
  • The emotion state parameter value measured may be associated with the particular user instruction given at that instant, or parameter values may be recorded for each respective user vehicle device instruction.
  • The monitor module and the training module are configured to: obtain the value of a parameter from an input device; and retrieve a user emotion flag associated in a stored record with that value of the parameter.
  • These features may be carried out by one or other of the monitor or training modules, or shared between the two.
  • The monitor module and the training module are configured to retrieve, for a plurality of parameter values obtained, an associated user emotion flag, and to process the returned user emotion flags in order to determine a representation of a user emotion state.
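The flag-retrieval and flag-processing behaviour described above can be sketched as follows. The parameter names, the stored flag records, and the majority-vote combination rule are illustrative assumptions, not details taken from the patent.

```python
# Sketch of the stored-record lookup: each measured parameter value maps
# to a user emotion flag, and the returned flags are combined into one
# representation of the user emotion state. All names and the
# majority-vote rule are illustrative assumptions.
from collections import Counter

# Hypothetical stored records: (parameter, bucketed value) -> emotion flag
FLAG_RECORDS = {
    ("cabin_temp", "high"): "stressed",
    ("journey_duration", "long"): "tired",
    ("facial_expression", "drooping_eyelids"): "tired",
}

def lookup_flag(parameter, value):
    """Retrieve the emotion flag associated with one parameter value."""
    return FLAG_RECORDS.get((parameter, value))

def assess_emotion_state(measurements):
    """Combine the returned flags into a single emotion state (majority vote)."""
    flags = [lookup_flag(p, v) for p, v in measurements]
    flags = [f for f in flags if f is not None]
    if not flags:
        return "unknown"
    return Counter(flags).most_common(1)[0][0]

state = assess_emotion_state([
    ("journey_duration", "long"),
    ("facial_expression", "drooping_eyelids"),
    ("cabin_temp", "high"),
])
# Two of the three returned flags are "tired", so that state wins the vote.
```

The lookup table here stands in for whatever stored records the monitor and training modules share; richer schemes (scaled scores, probabilities) are discussed further below.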
  • The monitor module is configured to record the parameter value indicative of the user emotion state at the time of input of the user vehicle device control instruction.
  • The input is configured to obtain a plurality of types of user vehicle device control instruction, and a type of instruction addresses one of: a user seat activity device; a climate control device; a media selection device; a navigation device; and an HMI device.
  • The user monitor module is configured to obtain values of a plurality of types of parameter indicative of a user emotion state.
  • The parameter indicative of a user emotion state may simply be a time or location.
  • The parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record a condition parameter.
  • The devices may record the time at instruction, a time elapsed (since a previous record or event), or a climate parameter, such as a temperature or humidity measurement, inside or outside the vehicle.
  • The parameter indicative of a user emotion state is obtained from one of a plurality of devices configured to record user activity.
  • The parameter indicative of a user emotion state is obtained from one (or more) of: a camera; a smartphone; a vehicle management device; a network connection; a navigation device; and a location device.
  • The system may include a control module or processor configured to receive the automated vehicle device control instruction and to control the vehicle device in accordance with the automated vehicle device control instruction.
  • The control module described herein can comprise a control unit or computational device having one or more electronic processors.
  • A vehicle and/or a system thereof may comprise a single control unit or electronic controller; alternatively, different functions of the control module may be embodied in, or hosted in, different control units or controllers.
  • The term "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide the required control functionality.
  • A set of instructions could be provided which, when executed, cause said controller(s) or control unit(s) to implement the control techniques described herein (including the method(s) described below).
  • The set of instructions may be embedded in one or more electronic processors; alternatively, the set of instructions could be provided as software to be executed by one or more electronic processor(s).
  • A first controller may be implemented in software run on one or more electronic processors, and one or more other controllers may also be implemented in software run on one or more electronic processors, optionally the same one or more processors as the first controller. It will be appreciated, however, that other arrangements are also useful, and therefore the present invention is not intended to be limited to any particular arrangement.
  • The set of instructions described above may be embedded in a computer-readable storage medium (e.g., a non-transitory storage medium) that may comprise any mechanism for storing information in a form readable by a machine or electronic processors/computational device, including, without limitation: a magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read-only memory (ROM); random-access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or electrical or other types of medium for storing such information/instructions.
  • One embodiment of another aspect of the invention can provide a method of automating device control for a vehicle, comprising the steps of: obtaining a value of a parameter indicative of a user emotion state, obtaining an instruction from a user for control of a vehicle device, using user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm, and providing an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state based on the training.
  • One embodiment of another aspect of the invention can provide a vehicle comprising a system as described in any of the above described embodiments.
  • Further aspects of the invention comprise computer programs or applications which, when loaded into or run on a computer, cause the computer to become a system, or to carry out methods, according to the aspects described above.
  • The various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible.
  • The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
  • Figure 1 is a diagram illustrating an automated vehicle device control system according to an embodiment of the invention.
  • Figure 2 is a diagram illustrating steps associated with an automated vehicle device control system according to an embodiment of the invention.
  • Figure 3 is a diagram illustrating a number of options for inputs and outputs to and from an automated vehicle device control system according to an embodiment of the invention.
  • Embodiments of the invention automate the operation of devices around a vehicle in order to reduce user or driver workload. Some devices or features may be automated using simple rules, however knowing whether to automate a particular feature may depend upon the driver state (e.g. tired, busy, stressed). In general terms, the system is trained or learns the preferences of the vehicle user, by recording inputs and interactions whilst monitoring parameters and factors which may indicate a mood or emotion of the user.
  • An advantage of these embodiments is that, because the system is trained based on user inputs, and on factors indicating the emotion state of the user, the automation will be more accurate, and the user is more likely to accept that the vehicle automatically does things for the user.
  • The solution involves three main processing steps: learning the user's device preferences against assessed emotion states; confirming with the user that automation should begin; and executing the learned device actuations when an emotion state is detected.
  • The system can ask the user to confirm whether they wish automation of vehicle features to begin. The user can select automation of all or some of the features, or postpone automation until further learning/training has occurred.
  • The car device management system will, whenever that user uses the vehicle, detect the parameters to assess the emotion state of that user, and execute the sequence of device actuations according to the patterns established for an identified emotion state. For example, at the beginning of a journey, if a certain emotion state is inferred, the system may set a cabin temperature and select a particular radio station.
  • The monitored or determined emotion state indicators may be as simple as a time or location; for example, a direct link could simply be established that at certain times or locations, certain device instructions will be expected.
  • The system may determine directly that the radio is always turned on in the early morning, and therefore, on detecting such a time, instruct the radio to be activated.
  • Figure 1 illustrates an automated vehicle device control system according to an embodiment of the invention.
  • The system in this embodiment is wholly contained within a vehicle 100; however, in other embodiments parts of the system (particularly those indicated inside system 112) may be embodied in peripheral devices, on server systems providing cloud data and processing, or on a mobile device.
  • The system includes an array of monitoring devices (102) to identify the user of the vehicle (i.e. the vehicle driver) and to estimate a mood or emotion state of the user.
  • These can include sensors, a camera (still or video), monitors and connections to such devices, and connections to (or communication devices to connect to) user devices such as smartphones, and to a network.
  • The system also includes the usual input systems (104) for a user to interact with vehicle devices. These can include console buttons, dials, levers and the like, the input from which can be measured by the monitoring system.
  • The system also has a human-machine interface (HMI) device, with which the user interacts.
  • The user can enter various vehicle device control instructions, such as setting a cabin temperature, turning on a massage seat, or setting a route via a navigation device.
  • The monitoring system can also be updated by electronic feedback from manually adjusted devices, such as electric windows or a sunroof; these devices can be connected to the management system to feed back their use.
  • The monitoring devices 102 measure factors contributing to or influencing the user's mood.
  • The weather may be monitored (by on-board climate sensors, or by connection to a networked weather data resource), along with the cabin temperature, the duration of the journey, and current driving characteristics of the user.
  • The monitor module 106 receives measurements from the monitoring devices.
  • The monitor module may record data from the input systems 104, such as the time of an instruction, the duration of use of a device, or particular instructions given, for example, via the HMI.
  • The system is also configured to generate an assessment of the user emotion state from the data collected from the monitoring inputs 102.
  • In this embodiment this processing is done in the monitoring module 106 (in other embodiments, this process may be shared with the training module, or undertaken solely in a module providing the training function).
  • Systems for assessment of user emotion states from such data have been previously considered.
  • Each value of the parameters assessed by the monitoring devices (such as temperature, driving statistics, or images captured and recognised as facial expressions) will correspond to a user emotion flag in stored records. On input of a parameter, the flag will be returned, and processing can be done on the plurality of flags returned to assess a mood.
  • For example, the system may infer that the user emotion state is "tired".
  • Flags may of course be less specific, for example a record may give a number on a scale of a possible mood, such as 2/10 for tiredness, 8/10 for perceived happiness, or probabilities of certain moods, such as 20% tiredness. Combinations of scaled factors and probabilities can be more accurately used to assess mood.
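The combination of scaled factors and probabilities mentioned above can be sketched as a confidence-weighted average of per-mood scores. The records, weights, and scoring scheme here are illustrative assumptions.

```python
# Sketch of combining scaled mood factors and probabilities into a mood
# estimate (e.g. 2/10 tiredness from one record, 20% tiredness from
# another, as in the text). Records and weights are illustrative.
def combine_mood_scores(records):
    """Average per-mood scores (each in [0, 1]) across flag records,
    weighting each record by a confidence value."""
    totals, weights = {}, {}
    for scores, weight in records:
        for mood, score in scores.items():
            totals[mood] = totals.get(mood, 0.0) + weight * score
            weights[mood] = weights.get(mood, 0.0) + weight
    return {m: totals[m] / weights[m] for m in totals}

records = [
    ({"tired": 0.2, "happy": 0.8}, 1.0),  # e.g. 2/10 tiredness, 8/10 happiness
    ({"tired": 0.2}, 0.5),                # e.g. 20% tiredness, lower confidence
]
combined = combine_mood_scores(records)
# combined["tired"] == (1.0*0.2 + 0.5*0.2) / 1.5 == 0.2
```

A weighted average is one simple way to realise the idea that scaled factors and probabilities "can be more accurately used to assess mood" than single flags; the patent does not prescribe the combination rule.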
  • The user vehicle device instructions, monitored factors and parameters, and the emotion state assessment from the monitoring module are passed to the training module 110.
  • The training module gathers data during use of the vehicle, logging the various vehicle device control instructions and other input data, and the associated monitoring parameters and emotion states at the time. These are used to train an algorithm which is later used to predict the desired vehicle device statuses of a user when a given user emotion state arises during use of the vehicle.
  • The monitor module, training module and a processor 108 are, in this embodiment, contained in a computer or logic system 112.
  • The system assesses a current user emotion state using the monitoring devices and the monitoring module. Via the processor or controller 108, a control instruction is issued to the vehicle devices 114 (cabin heater/air conditioning, music and speaker systems, HMI feedback, navigation devices and the like) to replicate the conditions predicted by the algorithm that the user would wish to have according to the measured emotion state.
  • The system will have been trained to instruct the vehicle devices appropriately, for example to maximise information feedback via the HMI or other user feedback devices, or to set a more demanding route via a navigation device, which might not be chosen for a fatigued driver.
  • a) Camera - a still or video camera can be used to obtain images of the vehicle user, and previously considered methods can be used to recognise facial expressions and gestures of the user to determine emotion or mood data. For example, lower than usual eyelids may be recorded as a fatigued emotion state.
  • b) Smartphone calendar - data from the user's smartphone or other device calendar can be assessed. For example, if the driver has been busy during a day, the mood may be assessed as tired; if a driver's meetings with a certain group or individual usually give rise to a stress response, this may be noted if the calendar has such a meeting that day.
  • c) Smartphone movement - motion sensors in the user's smartphone or other device may indicate how active the user has been (e.g. stationary for long periods, walking a lot), and indicate, for instance, a tired emotion state. Alternatively, other sensors or records in the device may record the number of calls answered, or the number of times the device was picked up and put down during the day; these may indicate a tired or stressed response.
  • d) Driver style and workload - feedback from various devices in the vehicle may be assessed as indicating mood, such as happiness/anger, alertness/tiredness or stress/relaxation.
  • e) Social media posts - a network connection, or a data log on a user device, can be used to monitor the user's recent posting habits. Previously considered methods have been used to assess mood from such postings, such as analysing the frequency of certain words and the lengths of sentences, or analysing the number of posts in specific periods.
  • f) Weather - climate devices such as temperature, humidity and pressure gauges, common in vehicles, can be used to provide possible mood indicators. For example, overcast weather may usually be recorded and returned as a greater likelihood of lower mood of the user.
  • g) Time - the time of day may indicate mood; for example, it may be that the system has regularly recorded tiredness indicators after 10pm, so that once this time appears, the monitoring system may assess a higher likelihood of tiredness in the driver emotion state.
  • h) GPS location - certain locations may give rise to measurable responses. For example, it may be that the user at home is usually less fatigued, and happier.
  • i) Driving duration - long journeys usually result in fatigue, and this can be recorded as an indicator while journey time is monitored by the system.
  • Methods of determining emotion states from such inputs have been previously considered. Certain of these factors are direct indicators of possible mood and may be assessed as such. For example, processing images from a camera may give reliable indications of mood. Others may be more indirect: smartphone activity or calendar input may give indications backed up by other factors assessed at other times, or input directly by a user.
  • It may be that certain calendar events are stressful for a user, but this link might only be established by direct entry by the user confirming this, or by assessment of other behavioural factors (such as (a) to (i) above).
  • These indirect factors may be given less weighting in a monitoring system's calculation of mood probability than, for example, video image assessment.
  • Journey duration and location measurements may indicate heavy traffic, which may be associated with a flag for that user of a higher probability of stress.
  • Driving style may be used in combination with, for example, a weather assessment or calendar function to distinguish between types of higher-level mood. For example, if a driver is handling the vehicle more robustly, this may be an indication either of stress or of excitement; a second indicator could be used to determine which of these is appropriate.
  • The training algorithm is configured to learn actions for each vehicle feature given the driver emotion.
  • The algorithm checks whether the feature operation is correlated with a particular emotion state. For example, User A tends to operate the massage seats when they are tired; User B operates the massage seats if they are stressed and have been driving for more than 30 minutes.
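The correlation check described above can be sketched as a per-state usage frequency over the logged journeys. The log contents and the interpretation threshold are illustrative assumptions.

```python
# Sketch of checking whether operation of one vehicle feature (e.g. the
# massage seats) is correlated with a particular emotion state, by
# counting usage per logged state. The example data are illustrative.
from collections import defaultdict

def feature_emotion_correlation(log):
    """log: iterable of (emotion_state, feature_operated: bool).
    Returns the fraction of journeys under each state in which the
    feature was operated."""
    used, total = defaultdict(int), defaultdict(int)
    for state, operated in log:
        total[state] += 1
        if operated:
            used[state] += 1
    return {s: used[s] / total[s] for s in total}

# Hypothetical log for one user: massage-seat usage against emotion state
log = [("tired", True), ("tired", True), ("tired", False),
       ("alert", False), ("alert", False)]
corr = feature_emotion_correlation(log)
# corr["tired"] == 2/3 and corr["alert"] == 0.0: usage correlates with tiredness
```

A high usage fraction under one state and a low fraction under others is the kind of pattern the training module would treat as a learnable correlation.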
  • Determining the driver pattern also takes account of other inputs, such as the time of day, current location and set destination.
  • The patterns are determined by a machine learning algorithm using standard or previously considered techniques.
  • The algorithm may be run on a neural network, by support vector machine, or by other machine learning techniques, for example using k-nearest neighbour algorithms.
  • The pattern determination may, for example, be by mathematical clustering. For example, if there are five different mood inputs, a five-dimensional plot of the device instructions detected for these mood inputs can be generated. It may therefore be that if three of the five mood inputs are noted at a low level, there may be one type of emotion state determination, differing from that if all five mood inputs are at a low level.
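Treating the five mood inputs as a point in a five-dimensional space, one of the techniques named above (k-nearest neighbour) can be sketched as follows. The training points and labels are illustrative assumptions.

```python
# Sketch of classifying a 5-dimensional mood-input vector with a
# k-nearest-neighbour rule, one of the machine learning techniques
# named in the text. The training data are illustrative.
import math
from collections import Counter

def knn_predict(train, point, k=3):
    """train: list of (5-d mood vector, device instruction label).
    Returns the majority label among the k nearest training points."""
    nearest = sorted(train, key=lambda t: math.dist(t[0], point))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

train = [
    ((0.1, 0.1, 0.2, 0.1, 0.1), "radio_on"),    # low-level mood inputs
    ((0.2, 0.1, 0.1, 0.2, 0.1), "radio_on"),
    ((0.9, 0.8, 0.9, 0.8, 0.9), "massage_on"),  # high-level mood inputs
    ((0.8, 0.9, 0.8, 0.9, 0.8), "massage_on"),
]
pred = knn_predict(train, (0.15, 0.1, 0.15, 0.1, 0.1))
# The query point sits in the low-level cluster, so "radio_on" is predicted.
```

Clustering methods (e.g. grouping the five-dimensional points first, then labelling clusters) would serve the same role; k-NN is used here only because the text names it explicitly.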
  • The training algorithm can use the following steps to determine the automatic settings for each of the components.
  • 1) The algorithm begins by logging the monitoring device inputs, suggested emotion states, and user inputs for every journey of the vehicle. 2) After around two weeks of data has been collected, histograms can be plotted for inputs and emotion states against vehicle device usage; for example, it may be that a user on early starts usually (but not always) displays tiredness indicators, which are assessed by the monitoring module, and turns on the radio.
  • 3) The probability of applying particular settings can be calculated and used as a basis for a decision function when applying automatic settings. For example, if the user turns on the radio in 90% of cases under that emotion state, the system may assume that this device instruction should be applied when this emotion state is detected. If the observed histograms contain enough data points, the system can propose to the driver to automate certain, or all, device features.
  • 4) The user emotion state is assessed at the beginning of or during the journey and compared to the recorded probability distributions; if the observed probabilities are above a certain threshold, the automatic device instructions are applied.
  • 5) The model is constantly updated based on user interactions, in particular when previously unobserved emotion states are apparent.
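The histogram-and-threshold decision described in the steps above can be sketched as follows. The function name, the 90% probability threshold, and the minimum-data requirement are illustrative assumptions; the patent gives 90% only as an example.

```python
# Sketch of the decision function from the steps above: convert logged
# usage counts under one emotion state into a probability, and apply a
# setting automatically only when both the probability and the amount
# of observed data exceed thresholds. Names/thresholds are assumptions.
def should_automate(usage_count, journey_count,
                    prob_threshold=0.9, min_journeys=10):
    """Decision for one (emotion state, device setting) pair."""
    if journey_count < min_journeys:
        return False            # histogram has too few data points
    probability = usage_count / journey_count
    return probability >= prob_threshold

# Radio turned on in 18 of 20 journeys under the "tired" state:
print(should_automate(18, 20))   # enough data and probability reaches 0.9
print(should_automate(2, 3))     # too few journeys observed so far
```

The `min_journeys` check corresponds to "if observed histograms contain enough data points"; the running system would re-evaluate these counts as the model is updated with new interactions.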
  • FIG. 2 illustrates steps associated with an automated vehicle device control system according to an embodiment of the invention.
  • monitoring device and monitoring module inputs (202) assessing the user emotion state are input to the training module (206); for example, the user may be assessed as alert.
  • the user device control instruction (204), such as tuning the radio to a certain station, is also passed to the training module.
  • the algorithm is then trained on the basis of these inputs.
  • the model is now ready for implementation, typically on next use of the vehicle (although on a journey long enough for conditions to vary, the model could be used as soon as a new emotion state is detected).
  • the monitoring devices assess the current emotion state of the user (210). If an emotion state can be determined sufficiently by the monitoring module, the algorithm then determines a vehicle device control instruction for the emotion state determined (212), to instruct the relevant vehicle device(s) (214). For example, the system may have determined that the user is alert, and the algorithm may have noted that this mood usually correlates with tuning to a certain radio station, so tuning the radio to that station is instructed.
  • Figure 3 is a diagram illustrating a number of options for inputs and outputs to and from an automated vehicle device control system according to an embodiment of the invention.
  • a number of monitoring inputs and devices are available to contribute to the emotion state processing.
  • other monitoring devices may include a light sensor, which may help to indicate mood - this can be used with other weather sensors, or directly, for example to indicate a lower mood in overcast conditions or darkness.
  • a microphone may record audio, and a processor may detect the user's speech patterns when talking to other passengers or on an in-vehicle telephone or radio, which may indicate mood. For example, slow speech patterns may indicate tiredness.
  • One or more of the device inputs are used by the monitoring system to determine a driver emotion state (304). In the meantime, vehicle device instructions are noted (306).
  • Examples of devices and features actuable by the user are climate settings, media selection, route selection, seat adjustment (including massage on/off), adjustment of other in-vehicle devices such as windows, HMI and console inputs, and message services. Any device operable by the user may be assessed in this way.
  • the instructions are noted alongside the emotion state determined, and in this way the device preferences of the driver in certain moods are learned (308). These preferences can then be used to automatically generate device instructions for actuating the devices (310).
  • an output device is the messaging or information presented to the user via an HMI or by audio feedback.
  • the user may usually select minimal information or feedback in certain detected emotion states, for example in a stressed mood.
  • the algorithm will learn this, and if such an emotion state is detected, restrict information presented to the user to a basic level, or to a minimum required for safety.
  • the initialisation of the learning system will typically be prompted either by an HMI message, suggesting that the system be used or that a pattern has been detected, or by the user activating the system manually.
  • the system will take notice of consistent (frequent and self-conforming) and contradictory (not in agreement with system predictions) settings made by the user, and re-train the algorithms accordingly.
  • the user can reset the entire training data set to start from scratch.
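The re-training behaviour described above, reinforcing consistent settings, down-weighting contradictory ones, and allowing a full reset, might be sketched as follows. The evidence-score scheme, the weights, and the activation count are illustrative assumptions, not taken from the text:

```python
class AdaptiveSettings:
    """Sketch: reinforce predictions the user confirms, decay those the
    user contradicts, and allow a full reset of the training data."""

    def __init__(self, activate_at=3):
        self.scores = {}          # (state, action) -> evidence score
        self.activate_at = activate_at

    def observe(self, state, action, contradicted=False):
        key = (state, action)
        delta = -2 if contradicted else 1   # contradictions weigh more heavily
        self.scores[key] = self.scores.get(key, 0) + delta

    def is_automated(self, state, action):
        return self.scores.get((state, action), 0) >= self.activate_at

    def reset(self):
        # "start from scratch": discard the entire training data set
        self.scores.clear()

prefs = AdaptiveSettings()
for _ in range(4):
    prefs.observe("stressed", "minimal_hmi")          # consistent behaviour
print(prefs.is_automated("stressed", "minimal_hmi"))  # prints True (score 4 >= 3)
prefs.observe("stressed", "minimal_hmi", contradicted=True)
print(prefs.is_automated("stressed", "minimal_hmi"))  # prints False (4 - 2 = 2 < 3)
prefs.reset()
```

Weighing contradictions more heavily than confirmations is one way to make the system quickly abandon an automation the user has overridden while still requiring repeated consistent behaviour before enabling one.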
  • the data collected by the system in the vehicle may be stored locally, or alternatively may be transmitted to and stored in a remote location, for example on a remote server via a cloud computing interface.
  • the data can then easily be ported from vehicle to vehicle, for example multiple owned vehicles, or old and new ones. Similar porting can be done if the profile is stored on a user device, such as a mobile device or tablet.
  • the HMI in one embodiment is provided by a software application on a user's personal or mobile device, rather than or in addition to the HMI inside the vehicle.
  • the user's preferences may also be edited on the application on the device.
  • An automated device control system for use in a vehicle comprising:
  • a user identification module to determine the identity of the vehicle user
  • a monitor module configured to obtain a value of a parameter indicative of a user emotion state
  • a training module configured to use user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm, the algorithm being configured to provide, based on the training, an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state.
  • monitor module and the training module are configured to:
  • the monitor module is configured to record the parameter value indicative of the user emotion state at the time of input of the user vehicle device control instruction.
  • the input is configured to obtain a plurality of types of user vehicle device control instruction, and wherein a type of instruction addresses one of: a user seat activity device; a climate control device; a media selection device; a navigation device; and an HMI device.
  • the user monitor module is configured to obtain values of a plurality of types of parameter indicative of a user emotion state.
  • a system according to paragraph 8 wherein the parameter indicative of a user emotion state is obtained from one of: a camera; a smartphone; a vehicle management device; a network connection; a navigation device; and a location device.
  • a control module configured to receive the automated vehicle device control instruction and to control the vehicle device in accordance with the automated vehicle device control instruction.
  • a method of automating device control for a vehicle comprising the steps of:
  • a media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to become a system according to paragraph 1.
  • a media device storing computer program code adapted, when loaded into or run on a computer or processor, to cause the computer or processor to perform a method according to paragraph 11.
  • a vehicle comprising a system as claimed in paragraph 1.

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
EP15733760.1A 2014-07-08 2015-07-03 System und verfahren zur automatisierten vorrichtungssteuerung für fahrzeuge mit verwendung von fahreremotionen Withdrawn EP3166833A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1412166.9A GB2528083B (en) 2014-07-08 2014-07-08 System and method for automated device control for vehicles using driver emotion
PCT/EP2015/065240 WO2016005289A1 (en) 2014-07-08 2015-07-03 System and method for automated device control for vehicles using driver emotion

Publications (1)

Publication Number Publication Date
EP3166833A1 true EP3166833A1 (de) 2017-05-17

Family

ID=51410830

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15733760.1A Withdrawn EP3166833A1 (de) 2014-07-08 2015-07-03 System und verfahren zur automatisierten vorrichtungssteuerung für fahrzeuge mit verwendung von fahreremotionen

Country Status (3)

Country Link
EP (1) EP3166833A1 (de)
GB (1) GB2528083B (de)
WO (1) WO2016005289A1 (de)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9688281B2 (en) * 2015-03-23 2017-06-27 Toyota Jidosha Kabushiki Kaisha Proactive autocomplete of a user's in-vehicle operations
DE102016218877A1 (de) 2016-09-29 2018-03-29 Audi Ag Verfahren zum Betreiben eines Kraftfahrzeugs mit Hilfe von physiologischen Vitaldaten, Kraftfahrzeug und mobiles Endgerät
CN106445349B (zh) * 2016-10-18 2019-06-25 珠海格力电器股份有限公司 一种调整移动终端系统参数的方法、装置和电子设备
CN109906461B (zh) * 2016-11-16 2022-10-14 本田技研工业株式会社 情感估计装置和情感估计系统
US11919531B2 (en) * 2018-01-31 2024-03-05 Direct Current Capital LLC Method for customizing motion characteristics of an autonomous vehicle for a user
KR102625396B1 (ko) * 2018-08-30 2024-01-17 현대자동차주식회사 차량 및 그 제어방법
DE102018127105A1 (de) * 2018-10-30 2020-04-30 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Vorrichtung zur Beeinflussung eines Gemütszustands eines Benutzers eines Fahrzeuges
US11613217B2 (en) 2019-05-08 2023-03-28 Ford Global Technologies, Llc Vehicle identity access management
CN112568904B (zh) * 2019-09-30 2022-05-13 比亚迪股份有限公司 车辆交互方法、装置、计算机设备及存储介质
GB2588969B (en) * 2019-11-18 2022-04-20 Jaguar Land Rover Ltd Apparatus and method for determining a cognitive state of a user of a vehicle
GB2606018A (en) * 2021-04-23 2022-10-26 Daimler Ag Emotion recognition for artificially-intelligent system
US20240104474A1 (en) * 2021-07-30 2024-03-28 Maxis Broadband Sdn. Bhd. Methods, systems, and devices for managing user vehicular operation, activity, and safety
DE102021131040B4 (de) * 2021-11-26 2024-01-11 Audi Aktiengesellschaft Verfahren zur zumindest teilautomatisierten Führung eines Kraftfahrzeugs und Kraftfahrzeug

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060080317A (ko) * 2005-01-05 2006-07-10 현대자동차주식회사 감성기반을 갖는 자동차용 소프트웨어 로봇
US7982620B2 (en) * 2007-05-23 2011-07-19 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for reducing boredom while driving
DE102009045511A1 (de) * 2009-10-09 2011-04-14 Robert Bosch Gmbh Erlernen einer Bedienhilfefunktion
US20110224875A1 (en) * 2010-03-10 2011-09-15 Cuddihy Mark A Biometric Application of a Polymer-based Pressure Sensor
DE102010003251A1 (de) * 2010-03-25 2011-09-29 Bayerische Motoren Werke Aktiengesellschaft Verfahren zum Einstellen mindestens einer durch einen Fahrzeuginsassen veränderbaren Funktion eines Fahrzeugs, Steuergerät und Fahrzeug
DE102011106357A1 (de) * 2011-07-02 2012-08-30 Daimler Ag Betreiben und Fernkonfigurieren eines Kraftfahrzeugs mit Hilfe eines persönlichen Kalenders
DE102011109564B4 (de) * 2011-08-05 2024-05-02 Mercedes-Benz Group AG Verfahren und Vorrichtung zur Überwachung zumindest eines Fahrzeuginsassen und Verfahren zum Betrieb zumindest einer Assistenzvorrichtung
DE102012218842A1 (de) * 2012-10-16 2014-04-17 Bayerische Motoren Werke Aktiengesellschaft Anpassen eines Fahrerassistenzsystems eines Fahrzeugs
TWI545041B (zh) * 2012-10-30 2016-08-11 原相科技股份有限公司 車用監測暨警示系統
EP2942012A1 (de) * 2014-05-08 2015-11-11 Continental Automotive GmbH Fahrerassistenzsystem

Also Published As

Publication number Publication date
GB2528083B (en) 2017-11-01
GB2528083A (en) 2016-01-13
GB201412166D0 (en) 2014-08-20
WO2016005289A1 (en) 2016-01-14

Similar Documents

Publication Publication Date Title
WO2016005289A1 (en) System and method for automated device control for vehicles using driver emotion
US11498388B2 (en) Intelligent climate control in vehicles
KR102322838B1 (ko) 차량내 예측적 고장 검출을 위한 시스템 및 방법
US10053113B2 (en) Dynamic output notification management for vehicle occupant
KR102000132B1 (ko) 정보 제공 장치 및 정보 제공 프로그램을 저장하는 기록 매체
CN109313104B (zh) 机器监测
JP2020524632A (ja) 自律車両運転イベントに応答して乗車者フィードバックを取得するシステムおよび方法
CN108749596B (zh) 车机端启动方法、系统及装置
DE102018200244B4 (de) Fahrerassistenzsystem und Verfahren zur individuellen Ermittlung der Langeweile
CN113119981B (zh) 车辆主动安全控制方法、系统及存储介质
CN103373300B (zh) 自适应人机系统和方法
US10198287B2 (en) System and method for improving motor vehicle safety
US20210349433A1 (en) System and method for modifying an initial policy of an input/output device
CN109716411A (zh) 用以监测驾驶员的活动水平的方法和设备
CN112904852B (zh) 一种自动驾驶控制方法、装置及电子设备
US20240126409A1 (en) Vehicle having an intelligent user interface
US20210375141A1 (en) Systems and methods for flight performance parameter computation
US20150321604A1 (en) In-vehicle micro-interactions
US20210245367A1 (en) Customizing setup features of electronic devices
US20220371610A1 (en) Method for operating an assistance system depending on a personalised configuration set, assistance system, computer program and computer-readable medium
EP3472742A1 (de) Überwachung von maschinen
WO2021124914A1 (ja) 状態出力システム
US11485368B2 (en) System and method for real-time customization of presentation features of a vehicle
CN115416671B (zh) 一种情绪调节方法、装置及设备
EP4325395A2 (de) Hybridregelmaschine für die fahrzeugautomatisierung

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20170208

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20200312