WO2016202450A1 - A method for controlling an interface device of a motor vehicle - Google Patents

A method for controlling an interface device of a motor vehicle Download PDF

Info

Publication number
WO2016202450A1
Authority
WO
WIPO (PCT)
Prior art keywords
mental condition
vehicle occupant
output
occupant
vehicle
Prior art date
Application number
PCT/EP2016/000985
Other languages
French (fr)
Inventor
Carsten KAUSCH
Original Assignee
Audi Ag
Priority date
Filing date
Publication date
Application filed by Audi Ag filed Critical Audi Ag
Publication of WO2016202450A1 publication Critical patent/WO2016202450A1/en


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/0001 Details of the control system
    • B60W2050/0002 Automatic control, details of type of controller or control system architecture
    • B60W2050/0008 Feedback, closed loop systems or details of feedback error signal
    • B60W2050/146 Display means
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/22 Psychological state; Stress level or workload
    • B60W2540/221 Physiology, e.g. weight, heartbeat, health or special needs

Definitions

  • The invention relates to a method for controlling an interface device of a motor vehicle. Furthermore, the invention relates to an interface device.
  • The document US 6 773 344 B1 describes a method and an apparatus for integrating interactive toys with interactive television and cellular telecommunication systems.
  • The described interactive toys have real-time conversations with users, preferably employing speech recognition. Due to provided content the toys are enabled to form relationships with users.
  • Interactive toys further utilize the user knowledge base to match entertainment, education and sales promotion content to user histories, behaviors and habits.
  • The document US 8 799 506 B2 discloses a gesture sensor device for capturing and transferring gestures. Furthermore, it discloses a mood sensor device used to quantify a mood of the primary user based on sensed characteristics of the primary user, such as a captured facial recognition corresponding with a quantified mood.
  • Document US 8 094 891 B2 discloses a method for playing a song on a device, capturing an image of a user, performing facial expression recognition of the user based on the image, and selecting another song based on a facial expression of the user.
  • Document US 8 838 382 B2 discloses an improved and automated tour guide system and method, intended to be run on mobile computerized devices such as a smart phone and a GPS-equipped vehicle device, designed to better mimic the natural ability of a human tour guide for customers according to variations in user interests and preferences.
  • The safety and comfort of vehicle occupants are strongly dependent on their mental condition. For example, an agitated driver of a motor vehicle could drive less carefully and be less alert to his surroundings. An angry co-driver could possibly distract the driver and/or make other occupants of the vehicle uncomfortable by making angry remarks. It is an object of the present invention to increase the safety and comfort when operating a motor vehicle.
  • This object is solved by a method for controlling an interface device of a motor vehicle having the features of patent claim 1. Furthermore, this object is solved by an interface device having the features of patent claim 10. Advantageous embodiments with expedient developments of the invention are indicated in the other patent claims.
  • The method for controlling an interface device of a motor vehicle comprises the following steps:
  • capturing at least one physiological parameter of a vehicle occupant, especially a driver of the motor vehicle, by means of a sensor device;
  • determining a mental condition of the vehicle occupant by analyzing the captured physiological parameter;
  • controlling an output device of the interface device to give at least one output in dependency on the analyzed mental condition of the occupant in order to influence his mental condition.
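The three claimed steps can be sketched as a minimal pipeline. Everything below (the sensor field names, the toy threshold rule, and the output table) is an illustrative assumption made for the sketch, not part of the disclosure:

```python
# Minimal sketch of the three steps: capture -> determine -> control output.
# Sensor fields, thresholds and output texts are invented for illustration.

def capture_physiological_parameters(sensor_device):
    # Step 1: read raw values from a (hypothetical) sensor device.
    return {
        "skin_temperature_c": sensor_device["skin_temperature_c"],
        "voice_volume_db": sensor_device["voice_volume_db"],
    }

def determine_mental_condition(parameters):
    # Step 2: analyze the captured parameters; here a toy threshold rule.
    agitated = (parameters["skin_temperature_c"] > 34.0
                and parameters["voice_volume_db"] > 70.0)
    return "agitated" if agitated else "calm"

def control_output_device(mental_condition):
    # Step 3: choose an output intended to influence that condition.
    outputs = {"agitated": "play calming song", "calm": "no action"}
    return outputs[mental_condition]

reading = {"skin_temperature_c": 35.2, "voice_volume_db": 78.0}
output = control_output_device(
    determine_mental_condition(capture_physiological_parameters(reading)))
print(output)  # -> play calming song
```

In a vehicle, step 1 would read real sensors and step 3 would drive the actual output device; only the data flow between the steps is the point here.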
  • For example, the driving safety can be increased by calming an agitated driver down.
  • Furthermore, the method can increase the comfort for a vehicle occupant by entertaining him.
  • For example, the output device of the interface device can be controlled to play a song dependent on the determined mental condition in order to cheer the vehicle occupant up.
  • Furthermore, the output device can even try to encourage the vehicle occupant to react in a certain way. For example, the vehicle occupant can be encouraged to sing along to a song currently being played by the interface device or the radio of the motor vehicle.
  • The output given can for example be audible and/or visible, especially a song, a sentence, a video, an animation and/or a picture.
  • The output of the output device can also be a control command to another motor vehicle device.
  • For example, the control command can limit the engine power output and/or activate a speed limiter in order to increase the driving safety.
  • The control command could also change the settings of a climate control system or a radio in order to increase the vehicle occupant's comfort in dependency on his determined mental condition.
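A control command derived from the determined mental condition can be sketched as a simple mapping; the mood categories, thresholds and command strings are invented examples, not values from the disclosure:

```python
# Sketch: translate determined mood index numbers (0..1) into control
# commands for other vehicle devices. Thresholds are invented examples.
def control_command(mood_indices):
    commands = []
    if mood_indices.get("angry", 0) >= 0.7:
        # Safety-oriented command for a strongly agitated occupant.
        commands.append("activate speed limiter")
    if mood_indices.get("bored", 0) >= 0.5:
        # Comfort-oriented command, e.g. adjusting the radio.
        commands.append("radio: play upbeat station")
    return commands

print(control_command({"angry": 0.8}))  # -> ['activate speed limiter']
```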
  • The mental condition can for example be determined in the form of a number, a score, a key index and/or an array.
  • The output given can additionally be controlled in dependency on an input by the vehicle occupant captured by means of the sensor device.
  • For example, the output can be a reaction to an input in the form of a question by the vehicle occupant. This allows the interface device to interact with the vehicle occupant.
  • The sensor device can include a microphone, a camera, especially an infrared camera, a resistance meter, especially integrated into a steering wheel of the motor vehicle, and/or an input device such as a switch, touchpad or touch screen.
  • The mental condition of the occupant can additionally be determined in dependency on the occupant input captured by means of the sensor device.
  • In an advantageous embodiment of the invention, the physiological parameter of the occupant is captured by recording his skin temperature, the electrical resistance of his skin, his facial expression, his gestures, his voice, his eyes and/or the force of the grip of his hands on a steering wheel and/or other surfaces.
  • The voice can be analyzed according to characteristics such as volume, pitch, accentuation, choice of words and/or sentence structure.
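As a rough sketch of such a voice analysis, volume can be taken as the RMS of the signal and pitch estimated from an autocorrelation peak. The sample rate, the synthetic 200 Hz test tone and the pitch search range are assumptions made purely for this demo:

```python
import math

# Toy voice analysis: RMS volume plus autocorrelation pitch estimate on a
# synthetic signal (0.2 s of a 200 Hz tone at 8 kHz, amplitude 0.5).
SAMPLE_RATE = 8000
signal = [0.5 * math.sin(2 * math.pi * 200 * n / SAMPLE_RATE)
          for n in range(1600)]

# RMS as a simple volume measure.
rms_volume = math.sqrt(sum(x * x for x in signal) / len(signal))

def estimate_pitch(samples, rate, lo_hz=80, hi_hz=400):
    # Search the autocorrelation for the lag with maximal self-similarity;
    # the best lag corresponds to the fundamental period.
    best_lag, best_corr = 0, float("-inf")
    for lag in range(rate // hi_hz, rate // lo_hz + 1):
        corr = sum(samples[i] * samples[i + lag]
                   for i in range(len(samples) - lag))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return rate / best_lag

pitch_hz = estimate_pitch(signal, SAMPLE_RATE)
print(round(rms_volume, 3), round(pitch_hz))  # -> 0.354 200
```

A production system would of course use proper speech-processing methods; this only shows that volume and pitch are computable features of a recorded voice.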
  • Furthermore, text recognition software can be employed for evaluating the recorded voice.
  • The recording of the eyes can include the pupils and their dilation, eye movements and/or the eyelids and their movement.
  • The captured force of the grip of the hands on the steering wheel can be used to determine the hand movement on the steering wheel and/or their placement thereon.
  • Furthermore, it can be useful to record the heart frequency, the breathing frequency, a temperature of the face or even use the recording of brain waves by means of a small or big electroencephalography (EEG) device. It can also be useful to record a posture and body tension of the vehicle occupant, and to measure the reaction times of the vehicle occupant to stimulation, which could be in the form of the given output.
  • To determine the mental condition of the driver, it is also possible to record his control commands such as the steering wheel turning rate, gear shifting behaviour and changes of the pedal control dynamics. Furthermore, it can be recorded how the vehicle occupant controls a multimedia device, including the velocity and force of operating commands such as the pressing of buttons. The more physiological parameters are captured, the more precisely the mental condition of the vehicle occupant can be determined; a more precise determination allows for a more efficient influencing of his mental condition.
  • In another advantageous embodiment of the invention, the output comprises an avatar.
  • An avatar can be a visual representation of a human by means of a display device.
  • An avatar can also be a controllable doll, especially with actuated limbs for pointing.
  • The doll can also comprise a screen and can display an image with simulated human mimic and gestures.
  • The avatar can also be called a mental companion.
  • The mental companion can preferably initiate intelligent conversation and/or initiate actions mimicking a human behaviour according to the determined mental condition of the vehicle occupant.
  • The avatar is especially well suited to influence the mental condition of the vehicle occupant due to the possibility to mimic human emotions, speech and/or gestures. This especially increases the effectiveness of the emotional influence of the given output.
  • Furthermore, an avatar can be especially entertaining for the vehicle occupant.
  • In another advantageous embodiment of the invention, at least one mood category is predetermined in a database and the mental condition is determined by allocating each mood category an index number in dependency on the captured physiological parameter.
  • For the analysis, the captured physiological parameters are correlated with one or several categories of mental conditions in a percentage.
  • This means that the mental condition is characterized by at least one index number relating to a certain mood. This can be achieved by comparing the captured physiological parameter with at least one stored physiological parameter.
  • The stored physiological parameters can especially be stored in the form of a table.
  • For example, a recorded facial expression can correspond to a mood category "angry" with an index number of 80%.
  • Furthermore, a recorded voice pitch and volume can correspond to the mood category "angry" with an index number of 60%. Overall, this could lead to the determination that the current mental condition of the vehicle occupant relates to the mood category "angry" with an overall index number of 70%, calculated as the mean of all index numbers allocated to this mood category. Further examples of mood categories are "sad", "bored", "happy", "entertained", "distracted" and/or "concentrated".
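The allocation of index numbers per mood category can be sketched as a table lookup followed by averaging. The parameter keys and the stored table below are invented stand-ins for the 80%/60% "angry" example above:

```python
# Sketch: stored table mapping observed parameters to per-mood index
# numbers (in percent), following the example values in the text.
STORED_TABLE = {
    ("facial_expression", "frown"): {"angry": 80},
    ("voice", "loud_high_pitch"): {"angry": 60},
}

def determine_mood_indices(captured_parameters):
    # Allocate each mood category the index numbers of all matching
    # stored parameters, then take the mean per category.
    totals, counts = {}, {}
    for parameter in captured_parameters:
        for mood, index in STORED_TABLE.get(parameter, {}).items():
            totals[mood] = totals.get(mood, 0) + index
            counts[mood] = counts.get(mood, 0) + 1
    return {mood: totals[mood] / counts[mood] for mood in totals}

print(determine_mood_indices(
    [("facial_expression", "frown"), ("voice", "loud_high_pitch")]))
# -> {'angry': 70.0}
```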
  • The mental condition of the occupant can also be determined in dependency on at least one preceding mental condition of the occupant. This allows for a revision of whether the given output is influencing the mood of the vehicle occupant in the right way. If not, the given output can be changed and/or adapted. Furthermore, this allows the method to construct an interaction chain with the vehicle occupant.
  • At least the steps of capturing the physiological parameters, determining the mental condition, and controlling the output device are repeated until the determined mental condition has changed to a predetermined mental condition.
  • The predetermined mental condition can be in the form of a threshold value, a certain bandwidth and/or be determined as a certain predetermined difference to a previously determined mental condition.
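The repetition of the steps until a predetermined condition is reached amounts to a simple control loop. The occupant model below is a simulation invented for the sketch; in a vehicle, the new index number would come from the sensor device:

```python
# Control-loop sketch: repeat capture/determine/output until the "angry"
# index drops below a predetermined threshold. Numbers are illustrative.
THRESHOLD = 0.30

def simulated_occupant_response(angry_index, output):
    # Assumption: a calming output reduces the index by a fixed amount.
    return angry_index - 0.25 if output == "play calming song" else angry_index

angry_index = 0.80  # initially determined mental condition
steps = 0
while angry_index >= THRESHOLD:          # predetermined threshold value
    output = "play calming song"         # control the output device
    angry_index = simulated_occupant_response(angry_index, output)
    steps += 1

print(steps, round(angry_index, 2))  # -> 3 0.05
```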
  • A change of the mental condition of the vehicle occupant as a reaction to a first output is captured by means of the sensor device and a second output is afterwards given in dependency on this reaction of the vehicle occupant to the first output.
  • This creates a control loop of mental condition influencing. Additionally or alternatively to the change of his mental condition, an input of the vehicle occupant can also be considered as the reaction of the vehicle occupant.
  • For example, the first output can be a question or a test. This can improve the information about the mental condition of the vehicle occupant and/or the information on how certain outputs influence the mental condition of the vehicle occupant.
  • For example, a first output can be to play a song.
  • If the reaction of the vehicle occupant is to sing along, the second output could be another, especially similar song. If the reaction of the vehicle occupant is a sentence indicating his annoyance with the song, the second output could for example be the deactivation of the radio. Another possibility could be that the first output is the indication of a point of interest, the reaction of the vehicle occupant is a question about this point of interest, and the second output given is the requested further information. This could reduce possible boredom of the vehicle occupant especially well.
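Choosing a second output from the reaction to a first one can be sketched as a policy table; the reaction labels and entries below are invented renderings of the examples in the text:

```python
# Sketch: map (first output, observed reaction) pairs to a second output.
# Labels are invented stand-ins for sensor-derived reaction classes.
SECOND_OUTPUT_POLICY = {
    ("play song", "sings along"): "play similar song",
    ("play song", "voices annoyance"): "deactivate radio",
    ("indicate point of interest", "asks question"): "give further information",
}

def choose_second_output(first_output, reaction):
    return SECOND_OUTPUT_POLICY.get((first_output, reaction), "no action")

print(choose_second_output("play song", "voices annoyance"))
# -> deactivate radio
```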
  • At least one determined mental condition and/or reaction of the vehicle occupant is saved to a database device and future outputs are also given in dependency on the saved mental condition and/or reaction of the vehicle occupant.
  • This can be used to create a vehicle occupant profile.
  • The effectiveness of the given outputs in regard to influencing the mental condition of the vehicle occupant can be increased due to this vehicle occupant profile.
  • The database device can for example be in the form of a cloud storage space, internal as part of the interface device of the vehicle and/or external such as a memory stick.
  • The outputs given by the output device can be adjusted according to the saved vehicle occupant profile.
  • The method can comprise a step to identify an occupant of the motor vehicle by means of the sensor device and/or a recognition device in order to use his specific stored occupant profile.
  • The saved user profile can improve the analysis of captured physiological parameters to determine the mental condition of a certain vehicle occupant more precisely due to a learning effect.
  • In another advantageous embodiment, the output is also given in dependency on a driving condition, a vehicle condition and/or the vehicle environment.
  • For example, this can be a current vehicle speed, a planned route, an outdoor or indoor temperature, a wind speed and/or direction, a speed limit, a road condition, a pedestrian on the road and/or a point of interest such as a landmark, a fuel station, a hotel and/or a historic sight.
  • The output could for example be a comment of the avatar on the current weather. This can lead to engaging the vehicle occupant in a conversation with the avatar, thus increasing his comfort.
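Conditioning the output on the environment can be sketched as a prioritized rule list; the field names and comment texts are invented examples, with safety-related conditions deliberately checked first:

```python
# Sketch: derive an output from driving condition / vehicle environment.
# Keys and strings are illustrative; safety rules take priority.
def environment_output(environment):
    if environment.get("pedestrian_on_road"):
        return "warn driver"
    if environment.get("point_of_interest"):
        return "indicate " + environment["point_of_interest"]
    if environment.get("weather"):
        return "avatar comments on " + environment["weather"] + " weather"
    return "no action"

print(environment_output({"weather": "rainy"}))
# -> avatar comments on rainy weather
```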
  • The driving condition, vehicle condition and/or vehicle environment could also be considered when determining the mental condition of the vehicle occupant.
  • Another aspect of the invention concerns an interface device designed to perform a method according to any embodiment of the previously described method. Any feature and advantage described for the method can also be considered a feature and advantage of the interface device, and vice versa. Further advantages, features and details of the invention derive from the following description of preferred embodiments as well as from the drawing. The features and feature combinations previously mentioned in the description, as well as the features and feature combinations mentioned in the following description of the figures and/or shown in the figures alone, can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
  • The drawings show:
  • Fig. 1 a schematic front view of an interface device for a motor vehicle;
  • Fig. 2 a schematic illustration of a method for controlling the interface device according to Fig. 1.
  • Fig. 1 shows in a schematic front view an example of an interface device 10 for a motor vehicle.
  • This interface device 10 is designed to give an output in the form of an avatar.
  • Fig. 2 illustrates a method for controlling the interface device 10, in which certain blocks illustrate certain steps of the method.
  • In a step illustrated by block 12, a physiological parameter of a vehicle occupant is captured by means of a sensor device 14.
  • The sensor device 14 according to Fig. 1 comprises a high-resolution infrared camera 16 to record the skin temperature, facial expressions and gestures of the vehicle occupant as his physiological parameters.
  • Furthermore, the sensor device 14 includes a microphone 18 to record the voice of the vehicle occupant.
  • Additionally, the interface device 10 comprises a touch screen 20, which can be used to record an input of the vehicle occupant in addition to the physiological parameters.
  • In a step illustrated by block 22, a mental condition of the vehicle occupant is determined by analyzing the captured physiological parameters by means of an evaluation device 24.
  • The evaluation device 24 can be in the form of a computer.
  • The captured physiological parameters can be compared with stored physiological parameters on a database device 38.
  • The mental condition can be categorized according to mood categories by assigning each mood category an index number in dependency on the captured physiological parameters.
  • For example, the infrared camera 16 can record a skin temperature distribution of the vehicle occupant and the evaluation device 24 can analyze this recording by comparing it to a table of stored skin temperature distributions. This can lead to the determination that the vehicle occupant is 80% angry.
  • Furthermore, a sentence recorded by the microphone 18 can be compared with stored keywords. Due to the matching of specific keywords in one sentence, it can be determined that the vehicle occupant is only 50% angry.
  • The mental condition can then be determined by calculating the mean of those index numbers. For the given example, this means the mental condition would be determined as being 65% angry.
  • The determination of the mental condition of the vehicle occupant can be further modified by considering an input of the vehicle occupant on the touch screen 20 or a touchpad. For example, this input could be a request of the vehicle occupant to cheer him up. This could lead to the determination that the mental condition of the vehicle occupant belongs at least to a certain degree to the mood category "sad".
  • In a step illustrated by block 26, an output device 28 of the interface device 10 can be controlled to give at least one output in dependency on the determined mental condition of the occupant in order to influence his mental condition.
  • The output device 28 can for example include a flexible display 30 and the touch screen 20.
  • The display 30 can be used to mimic a human facial expression to cheer the vehicle occupant up and/or calm him down. This can be advantageous to both driving safety and vehicle occupant comfort and/or entertainment.
  • The output device 28 can also include robotic limbs 32. These robotic limbs 32 can be used to mimic human gestures in order to also influence the mental condition of the occupant.
  • The output device 28 can also include a loudspeaker 34 to give an audible output such as playing back music or a song and/or voicing a question.
  • Overall, the interface device 10 can be considered to be an avatar, as it can create and/or contain an artificial character.
  • The output device 28 can also control further car functions such as a climate control system or a radio.
  • The method can repeat the steps according to blocks 12, 22 and 26 until the determined mental condition has changed by a certain degree or has reached a certain predetermined threshold value. During such a cycle the method can capture a change of the mental condition of the vehicle occupant as a reaction to a first output in the steps illustrated by blocks 12 and 22.
  • A second output is afterwards given in a step according to block 26 in dependency on this reaction of the vehicle occupant to the first output.
  • This allows the interface device 10 to hold a conversation with the vehicle occupant or provide some other form of interaction. Furthermore, this allows verifying whether the influencing of the mental condition of the vehicle occupant has been successful and/or to which degree it has been successful.
  • The mental condition of the vehicle occupant can also be determined in dependency on a vehicle occupant profile.
  • The vehicle occupant profile mainly consists of previously saved determined mental conditions and/or reactions of the vehicle occupant, especially a change of mental condition due to a certain output. This allows the interface device 10 to become a self-learning system that learns from the interaction with a certain vehicle occupant. This further improves the effectiveness of the influencing of the mental condition of the vehicle occupant. Furthermore, future outputs can also be given in dependency on the saved mental conditions and/or reactions of the vehicle occupant.
  • The vehicle occupant profile can be stored on the database device 38. Additionally or alternatively, the vehicle occupant profile can be stored on a mobile device such as a memory stick or a mobile phone, and/or on the internet. This allows the transfer of the vehicle occupant profile from one motor vehicle to another.
  • The output can also be given in dependency on a driving condition, a vehicle condition and/or the vehicle environment.
  • As a driving condition, for example, a traffic jam and/or the weather could be considered for the given output.
  • The mental condition can also be determined in dependency on the driving condition, the vehicle condition and/or the vehicle environment.
  • The interface device 10 can also include customizable parts.
  • For example, the interface device 10 includes exchangeable hats such as a hat 36.
  • A face displayed by the display 30 can also be customizable.
  • For example, the faces of Hollywood stars could be displayed as a purchasable feature.
  • Furthermore, more than one such interface device 10 could be installed in one motor vehicle.
  • Each interface device 10 could then be designed to consider the other interface device 10 as a vehicle occupant. This would allow two interface devices 10 to interact with each other to provide entertainment to human vehicle occupants.
  • The described method and interface device 10 can be used to provide a mental companion in a motor vehicle that initiates intelligent conversation and actions in order to influence the mental condition of vehicle occupants according to their determined mental condition.
  • The mental condition can be determined by measuring a heartbeat, a breathing frequency, eye movements, skin temperature changes, registered keywords or keyword combinations and/or by means of an EEG. Multiple adequate reactions in the form of an output can be prepared according to a determined mental status.
  • System warnings and information can be put into an individual communication style output, with special language, voices, tone, questions, remarks, singing songs together, recommending breaks, food, sunglasses or drinks, making faces, telling jokes, reading books, mails or short messages, making gestures to running videos and/or making comments after driving actions.
  • The mental companion could also start conversations between passengers and drivers with funny questions and by telling about already known behaviors of the driver, making remarks, making jokes, imitating a star, making compliments for good acting, or by showing dependent behavior or using some other form of psychological projection.
  • The mental companion can be provided by a little animated robot on a dashboard with a face, arms, hands, little touch displays, symbols and/or smart graphics. However, it could also simply be provided by a display 30.
  • The benefits for the occupants of the motor vehicle are plentiful.
  • The mental companion can stabilize and improve the mental condition of the occupants, especially in a dangerous situation, and/or encourage safe driving behavior.
  • The mental companion could also diagnose abnormal behaviors and/or include a therapy for diagnosed abnormal behaviors.
  • Furthermore, the interaction with the vehicle occupants can be entertaining.
  • The outputs given by the interface device 10 can be in dependency on current and/or past behaviors and/or can also consider other information about the vehicle occupant, such as his reaction to certain outputs given and/or how his mental condition can be influenced most effectively.

Abstract

The invention concerns a method for controlling an interface device (10) of a motor vehicle comprising the following steps: capturing at least one physiological parameter of a vehicle occupant, especially a driver of the motor vehicle, by means of a sensor device (14) (12); determining a mental condition of the vehicle occupant by analyzing the captured physiological parameter (22); and controlling an output device (28) of the interface device (10) to give at least one output in dependency on the analyzed mental condition of the occupant in order to influence his mental condition (26). Furthermore, the invention concerns an interface device (10) designed to perform such a method.

Description

A Method for Controlling an Interface Device of a Motor Vehicle
DESCRIPTION: The invention relates to a method for controlling an interface device of a motor vehicle. Furthermore the invention relates to an interface device.
The document US 6 773 344 B1 describes a method and an apparatus for integrating interactive toys with interactive television and cellular telecommu- nication systems. The described interactive toys have real time conversations with users, preferably employing speech recognition. Due to provided content the toys are enabled to form relationships with users. Interactive toys further utilize the user knowledge basis to match entertainment, education and sales promotion content to user histories, behaviors and habits.
The document US 8 799 506 B2 discloses a gesture sensor device for capturing and transferring gestures. Furthermore, it discloses a mood sensor device used to quantify a mood of the primary user based on sensed characteristics of the primary user, such as a captured facial recognition corre- sponding with a quantified mood.
Document US 8 094 891 B2 discloses a method for playing a song on a device, capturing an image of a user, performing facial expression recognition of the user based on the image, and selecting another song based on a facial expression of the user.
Document US 8 838 382 B2 discloses an improved and automated tool guide system and method, intended to be run on mobile computerized devices such as a smart phone and a GPS equipped vehicle device, designed to bet- ter mimic the natural ability of a human tour guide for customers according to variations in user interests and preferences.
The safety and comfort of vehicle occupants is strongly dependant on their mental condition. For example, an agitated driver of a motor vehicle could drive less careful and be less alert to his surroundings. An angry co-driver could possibly distract the driver and/or make other occupants of the vehicle uncomfortable by making angry remarks. It is an object of the present invention to increase the safety and comfort when operating a motor vehicle.
This object is solved by a method for controlling an interface device of a motor vehicle having the features of patent claim 1. Furthermore this object is solved by an interface device having the features of patent claim 10. Advantageous embodiments with expedient developments of the invention are indicated in the other patent claims.
The method for controlling an interface device of a motor vehicle according to the invention comprises the following steps:
- capturing at least one physiological parameter of a vehicle occupant, especially a driver of the motor vehicle, by means of a sensor device;
- determining a mental condition of the vehicle occupant by analyzing the captured physiological parameter;
- controlling an output device of the interface device to give at least one output in dependency on the analyzed mental condition of the occupant in order to influence his mental condition. This makes it possible to influence emotions of the vehicle occupant by means of an interface device. For example, the driving safety can be increased by calming an agitated driver down. Furthermore, the method can increase the comfort for a vehicle occupant by entertaining him. For example, the output device of the interface device can be controlled to play a song de- pendant on the determined mental condition in order to cheer the vehicle occupant up. Furthermore, the output device can even try to encourage the vehicle occupant to react in a certain way. For example the vehicle occupant can be encouraged to sing along to a song currently being played by the interface device or the radio of the motor vehicle.
The output given can for example be audible and/or visible, especially a song, a sentence, a video, an animation and/or a picture. The output of the control output device can also be a control command to another motor vehicle device. For example, the control command can limit the engine power output and/or activate a speed limiter in order to increase the driving safety. The control command could also change the settings of a climate control system or a radio in order to increase the vehicle occupant's comfort in dependency on his determined mental condition.
The mental condition can for example be determined in form of a number, a score, a key index and/or an array. The output given can also additionally be controlled in dependency on an input by the vehicle occupant captured by the means of the sensor device. For example, the output can be a reaction to an input in form of a question by the vehicle occupant. This allows the interface device to interact with the vehicle occupant. The sensor device can include a microphone, a camera, especially an infrared camera, a resistance meter, especially integrated into a steering wheel of the motor vehicle, a microphone, and/or an input device such as a switch, touchpad or touch screen. The mental condition of the occupant can also additionally be determined in dependency on the occupant input captured by the means of the sensor device.
In an advantageous embodiment of the invention, the physiological parame- ter of the occupant is captured by recording his skin temperature, the electrical resistance of his skin, his facial expression, his gesture, his voice, his eyes and/or the force of the grip of his hands of a steering wheel and or other surfaces. The voice can be analyzed according to characteristics such as a volume, pitch, accentuation, choice of words and/or a sentence structure. Furthermore, text recognition software can be employed for evaluating the recorded voice. The recording of the eyes can include the pupils and their dilation, eye movements and/or eyelids and their movement. The captured force of the grip of the hands on the steering wheel can be used to determine the hand movement on the steering wheel and/or their placement thereon. Furthermore, it can be useful to record the heart frequency, breath frequency, a temperature of the face or even use the recording of brain waves by means of a small or big electroencephalography (EEG). It can also be useful to record a posture and body tension of the vehicle occupant. It can also be advantageous to measure the reaction times of the vehicle occupant to stimulation, which could be in the form of the given output. To determine the mental condition of the driver it is also possible to record his control commands such as steering wheel turning rate, gear shifting behaviour and changes of the paddle control dynamic. Furthermore it can be recorded how the vehicle occupant controls a multimedia device, including velocity and force of operating commands such as the pressing of buttons. The more physiological parameters are captured, the more precise a mental condition of the vehicle occupant can be determined by analyzing said captured physiological parameters. A more precise determination of the mental condi- tion of the vehicle occupant allows for more efficient influencing of his mental condition.
In another advantageous embodiment of the invention, the output comprises an avatar. An avatar can be a visual representation of a human by means of a display device. An avatar can also be a controllable doll, especially one with actuated limbs for pointing. The doll can also comprise a screen and can display an image with simulated human facial expressions and gestures. The avatar can also be called a mental companion. The mental companion can preferably initiate intelligent conversation and/or initiate actions mimicking a human behaviour according to the determined mental condition of the vehicle occupant. The avatar is especially well-suited to influence the mental condition of the vehicle occupant due to the possibility of mimicking human emotions, speech and/or gestures. This especially increases the effectiveness of the emotional influence of the given output. Furthermore, an avatar can be especially entertaining for the vehicle occupant.
In another advantageous embodiment of the invention, at least one mood category is predetermined in a database and the mental condition is determined by allocating each mood category an index number in dependency on the captured physiological parameter. For example, for the analysis of the physiological parameters, these parameters are correlated with one or several categories of mental conditions as a percentage. This means that the mental condition is characterized with at least one index number relating to a certain mood. This can be achieved by comparing the captured physiological parameter with at least one stored physiological parameter. The stored physiological parameters can especially be stored in the form of a table. For example, a recorded facial expression can correspond to a mood category "angry" with an index number of 80%. Furthermore, a recorded voice pitch and volume can correspond to the mood category "angry" with an index number of 60%. Overall, this could lead to the determination that the current mental condition of the vehicle occupant relates to the mood category "angry" by calculating the mean of all index numbers allocated to this mood category. For this example, that results in an overall index number of 70% for the mood category "angry". Further examples of mood categories are "sad", "bored", "happy", "entertained", "distracted" and/or "concentrated".
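The allocation and averaging of index numbers described above can be sketched in a few lines of code. This is purely an illustrative sketch, not part of the patent; the function name, data layout and values are assumptions.

```python
def determine_mental_condition(allocations):
    """Average all index numbers allocated to each mood category.

    `allocations` maps a captured physiological parameter to the
    per-category index numbers (percentages) obtained for it, e.g.
    by comparison with stored physiological parameters.
    """
    totals, counts = {}, {}
    for per_category in allocations.values():
        for mood, index_number in per_category.items():
            totals[mood] = totals.get(mood, 0.0) + index_number
            counts[mood] = counts.get(mood, 0) + 1
    # Mean of all index numbers allocated to each mood category.
    return {mood: totals[mood] / counts[mood] for mood in totals}

# The example from the text: facial expression -> 80% angry,
# voice pitch and volume -> 60% angry.
condition = determine_mental_condition({
    "facial_expression": {"angry": 80},
    "voice": {"angry": 60},
})
print(condition)  # {'angry': 70.0}
```

With this representation, further parameters simply contribute additional index numbers to the same averaging step.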
In another advantageous embodiment of the invention, the mental condition of the occupant is also determined in dependency on at least one preceding mental condition of the occupant. This allows verifying whether the given output is influencing the mood of the vehicle occupant in the intended way. If not, the given output can be changed and/or adapted. Furthermore, this allows the method to construct an interaction chain with the vehicle occupant.
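One simple way to take a preceding mental condition into account is to blend the newly determined index numbers with the previous ones. This weighting scheme is a hypothetical example, not something specified in the text:

```python
def smoothed_condition(current, previous, weight=0.5):
    """Blend newly determined index numbers with the preceding mental
    condition; `weight` (a hypothetical parameter) favours the newer
    reading as it approaches 1.0."""
    moods = set(current) | set(previous)
    return {m: weight * current.get(m, 0.0)
               + (1 - weight) * previous.get(m, 0.0)
            for m in moods}

# With equal weighting, 70% angry now and 40% angry before blend to 55%.
print(smoothed_condition({"angry": 70.0}, {"angry": 40.0}))  # {'angry': 55.0}
```

Comparing the blended value against the preceding one also gives the revision step the text mentions: if the index number for an undesired mood is not falling, the output can be changed.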
In another advantageous embodiment of the invention, at least the steps of capturing the physiological parameters, determining the mental condition, and controlling the output device are repeated until the determined mental condition has changed to a predetermined mental condition. This allows the method to influence the mental condition of the vehicle occupant until an especially safe and/or comfortable mental condition is reached. The predetermined mental condition can be in the form of a threshold value, a certain bandwidth and/or be determined as a certain predetermined difference to a previously determined mental condition.
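The repeat-until behaviour of these steps can be expressed as a simple control loop. The callback names, the below-threshold termination criterion and the cycle limit below are illustrative assumptions, not part of the patent:

```python
def influence_loop(capture, determine, give_output, target_mood,
                   threshold=30.0, max_cycles=10):
    """Repeat capture -> determine -> output until the index number for
    `target_mood` drops below `threshold` (one possible form of the
    predetermined mental condition), or until `max_cycles` is reached."""
    condition = {}
    for _ in range(max_cycles):
        parameters = capture()              # step 12: sensor device
        condition = determine(parameters)   # step 22: evaluation device
        if condition.get(target_mood, 0.0) < threshold:
            break  # predetermined mental condition reached
        give_output(condition)              # step 26: output device
    return condition
```

The same skeleton covers the bandwidth and relative-difference criteria mentioned above by replacing the threshold test with the corresponding comparison.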
In another advantageous embodiment of the invention, a change of the mental condition of the vehicle occupant as a reaction to a first output is captured by means of the sensor device and a second output is subsequently given in dependency on this reaction of the vehicle occupant to the first output. This allows for a control loop of mental condition influencing. Additionally or alternatively to the change of his mental condition, an input of the vehicle occupant can also be considered as the reaction of the vehicle occupant. This allows the method to interact extensively with the vehicle occupant. For example, the first output can be a question or a test. This can improve the information about the mental condition of the vehicle occupant and/or the information on how certain outputs influence the mental condition of the vehicle occupant. For example, a first output can be to play a song. If the reaction of the vehicle occupant to the song is to sing along, the second output could be another, especially similar, song. If the reaction of the vehicle occupant is a sentence indicating his annoyance with the song, the second output could for example be the deactivation of the radio. Another possibility could be that the first output is the indication of a point of interest, the reaction of the vehicle occupant is a question about this point of interest and the second output given is the requested further information. This could reduce possible boredom of the vehicle occupant especially well.
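The first-output/reaction/second-output examples above amount to a lookup from (first output, observed reaction) pairs to a follow-up output. A minimal sketch, in which all output and reaction identifiers are hypothetical labels:

```python
# Hypothetical reaction table, mirroring the song and point-of-interest
# examples in the text.
SECOND_OUTPUT = {
    ("play_song", "sings_along"): "play_similar_song",
    ("play_song", "voices_annoyance"): "deactivate_radio",
    ("indicate_poi", "asks_about_poi"): "give_poi_details",
}

def choose_second_output(first_output, reaction, default="ask_question"):
    """Pick the second output from the reaction table; fall back to a
    neutral probing output when the reaction is not recognized."""
    return SECOND_OUTPUT.get((first_output, reaction), default)

print(choose_second_output("play_song", "sings_along"))  # play_similar_song
```

In a fuller system the table entries would themselves be updated from the occupant's captured reactions, closing the control loop described above.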
In another advantageous embodiment of the invention, at least one determined mental condition and/or reaction of the vehicle occupant is saved to a database device and future outputs are also given in dependency on the saved mental condition and/or reaction of the vehicle occupant. This can be used to create a vehicle occupant profile. The effectiveness of the given outputs in regard to influencing the mental condition of the vehicle occupant can be increased due to this vehicle occupant profile. The database device can for example be in the form of cloud storage, internal memory as part of the interface device of the vehicle and/or external memory such as a memory stick. The outputs given by the output device can be adjusted according to the saved vehicle occupant profile. Furthermore, the method can comprise a step of identifying an occupant of the motor vehicle by means of the sensor device and/or a recognition device in order to use his specific stored occupant profile. The saved user profile can improve the analysis of captured physiological parameters so that the mental condition of a certain vehicle occupant is determined more precisely due to a learning effect.
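A database device holding the vehicle occupant profile could, in the simplest case, be a small persistent store of determined mental conditions and reactions. The sketch below assumes a JSON file as the backing store; the class name, file name and record layout are illustrative, not from the patent.

```python
import json
from pathlib import Path

class OccupantProfile:
    """Minimal sketch of a database device storing determined mental
    conditions and reactions per vehicle occupant."""

    def __init__(self, path="occupant_profile.json"):
        self.path = Path(path)
        self.records = (json.loads(self.path.read_text())
                        if self.path.exists() else [])

    def save(self, condition, reaction):
        """Append one determined mental condition and the observed
        reaction, then persist the profile."""
        self.records.append({"condition": condition, "reaction": reaction})
        self.path.write_text(json.dumps(self.records))

    def past_reactions_to(self, output):
        """Return saved records whose reaction refers to `output`, so
        future outputs can be given in dependency on past behaviour."""
        return [r for r in self.records
                if r["reaction"].get("output") == output]
```

Because the profile is a plain file, it can be carried on a memory stick or synchronized via cloud storage, matching the transferability between vehicles described below.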
In another advantageous embodiment of the invention, the output is also given in dependency on a driving condition, vehicle condition and/or vehicle environment. For example, this can be a current vehicle speed, a planned route, an outdoor or indoor temperature, a wind speed and/or direction, a speed limit, a road condition, a pedestrian on the road and/or a point of interest such as a landmark, a fuel station, a hotel and/or a historic sight. The output could for example be a comment of the avatar on the current weather. This can lead to engaging the vehicle occupant in a conversation with the avatar, thus increasing his comfort. Furthermore, the driving condition, vehicle condition and/or vehicle environment could also be considered when determining the mental condition of the vehicle occupant. For example, a traffic jam could indicate that the vehicle occupant might be bored, angry and/or annoyed.

Another aspect of the invention concerns an interface device designed to perform a method according to any embodiment of the previously described method. Any feature and advantage described for the method can also be considered a feature and advantage of the interface device. The reverse also applies.

Further advantages, features, and details of the invention derive from the following description of preferred embodiments as well as from the drawing. The features and feature combinations previously mentioned in the description as well as the features and feature combinations mentioned in the following description of the figures and/or shown in the figures alone can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.

The drawings show:
Fig. 1 a schematic front view of an interface device for a motor vehicle;
and Fig. 2 a schematic illustration of a method for controlling the interface device according to Fig. 1.
Fig. 1 shows in a schematic front view an example of an interface device 10 for a motor vehicle. This interface device 10 is designed to give an output in the form of an avatar. Fig. 2 illustrates a method for controlling the interface device 10, whereby certain blocks illustrate certain steps of the method.
In the first step of the method, illustrated by a block 12 in Fig. 2, at least one physiological parameter of a vehicle occupant, especially a driver of the motor vehicle, is captured by means of a sensor device 14. The sensor device 14 according to Fig. 1 comprises a high-resolution infrared camera 16 to record the skin temperature, facial expressions and gestures of the vehicle occupant as his physiological parameters. Furthermore, the sensor device 14 includes a microphone 18 to record the voice of the vehicle occupant. Also, the interface device 10 comprises a touch screen 20, which can be used to record an input of the vehicle occupant in addition to the physiological parameters.
In a second step illustrated by a block 22, a mental condition of the vehicle occupant is determined by analyzing the captured physiological parameters by means of an evaluation device 24. For example, the evaluation device 24 can be in the form of a computer. For determining the mental condition of the motor vehicle occupant, the captured physiological parameters can be compared with physiological parameters stored on a database device 38. The mental condition can be categorized according to mood categories by assigning each mood category an index number in dependency on the captured physiological parameters. For example, the infrared camera 16 can record a skin temperature distribution of the vehicle occupant and the evaluation device 24 can analyze this recording by comparing it to a table of stored skin temperature distributions. This can lead to the determination that the vehicle occupant is 80% angry. Additionally, for example, a sentence recorded by the microphone 18 can be compared with stored keywords. Due to the matching of specific keywords in one sentence, it can be determined that the vehicle occupant is only 50% angry. The mental condition can be determined by calculating the mean of those index numbers. For the given example, this means the mental condition would be determined as being 65% angry. The determination of the mental condition of the vehicle occupant can be further modified by considering an input of the vehicle occupant on the touch screen 20 or a touchpad. For example, this input could be a request of the vehicle occupant to cheer him up. This could lead to the determination that the mental condition of the vehicle occupant belongs at least to a certain degree to the mood category "sad".
In a further step illustrated by a block 26, an output device 28 of the interface device 10 can be controlled to give at least one output in dependency on the determined mental condition of the occupant in order to influence his mental condition. The output device 28 can for example include a flexible display 30 and the touch screen 20. For example, the display 30 can be used to mimic a human facial expression to cheer the vehicle occupant up and/or calm him down. This can be advantageous both to driving safety and to vehicle occupant comfort and/or entertainment. The output device 28 can also include robotic limbs 32. These robotic limbs 32 can be used to mimic human gestures in order to also influence the mental condition of the occupant. Furthermore, the output device 28 can also include a loudspeaker 34 to give an audible output such as playing back music, a song and/or voicing a question. Overall, the interface device 10 can be considered to be an avatar, as it can create and/or contain an artificial character. Additionally, the output device 28 can also control further car functions such as a climate control system or a radio. After the output has been given, the method can repeat the steps according to blocks 12, 22 and 26 until the determined mental condition has changed by a certain degree or has reached a certain predetermined threshold value. During such a cycle the method can capture a change of the mental condition of the vehicle occupant as a reaction to a first output in the steps illustrated according to blocks 12 and 22. A second output is afterwards given in a step according to block 26 in dependency on this reaction of the vehicle occupant to the first output. This allows the interface device 10 to hold a conversation with the vehicle occupant or provide some other form of interaction. Furthermore, this makes it possible to verify whether the influencing of the mental condition of the vehicle occupant has been successful and/or to which degree it has been successful.
Preferably, the mental condition of the vehicle occupant is also determined in dependency on a vehicle occupant profile. The vehicle occupant profile mainly consists of previously saved determined mental conditions and/or reactions of the vehicle occupant, especially a change of mental condition due to a certain output. This allows the interface device 10 to become a self-learning system that learns from the interaction with a certain vehicle occupant. This further improves the effectiveness of the influencing of the mental condition of the vehicle occupant. Furthermore, future outputs can also be given in dependency on the saved mental conditions and/or reactions of the vehicle occupant. The vehicle occupant profile can be stored on the database device 38. Additionally or alternatively, the vehicle occupant profile can be stored on a mobile device such as a memory stick, a mobile phone and/or on the internet. This allows the transfer of the vehicle occupant profile from one motor vehicle to another.
According to another advantageous embodiment of the method, the output is also given in dependency on a driving condition and/or vehicle environment. For example, a traffic jam and/or the weather could be considered for the output given. Together with the captured physiological parameter, the mental condition can also be determined in dependency on the driving condition and/or vehicle environment.
The interface device 10 can also include customizable parts. For example, the interface device 10 can include exchangeable hats such as a hat 36. A face displayed by the display 30 can also be customizable. For example, the faces of Hollywood stars could be displayed as a purchasable feature. In a further advantageous embodiment of the interface device 10, more than one such interface device 10 could be installed in one motor vehicle. Each interface device 10 could then be designed to consider the other interface device 10 as a vehicle occupant. This would allow two interface devices 10 to interact with each other to provide entertainment to human vehicle occupants.
Overall, the described method and interface device 10 can be used to provide a mental companion in a motor vehicle that initiates intelligent conversation and actions in order to influence the mental condition of vehicle occupants according to their determined mental condition. For example, the mental condition can be determined by measuring a heartbeat, a breathing frequency, eye movements or skin temperature changes, by registering keywords or keyword combinations and/or by means of an EEG. Multiple adequate reactions in the form of an output can be prepared according to a determined mental status. Furthermore, system warnings and information can be put into an individual communication style with special language, voices and tone; the mental companion can ask questions, make remarks, sing songs together with the occupants, recommend breaks, food, sunglasses or drinks, make faces, tell jokes, read books, mails or short messages, make gestures to running videos and/or make comments after driving actions. The mental companion could also start conversations between passengers and drivers with funny questions and by telling about already known behaviors of the driver, making remarks, making jokes, imitating a star, making compliments for good acting, or by showing dependent behavior or using some other form of psychological projection.
The mental companion can be provided by a little animated robot on a dashboard with a face, arms, hands, little touch displays, symbols and/or smart graphics. However, it could also simply be provided by a display 30. The benefits for the occupants of the motor vehicle are plentiful. The mental companion can stabilize and improve the mental condition of the occupants, especially in a dangerous situation and/or to encourage safe driving behavior. The mental companion could also diagnose abnormal behaviors and/or include a therapy for diagnosed abnormal behaviors. Furthermore, the interaction with the vehicle occupants can be entertaining. Due to the storage of determined mental conditions and/or reactions of the vehicle occupant in a database device, especially in the form of a vehicle occupant profile, the outputs given by the interface device 10 can be in dependency on current and/or past behaviors and/or can also consider other information about the vehicle occupant, such as his reaction to certain outputs given and/or how his mental condition can be influenced most effectively.

Claims

PATENT CLAIMS:
1. A method for controlling an interface device (10) of a motor vehicle, comprising the following steps:
- capturing at least one physiological parameter of a vehicle occupant, especially a driver of the motor vehicle, by means of a sensor device (14); (12)
- determining a mental condition of the vehicle occupant by analyzing the captured physiological parameter; (22)
- controlling an output device (28) of the interface device (10) to give at least one output in dependency on the analyzed mental condition of the occupant in order to influence his mental condition. (26)
2. The method according to claim 1,
characterized in that
the physiological parameter of the occupant is captured by recording his skin temperature, his skin resistance, his facial expression, his gesture, his voice, his eyes, and/or the force of the grip of his hands on a steering wheel and/or other surfaces.
3. The method according to claim 1 or 2,
characterized in that
the output comprises an avatar.
4. The method according to any one of the preceding claims,
characterized in that
at least one mood category is predetermined in a database and the mental condition is determined by allocating each mood category an index number in dependency on the captured physiological parameter.
5. The method according to any one of the preceding claims,
characterized in that
the mental condition of the occupant is also determined in dependency on at least one preceding mental condition of the occupant.
6. The method according to any one of the preceding claims,
characterized in that
at least the steps of capturing the physiological parameters (12), determining the mental condition (22), and controlling the output device (28) (26) are repeated until the determined mental condition has changed to a predetermined mental condition.
7. The method according to claim 6,
characterized in that
a change of the mental condition of the vehicle occupant as a reaction to a first output is captured by the means of the sensor device (14) and a second output afterwards is given in dependency on this reaction of the vehicle occupant to the first output.
8. The method according to any one of the preceding claims,
characterized in that
at least one determined mental condition and/or reaction of the vehicle occupant is saved to a database device (38) and future outputs are also given in dependency on this saved mental condition and/or reaction of the vehicle occupant.
9. The method according to any one of the preceding claims,
characterized in that
the output is also given in dependency on a driving condition, vehicle condition and/or vehicle environment.
10. An interface device (10) designed to perform a method according to any one of the preceding claims.
PCT/EP2016/000985 2015-06-19 2016-06-14 A method for controlling an interface device of a motor vehicle WO2016202450A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510347671.X 2015-06-19
CN201510347671.XA CN106293383A (en) 2015-06-19 2015-06-19 For the method controlling the interface device of motor vehicles

Publications (1)

Publication Number Publication Date
WO2016202450A1 true WO2016202450A1 (en) 2016-12-22

Family

ID=56203305

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/000985 WO2016202450A1 (en) 2015-06-19 2016-06-14 A method for controlling an interface device of a motor vehicle

Country Status (2)

Country Link
CN (1) CN106293383A (en)
WO (1) WO2016202450A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017216837A1 (en) * 2017-09-22 2019-03-28 Audi Ag Gesture and facial expression control for a vehicle
FR3088604B1 (en) * 2018-11-21 2021-07-23 Valeo Systemes Thermiques Interactive system with an occupant of a motor vehicle

Citations (7)

Publication number Priority date Publication date Assignee Title
EP1374773A1 (en) * 2002-06-27 2004-01-02 Pioneer Corporation System for informing of driver's mental condition
US20060149428A1 (en) * 2005-01-05 2006-07-06 Kim Jong H Emotion-based software robot for automobiles
DE102006060849A1 (en) * 2006-12-22 2008-07-03 Daimler Ag Vehicle driver condition e.g. fatigue, determining method, involves developing probabilistic model representing cumulated probabilities, and determining condition as driver condition based on cumulated probabilities using model
DE102007037073A1 (en) * 2007-08-06 2009-02-12 Bayerische Motoren Werke Aktiengesellschaft Information i.e. direction information, relaying device for motor vehicle, has drive section of display device controllable by control device for adjusting directional signal element in displayed spatial direction
DE102011109564A1 (en) * 2011-08-05 2013-02-07 Daimler Ag Method and device for monitoring at least one vehicle occupant and method for operating at least one assistance device
US20130131905A1 (en) * 2011-11-17 2013-05-23 GM Global Technology Operations LLC System and method for closed-loop driver attention management
US20140240132A1 (en) * 2013-02-28 2014-08-28 Exmovere Wireless LLC Method and apparatus for determining vehicle operator performance

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US9839856B2 (en) * 2008-03-11 2017-12-12 Disney Enterprises, Inc. Method and system for providing interactivity based on sensor measurements
EP2214121B1 (en) * 2009-01-30 2012-05-02 Autoliv Development AB Safety system for a motor vehicle
US8698639B2 (en) * 2011-02-18 2014-04-15 Honda Motor Co., Ltd. System and method for responding to driver behavior
US20150302718A1 (en) * 2014-04-22 2015-10-22 GM Global Technology Operations LLC Systems and methods for interpreting driver physiological data based on vehicle events

Also Published As

Publication number Publication date
CN106293383A (en) 2017-01-04


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16731801

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16731801

Country of ref document: EP

Kind code of ref document: A1