WO2017189203A1 - System and method for identifying and responding to passenger interest in autonomous vehicle events - Google Patents

System and method for identifying and responding to passenger interest in autonomous vehicle events

Info

Publication number
WO2017189203A1
Authority
WO
WIPO (PCT)
Prior art keywords
passenger
attentiveness
driving
event
change
Prior art date
Application number
PCT/US2017/026406
Other languages
French (fr)
Inventor
Mona Singh
Original Assignee
Pcms Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pcms Holdings, Inc.
Publication of WO2017189203A1

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/06Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893Cars
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/20Workers
    • A61B2503/22Motor vehicles operators, e.g. drivers, pilots, captains
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0872Driver physiology
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00Input parameters relating to overall vehicle dynamics
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/221Physiology, e.g. weight, heartbeat, health or special needs
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/223Posture, e.g. hand, foot, or seat position, turned or inclined
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/225Direction of gaze
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/50External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data

Definitions

  • the term "passenger” identifies a passenger in the vehicle.
  • the passenger may be the "driver” (person in charge of operating the autonomous vehicle and driving the autonomous vehicle if there is a manual mode available).
  • the passenger may be someone else besides the "driver” in the autonomous vehicle.
  • the systems and methods of the present disclosure address the problem of a passenger receiving too much information from an autonomous vehicle, much of which may not be of interest to the passenger.
  • the present systems and methods monitor passenger interest and attentiveness, detect abnormal driving events that are of interest to the passenger, and identify and present vehicle-sensor data to the passenger in a manner that quickly informs the passenger regarding a recent driving event of the autonomous vehicle.
  • autonomous vehicles may be switchable between an autonomous driving mode and a manual driving mode. Transitions from autonomous to human driving call for a transition period to bring the user back into full vehicle control engagement and situational awareness.
  • Exemplary systems and methods involve (i) a passenger's observed response being matched to a prior driving event of an autonomous vehicle, and (ii) information recorded prior to the matched driving event being selected and presented to the passenger. Exemplary embodiments thus operate to determine whether the passenger's response was likely triggered by an abnormal driving event and then to select information relevant to that event, for example creating a media presentation based on the passenger's anticipated interest.
  • One embodiment takes the form of a method for identifying and displaying driving-relevant information to a passenger, including (a) continually recording driving events as well as sensor data (including images) collected while the vehicle is operating, (b) identifying two correlated situations, an abnormal driving event and the passenger's spike in interest in that event, including (i) identifying a driving event that would be noticeable to a passenger (e.g., a sufficiently high acceleration or deceleration, swerve, sharp turn, and sudden lane change), (ii) identifying a change in the passenger's attentiveness of a sufficiently large magnitude, including posture, facial indications of emotion, gaze direction, and physiological state (e.g., heart rate), and (iii) identifying the co-occurrence of these with the event occurring slightly prior (e.g. within one second) to the passenger's change in attentiveness, (c) determining whether the abnormal event is among the predetermined set of events, (d) determining the sensor data recorded immediately prior to the abnormal event and pertaining to the abnormal event, and (e) presenting the sensor data in suitable form, e.g., as images or graphs.
  • a method includes maintaining driving-event data with respect to operation of an autonomous vehicle, maintaining passenger-attentiveness data reflecting a degree of passenger attentiveness with respect to the operation of the autonomous vehicle, making a passenger-explanation determination at least in part by determining (i) that the driving-event data indicates that an abnormal driving event occurred at a first time, (ii) that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time, and (iii) that the second time is less than a threshold time delta after the first time, and responsive to making the passenger-explanation determination, presenting, via a user interface of the autonomous vehicle, a contemporaneous subset of the maintained driving-event data, the contemporaneous subset being associated with the abnormal driving event.
  • FIG. 1 depicts a circumplex model of emotions.
  • FIG. 2 is a functional block diagram of an exemplary system architecture, in accordance with some embodiments.
  • FIG. 3 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display during normal driving conditions.
  • FIG. 4 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display after detection of an abnormal driving event caused, for example, by an animal in the road.
  • FIG. 5 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display after detection of an abnormal driving event caused, for example, by icy driving conditions.
  • FIG. 6 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display after detection of an abnormal driving event caused, for example, by a vehicle entering an intersection.
  • FIG. 7 is a flowchart of a method, in accordance with some embodiments.
  • FIG. 8 illustrates an exemplary wireless transmit/receive unit (WTRU) that may be employed as a vehicular computing system in some embodiments.
  • Systems and methods are disclosed herein to determine passenger interest in autonomous vehicle driving events and present relevant information.
  • the present systems and methods relate to improvements to user interfaces to reduce the cognitive overload on a passenger.
  • the present systems and methods provide an approach that (1) gauges a passenger's attentiveness with respect to operation of an autonomous vehicle on an ongoing basis; (2) determines the abnormal driving events that provoke a spike in passenger attentiveness; (3) determines sensor data relevant to those events; and (4) presents such data to the passenger.
  • At least one benefit of the system is in improving passenger experience by bringing forth relevant information that serves to explain abnormal driving events of the autonomous vehicle. In this manner, the system helps to enhance trust in an autonomous vehicle and can help to improve passenger acceptance of autonomous vehicles.
  • the present systems and methods select a contemporaneous set of maintained driving-event data to present to a passenger based on what would be relevant in the particular situation.
  • the present systems and methods track the autonomous vehicle's driving on a continual basis. Any change in the autonomous vehicle's driving corresponds to a driving event of the autonomous vehicle, e.g., accelerating, braking, changing lanes, swerving around an object, sounding a horn, and so on.
  • the vehicle identifies facts relevant to that change—that is, contributing factors in the abnormal driving event.
  • the autonomous vehicle may then display these contributing facts as a contemporaneous subset of driving-event data to serve as an explanation of the change.
  • the present systems and methods determine that the change is relevant based on the passenger's reaction to the change.
  • Exemplary techniques that may be employed to detect a passenger's reaction include techniques for using facial expressions, gaze direction, and facial flushing to determine a person's level of alertness, direction of attention, and emotional status, as described in Olderbak et al., "Psychometric challenges and proposed solutions when scoring facial emotion expression codes," Behavior Research Methods, 2014; 46(4): 992-1006.
  • Additional techniques that may be employed to determine a passenger's level of attention are those described in European Patent Application Publication No. EP 1843591 A1, entitled "Intelligent media content playing device with passenger attention detection, corresponding method and carrier medium."
  • FIG. 1 depicts a circumplex model of emotions.
  • FIG. 1 depicts Russell's circumplex model of emotions as described in James A. Russell, A Circumplex Model of Affect, Journal of Personality and Social Psychology, 39(6): 1161-1178, 1980.
  • This model represents an emotional reaction as a pair of dimensions (arousal and valence). Each dimension may range from -1 to +1 (or across some other selected range).
  • sleepy is 0 on the X-axis (valence) and -1 on the Y-axis (arousal).
  • alarmed is -0.1 on the X-axis and 0.95 on the Y-axis.
  • Some embodiments employ posture recognition, which may be performed using cameras or other sensors.
  • FIG. 2 is a functional block diagram of an exemplary architecture of a passenger alert system, in accordance with some embodiments. As shown, FIG. 2 depicts the components of a system comprising passenger sensors 202, a Passenger Attentiveness Estimation Module 204, an Abnormal Event Detection Module 206, a Vehicle State Estimation Module 208, a Vehicle Sensor Log 210, a Presentation Controller 212, and a display 214.
  • the Passenger Attentiveness Estimation Module 204 receives input from various passenger sensors 202 to maintain passenger-attentiveness data reflecting a degree of the passenger's attentiveness with respect to the operation of the autonomous vehicle.
  • the passenger's attentiveness may include factors such as the passenger's emotional state, attitude, wakefulness, attention, and engagement.
  • the Passenger Attentiveness Estimation Module 204 determines whether there is a sufficiently rapid change in the passenger's attentiveness to the operation of the autonomous vehicle. For example, the Passenger Attentiveness Estimation Module 204 may determine whether the passenger wakes up, looks up sharply from his or her reading, turns around from a conversation with a second passenger, appears startled, or utters a vocal exclamation, among other reactions.
  • the Vehicle State Estimation Module 208 receives input from various vehicle sensors (not shown) to maintain driving-event data with respect to operation of the autonomous vehicle to determine if the vehicle has a sufficiently large change in state elements that would be noticeable to a passenger. These elements include acceleration or deceleration, swerving, lane changes, noise increase or reduction, and so on.
  • the Abnormal Event Detection Module 206 receives the passenger-attentiveness data from the Passenger Attentiveness Estimation Module 204 and the driving-event data from the Vehicle State Estimation Module 208 on a continual basis.
  • the Abnormal Event Detection Module 206 may make a passenger-explanation determination by determining (i) that the driving-event data indicates that an abnormal driving event occurred at a first time, (ii) that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time, and (iii) that the second time is less than a threshold time delta after the first time.
  • the Abnormal Event Detection Module 206 may determine whether the change in the passenger's attentiveness at the second time occurred within a short period (e.g. corresponding to a human reaction time of approximately 1 second) of the occurrence of the abnormal driving event at the first time. In response to making the passenger-explanation determination, the Abnormal Event Detection Module 206 determines what instructions to send to the Presentation Controller 212 based on the nature of the abnormal event it detects and based on the available and relevant sensor data.
  • the Presentation Controller 212 provides a contemporaneous set of driving-event data to the display 214 to display the appropriate sensor data to the passenger.
  • the Presentation Controller 212 preferably displays the contemporaneous set of driving-event data for a sufficient period of time such that the passenger can process the displayed data.
  • the Presentation Controller 212 also ensures that the display is sufficiently current to remain relevant.
  • the Presentation Controller 212 presents the data for about 10 seconds.
  • the contemporaneous set of driving-event data corresponds to data recorded from shortly before the abnormal driving event up to the time of the spike in passenger attentiveness.
  • Identifying an abnormal driving event that would be noticeable to a passenger may be performed using a set of selected relevant parameters of the vehicle state and, for each selected parameter, a rate-of-change threshold.
  • for deceleration, for example, the relevant parameter is the speed of the vehicle, and the rate-of-change threshold may be 25 ft/s².
  • a change of sufficiently large magnitude in the passenger's attentiveness with respect to the operation of the autonomous vehicle may be identified from indicators including posture, facial indications of emotion, gaze direction, and physiological state (e.g., heart rate). In particular, the surprise (or startle) response of a passenger may be determined automatically using techniques such as those employed by, for example, the FACET 2.1 software development kit from Emotient for detecting emotion using video images of a subject, as described in the Emotient press release "Emotient Launches New Software Development Kit for Real-Time Emotion Recognition," PR Newswire, Dec. 12, 2013.
  • the determination of a change in passenger attentiveness is represented using Russell's model.
  • Passenger attentiveness may be determined based on a passenger's attitude, which in Russell's model is represented as a pair of values along two dimensions (arousal and valence).
  • determining that a passenger experiences a spike in attentiveness includes computing a score according to Russell's model.
  • the computed score is the sum of the following scores and, if the score is 1 or greater, that is considered to be of a sufficient magnitude:
    a. The net change in the arousal value (Y-axis in Russell's model), positive or negative.
    b. The net change in the valence value (X-axis in Russell's model), positive or negative.
    c. Change in the passenger's posture, where sleeping or relaxed is 0 and concentrating and sitting up is 1.
    d. Change in the passenger's gaze, where not gazing at the vehicle or road is 0, looking at the vehicle is 0.5, and gazing at the road in front is 1.
    e. If there is a sudden head movement, then 0.5.
    f. If there is a loud exclamation by the passenger (e.g., "Whoa"), then 0.5.
  • the computed score may be used to identify a spike in passenger attentiveness, and thus to determine whether to display information regarding an abnormal driving event. For example, if the change in passenger's attentiveness is of magnitude 1 or greater, and occurs within 1 second of an abnormal driving event, then a contemporaneous set of information gathered from the vehicle sensors regarding the abnormal driving event may be displayed to the passenger.
  • Some advantages of the system may include (a) reducing a passenger's cognitive overload, (b) providing a passenger with the most relevant information at the right time, and (c) enhancing a passenger's trust in an autonomous vehicle.
  • an autonomous vehicle is equipped with a heads-up display.
  • the heads-up display may, for example, provide an overlay view indicating where the autonomous vehicle is headed.
  • the overlay may highlight the road ahead of the vehicle.
  • the heads-up display may also display an aggregate safety score, e.g. a score from 0-100. A high score (e.g. near 100) may indicate that the autonomous vehicle has a high level of confidence (e.g. its sensor readings are reliable, consistent with one another, and do not indicate any elevated risks) and that the passenger has no need for concern.
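  • one way such a score might be computed is sketched below. The patent specifies only the 0-100 range and its qualitative meaning, so the formula, the function name, and the penalty factor are illustrative assumptions:

```python
def aggregate_safety_score(sensor_confidences, elevated_risk):
    """Map per-sensor confidence values (0.0-1.0) and a risk flag to a
    0-100 safety score. Near 100 means readings are reliable, mutually
    consistent, and show no elevated risks."""
    if not sensor_confidences:
        return 0
    score = 100.0 * min(sensor_confidences)  # weakest sensor dominates (assumed)
    if elevated_risk:
        score *= 0.5                         # penalty factor (assumed)
    return int(round(score))

print(aggregate_safety_score([0.99, 0.97, 0.98], elevated_risk=False))  # 97
```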
  • FIG. 3 illustrates an exemplary map display 300, which may for example be provided on a dashboard display or other interior display of an autonomous vehicle.
  • the map display of FIG. 3 corresponds to an ordinary driving condition, e.g. where no abnormal driving event has occurred.
  • the dashboard map display may highlight the important roads (e.g. the roads along a route being navigated by the autonomous vehicle shown as vehicle 302).
  • the system may determine contemporaneous subsets of driving-event data for display on the heads-up display. Several examples of identifying such contemporaneous subsets are given below.
  • the autonomous vehicle may have to brake suddenly, and the passenger looks up.
  • the system makes a passenger-explanation determination by determining both (i) that the passenger has a spike in attentiveness with respect to the driving of the autonomous vehicle and (ii) that an abnormal driving event occurred to which the passenger's attentiveness may be attributed.
  • the system changes the dashboard display to show a map with an indication of where the abnormal driving event occurred as well as an image or video (or other key data) describing what led to that event, as shown in FIGs. 4-6.
  • FIG. 4 illustrates an exemplary map display 400 under conditions in which an abnormal driving event has been detected.
  • an icon 404 (e.g. an exclamation point) may be used to indicate the location on the map at which the abnormal driving event occurred.
  • the map display further provides a contemporaneous subset of the driving-event data indicating the cause of the event.
  • the autonomous vehicle 402 may have swerved suddenly, and the map display includes an image 406 of an obstacle that caused the vehicle to swerve (e.g. a deer).
  • the image 406 may be obtained via sensors and/or cameras disposed on the vehicle.
  • FIG. 5 illustrates another exemplary map display 500 under conditions in which an abnormal driving event has been detected.
  • an icon 504 may be used to indicate the location on the map at which the abnormal event occurred.
  • the map display further includes a contemporaneous subset of the driving-event data indicating the cause of the event.
  • the autonomous vehicle 502 may have begun operating at an abnormally low speed (e.g. at a speed less than 80% of the posted speed limit) or may be maintaining an abnormally large gap with the vehicle ahead (e.g. a 4-second gap rather than a 2-second gap), and the map display may indicate 506 that the cause of the abnormally low speed is icy road conditions.
  • the vehicle may have knowledge of speed limits for a certain road, and may determine that the vehicle is traveling at less than 80% of the posted speed limit.
  • the vehicle may be equipped with depth sensors to determine that the vehicle is maintaining an unusually large distance with respect to the vehicle ahead.
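  • these two checks might be sketched as follows; the 80% fraction and the 4-second headway come from the examples above, while the function names and the headway formulation are assumptions:

```python
def speed_abnormally_low(speed_mph, posted_limit_mph, fraction=0.8):
    """Abnormally low speed: below 80% of the posted limit (per the text)."""
    return speed_mph < fraction * posted_limit_mph

def gap_abnormally_large(gap_ft, speed_ftps, abnormal_headway_s=4.0):
    """Unusually large following gap, measured as headway in seconds
    (e.g. a 4-second gap rather than the typical 2-second gap)."""
    return speed_ftps > 0 and gap_ft / speed_ftps >= abnormal_headway_s

assert speed_abnormally_low(40.0, 55.0)    # 40 mph on a 55 mph road
assert gap_abnormally_large(260.0, 60.0)   # ~4.3 s headway at 60 ft/s
```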
  • FIG. 6 illustrates a further exemplary map display 600 under conditions in which an abnormal driving event has been detected.
  • an icon 604 may be used to indicate the location on the map at which the abnormal event occurred.
  • the map display further includes a contemporaneous subset of the driving-event data indicating the cause of the event.
  • the autonomous vehicle 602 may have braked suddenly because the vehicle detected a risk that another vehicle would enter the intersection.
  • the contemporaneous subset of driving-event data takes the form of a video 606 that has been recorded using a forward-facing camera of the vehicle.
  • a passenger in the autonomous vehicle may have the opportunity to select playback of the video 606.
  • playback may be initiated by pressing a triangular "play" icon on a touch screen.
  • the video associated with the abnormal driving event may be automatically limited to, for example, a period running from two seconds before the abnormal event to two seconds after the abnormal event. Other time periods may be used as well.
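  • a minimal sketch of this windowing step, using the two-second margins from the example above as defaults (the function name is illustrative):

```python
def clip_window(event_time_s, pre_s=2.0, post_s=2.0):
    """Return (start, end) timestamps bounding the video associated with an
    abnormal driving event: from two seconds before to two seconds after
    the event by default; other periods may be used."""
    return (event_time_s - pre_s, event_time_s + post_s)

print(clip_window(125.0))  # (123.0, 127.0)
```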
  • the system responds to an emotional reaction of any passengers in the vehicle.
  • the system provides a contemporaneous subset of driving-event data using data from sensors other than video cameras where appropriate.
  • the system may show some sensor reading indicating black ice, which may not necessarily be captured by a camera.
  • the system automatically adjusts to passenger responses. For example, if passenger interest is not detected in response to an abnormal driving event, then the threshold for considering an event of that type to be "abnormal" may be raised. For instance, if the passenger does not exhibit a response when the vehicle decelerates at a rate of 0.2g, then the threshold for abnormal deceleration may be raised to 0.21g, for example. Similarly, if the passenger does not exhibit a response when the vehicle turns with a lateral acceleration of 0.1g, then the threshold for abnormal turning may be raised to 0.11g, for example.
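  • a minimal sketch of such threshold adaptation is shown below; the multiplicative step size is an assumption, since the text gives only example before-and-after values:

```python
class AdaptiveThreshold:
    """Raise the 'abnormal' threshold for an event type whenever an event
    of that type fails to draw a passenger response."""

    def __init__(self, initial_g, raise_factor=1.05):  # step size assumed
        self.threshold_g = initial_g
        self.raise_factor = raise_factor

    def record_event(self, passenger_responded):
        if not passenger_responded:
            self.threshold_g *= self.raise_factor

deceleration = AdaptiveThreshold(initial_g=0.2)
deceleration.record_event(passenger_responded=False)
print(round(deceleration.threshold_g, 3))  # 0.21
```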
  • a passenger is riding in an autonomous vehicle.
  • the system recognizes the passenger's sudden change in posture, e.g., his/her head has moved up.
  • the system also recognizes that his/her face shows a different expression from his/her previous more neutral expression, indicating that the passenger may have been startled by some event.
  • the system determines that the autonomous vehicle exhibited an abnormal driving behavior in the moments leading up to the passenger's visible reaction: the vehicle had braked suddenly one second before the passenger looked up.
  • the system identifies a contemporaneous set of data from the maintained driving event data that had been recorded in a time interval before the braking event.
  • This contemporaneous set of data includes a video stream showing an object in front along with a proximity sensor showing the distance to that object.
  • the system presents the video stream and a graph of the proximity-sensor readings on a map-like aerial view of where the vehicle braked.
  • the video stream shows a deer darting across the roadway at high speed. The deer may have been gone by the time the passenger looked up but seeing the video makes it clear to the passenger as to why the autonomous vehicle had to brake suddenly. The system thereby helps enhance a passenger's trust in the decision making of the autonomous vehicle.
  • FIG. 7 is a flowchart of a method 700, in accordance with some embodiments.
  • method 700 includes maintaining driving-event data with respect to operation of an autonomous vehicle (AV) at step 702. Further, at step 704 passenger-attentiveness data is maintained reflecting a degree of passenger attentiveness with respect to the operation of the AV.
  • a passenger-explanation determination is made at least in part by determining that the driving-event data indicates that an abnormal driving event occurred at a first time at step 706 and determining that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time at step 708.
  • the method determines that the second time is less than a threshold time delta after the first time.
  • a contemporaneous subset of the maintained driving-event data is presented at step 712 via a user interface of the AV, the contemporaneous subset associated with the abnormal driving event.
  • the occurrence of the abnormal driving event is determined by identifying at least one change-in-state indication from the driving-event data.
  • the change-in-state indication represents a change in vehicle velocity, a change in vehicle direction, a change in vehicle audible output, or a combination.
  • the method determines if the change-in-state indication exceeds a corresponding change-rate threshold to identify the at least one change-in-state indication.
  • the contemporaneous subset of the maintained driving-event data comprises driving-event data corresponding to a time frame prior to the first time up to the second time.
  • determining the spike in passenger attentiveness includes identifying a change-in-state of the degree of passenger attentiveness above a change-rate threshold.
  • the change-in-state of the degree of passenger attentiveness includes one or more of posture, facial indications of emotion, gaze direction, voice, and physiological state.
  • the passenger-attentiveness data indicating a spike in passenger attentiveness is determined in response to determining an abnormal driving event occurred.
  • the abnormal driving event is determined in response to determining that passenger- attentiveness data indicates a spike in passenger attentiveness.
  • various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules.
  • a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation.
  • Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as those commonly referred to as RAM, ROM, etc.
  • FIG. 8 is a system diagram of an exemplary WTRU 802, which may be employed as a vehicular computing system in embodiments described herein.
  • the WTRU 802 may include a processor 818, a communication interface 819 including a transceiver 820, a transmit/receive element 822, a speaker/microphone 824, a keypad 826, a display/touchpad 828, a non-removable memory 830, a removable memory 832, a power source 834, a global positioning system (GPS) chipset 836, and sensors 838.
  • the processor 818 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like.
  • the processor 818 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 802 to operate in a wireless environment.
  • the processor 818 may be coupled to the transceiver 820, which may be coupled to the transmit/receive element 822.
  • the transmit/receive element 822 may be configured to transmit signals to, or receive signals from, a base station over the air interface 816.
  • the transmit/receive element 822 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 822 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples.
  • the transmit/receive element 822 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 822 may be configured to transmit and/or receive any combination of wireless signals.
  • the WTRU 802 may include any number of transmit/receive elements 822. More specifically, the WTRU 802 may employ MIMO technology. Thus, in one embodiment, the WTRU 802 may include two or more transmit/receive elements 822 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 816.
  • the transceiver 820 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 822 and to demodulate the signals that are received by the transmit/receive element 822.
  • the WTRU 802 may have multi-mode capabilities.
  • the transceiver 820 may include multiple transceivers for enabling the WTRU 802 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
  • the processor 818 of the WTRU 802 may be coupled to, and may receive user input data from, the speaker/microphone 824, the keypad 826, and/or the display/touchpad 828 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 818 may also output user data to the speaker/microphone 824, the keypad 826, and/or the display/touchpad 828.
  • the processor 818 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 830 and/or the removable memory 832.
  • the non-removable memory 830 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 832 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
  • the processor 818 may access information from, and store data in, memory that is not physically located on the WTRU 802, such as on a server or a home computer (not shown).
  • the processor 818 may receive power from the power source 834, and may be configured to distribute and/or control the power to the other components in the WTRU 802.
  • the power source 834 may be any suitable device for powering the WTRU 802.
  • the power source 834 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, and the like.
  • the processor 818 may also be coupled to the GPS chipset 836, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 802.
  • the WTRU 802 may receive location information over the air interface 816 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 802 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
  • the processor 818 may further be coupled to other peripherals 838, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 838 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
  • Examples of computer-readable storage media include a read-only memory (ROM), a random-access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Psychiatry (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Automation & Control Theory (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Emergency Management (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A system and methods are described for maintaining driving-event data with respect to operation of an autonomous vehicle (AV), maintaining passenger-attentiveness data reflecting a degree of passenger attentiveness with respect to the operation of the AV, making a passenger-explanation determination at least in part by determining (i) that the driving-event data indicates that an abnormal driving event occurred at a first time, (ii) that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time, and (iii) that the second time is less than a threshold time delta after the first time, and responsive to making the passenger-explanation determination, presenting via a user interface of the AV a contemporaneous subset of the maintained driving-event data, the contemporaneous subset being associated with the abnormal driving event.

Description

SYSTEM AND METHOD FOR IDENTIFYING AND RESPONDING TO PASSENGER INTEREST IN AUTONOMOUS VEHICLE EVENTS
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application is a non-provisional filing of, and claims benefit under 35 U.S.C. §119(e) from, U.S. Provisional Patent Application Serial No. 62/328,080, entitled "System and Method for Determining Driver Interest in Autonomous Vehicle Maneuvers and Presenting Information Thereon," filed April 27, 2016, the entirety of which is hereby incorporated herein by reference.
BACKGROUND
[0002] Autonomously driven vehicles, such as self-driving cars, are expected to increase in popularity in the coming years. One barrier to widespread acceptance of autonomous vehicles is a lack of trust by users that they will be safe despite ceding vehicular control to a computing system. It will continue to be important that users develop trust in these vehicles.
SUMMARY
[0003] Systems and methods are disclosed herein to determine passenger interest in an autonomous vehicle driving event and to present relevant information. Herein, the term "passenger" identifies a passenger in the vehicle. For example, the passenger may be the "driver" (person in charge of operating the autonomous vehicle and driving the autonomous vehicle if there is a manual mode available). Alternatively, the passenger may be someone else besides the "driver" in the autonomous vehicle. Among other problems, the systems and methods of the present disclosure address the problem of a passenger receiving too much information from an autonomous vehicle, much of which may not be of interest to the passenger. Moreover, the present systems and methods monitor passenger interest and attentiveness, detect abnormal driving events that are of interest to the passenger, and identify and present vehicle-sensor data to the passenger in a manner that quickly informs the passenger regarding a recent driving event of the autonomous vehicle. In addition, autonomous vehicles may be switchable between an autonomous driving mode and a manual driving mode. Transitions from autonomous to human driving call for a transition period to bring the user back into full vehicle control engagement and situational awareness.
[0004] Exemplary systems and methods involve (i) a passenger's observed response being matched to a prior driving event of an autonomous vehicle, and (ii) information recorded prior to the matched driving event being selected and presented to the passenger. Exemplary embodiments thus operate to determine whether the passenger's response was likely triggered by an abnormal driving event and then to select information relevant to that event, for example creating a media presentation based on the passenger's anticipated interest.
[0005] One embodiment takes the form of a method for identifying and displaying driving-relevant information to a passenger, including (a) continually recording driving events as well as sensor data (including images) collected while the vehicle is operating, (b) identifying two correlated situations: an abnormal driving event and the passenger's spike in interest in that event, including (i) identifying a driving event that would be noticeable to a passenger (e.g., a sufficiently high acceleration or deceleration, swerve, sharp turn, and sudden lane change), (ii) identifying a change in the passenger's attentiveness of a sufficiently large magnitude, including posture, facial indications of emotion, gaze direction, physiological state (e.g., heart rate), and (iii) identifying the co-occurrence of these with the event occurring slightly prior (e.g. within one second) to the passenger's change in attentiveness, (c) determining whether the abnormal event is among the predetermined set of events, (d) determining the sensor data recorded immediately prior to the abnormal event and pertaining to the abnormal event, and (e) presenting the sensor data in suitable form, e.g., as images or graphs.
[0006] In at least one embodiment, a method includes maintaining driving-event data with respect to operation of an autonomous vehicle, maintaining passenger-attentiveness data reflecting a degree of passenger attentiveness with respect to the operation of the autonomous vehicle, making a passenger-explanation determination at least in part by determining (i) that the driving-event data indicates that an abnormal driving event occurred at a first time, (ii) that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time, and (iii) that the second time is less than a threshold time delta after the first time, and responsive to making the passenger-explanation determination, presenting, via a user interface of the autonomous vehicle, a contemporaneous subset of the maintained driving-event data, the contemporaneous subset being associated with the abnormal driving event.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] FIG. 1 depicts a circumplex model of emotions.
[0008] FIG. 2 is a functional block diagram of an exemplary system architecture, in accordance with some embodiments.
[0009] FIG. 3 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display during normal driving conditions.
[0010] FIG. 4 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display after detection of an abnormal driving event caused, for example, by an animal in the road.
[0011] FIG. 5 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display after detection of an abnormal driving event caused, for example, by icy driving conditions.
[0012] FIG. 6 is a schematic illustration of a map view displayed in some embodiments on a vehicle dashboard display after detection of an abnormal driving event caused, for example, by a vehicle entering an intersection.
[0013] FIG. 7 is a flowchart of a method, in accordance with some embodiments.
[0014] FIG. 8 illustrates an exemplary wireless transmit/receive unit (WTRU) that may be employed as a vehicular computing system in some embodiments.
DETAILED DESCRIPTION
[0015] Systems and methods are disclosed herein to determine passenger interest in autonomous vehicle driving events and present relevant information. Among other aspects, the present systems and methods relate to improvements to user interfaces to reduce the cognitive overload on a passenger. The present systems and methods provide an approach that (1) gauges a passenger's attentiveness with respect to operation of an autonomous vehicle on an ongoing basis; (2) determines the abnormal driving events that provoke a spike in passenger attentiveness; (3) determines sensor data relevant to those events; and (4) presents such data to the passenger. At least one benefit of the system is in improving passenger experience by bringing forth relevant information that serves to explain abnormal driving events of the autonomous vehicle. In this manner, the system helps to enhance trust in an autonomous vehicle and can help to improve passenger acceptance of autonomous vehicles.
[0016] Emerging self-driving (including assisted-driving) vehicles create special challenges for passenger interfaces. One potential existing solution is to provide a large amount of information on the available displays. This, however, may lead to passengers being overwhelmed with information. Whereas a traditional vehicle's human passenger may monitor the road and an instrument panel, in the case of self-driving vehicles, the passenger may also monitor a display showing some combination of an aerial view, various dynamically changing graphs, and various augmented views in the windshield.
[0017] One difficulty with the strategy of displaying a large amount of information is that it contributes to the problem by overloading the passenger with information that is mostly irrelevant to the passenger much of the time. Traditionally, vehicles do present data such as speed and fuel available. However, this information is not related to the driving as such and is not presented in response to an implicit expression of a passenger's interest.
[0018] The present systems and methods select a contemporaneous set of maintained driving-event data to present to a passenger based on what would be relevant in the particular situation. The present systems and methods track the autonomous vehicle's driving on a continual basis. Any change in the autonomous vehicle's driving corresponds to a driving event of the autonomous vehicle, e.g., accelerating, braking, changing lanes, swerving around an object, sounding a horn, and so on. Whenever any of a predetermined set of changes occurs in the autonomous vehicle's driving, the vehicle identifies facts relevant to that change, that is, contributing factors in the abnormal driving event. The autonomous vehicle may then display these contributing facts as a contemporaneous subset of driving-event data to serve as an explanation of the change.
[0019] The present systems and methods determine that the change is relevant based on the passenger's reaction to the change. Exemplary techniques that may be employed to detect a passenger's reaction include techniques for using facial expressions, gaze direction, and facial flushing to determine a person's level of alertness, direction of attention, and emotional status, as described in Olderbak et al., "Psychometric challenges and proposed solutions when scoring facial emotion expression codes," Behavior Research Methods, 2014; 46(4): 992-1006. Additional techniques that may be employed to determine a passenger's level of attention are those described in European Patent Application Publication No. EP 1843591 A1, entitled "Intelligent media content playing device with passenger attention detection, corresponding method and carrier medium."
[0020] FIG. 1 depicts a circumplex model of emotions. In particular, FIG. 1 depicts Russell's circumplex model of emotions as described in James A. Russell, A Circumplex Model of Affect, Journal of Personality and Social Psychology, 39(6): 1161-1178, 1980. This model represents an emotional reaction as a pair of dimensions (arousal and valence). Each dimension may range from -1 to +1 (or across some other selected range). In the model of FIG. 1, sleepy is 0 on the X-axis (valence) and -1 on the Y-axis (arousal). Similarly, alarmed is -0.1 on the X-axis and 0.95 on the Y-axis. Some embodiments employ posture recognition, which may be performed using cameras or other sensors.
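As an illustration of how such a representation might be encoded, the sketch below (not part of the patent) stores a few emotion labels as (valence, arousal) pairs and computes the net change between two readings. Only the coordinates for sleepy and alarmed come from the text; the others are assumptions.

```python
# A minimal sketch of Russell's circumplex model as (valence, arousal) pairs.
EMOTION_COORDS = {
    "sleepy":  (0.0, -1.0),   # from the text
    "alarmed": (-0.1, 0.95),  # from the text
    "calm":    (0.6, -0.4),   # assumed coordinate
    "excited": (0.7, 0.7),    # assumed coordinate
}

def affect_delta(prev_emotion, curr_emotion):
    """Net change in (valence, arousal) between two emotional states."""
    pv, pa = EMOTION_COORDS[prev_emotion]
    cv, ca = EMOTION_COORDS[curr_emotion]
    return (cv - pv, ca - pa)

# A passenger going from sleepy to alarmed shows a large jump in arousal:
print(affect_delta("sleepy", "alarmed"))  # approximately (-0.1, 1.95)
```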
[0021] FIG. 2 is a functional block diagram of an exemplary architecture of a passenger alert system, in accordance with some embodiments. As shown, FIG. 2 depicts the components of a system comprising passenger sensors 202, a Passenger Attentiveness Estimation Module 204, an Abnormal Event Detection Module 206, a Vehicle State Estimation Module 208, a Vehicle Sensor Log 210, a Presentation Controller 212, and a display 214.
[0022] The Passenger Attentiveness Estimation Module 204 receives input from various passenger sensors 202 to maintain passenger-attentiveness data reflecting a degree of the passenger's attentiveness with respect to the operation of the autonomous vehicle. The passenger's attentiveness may include factors such as the passenger's emotional state, attitude, wakefulness, attention, and engagement. In particular, the Passenger Attentiveness Estimation Module 204 determines whether there is a sufficiently rapid change in the passenger's attentiveness to the operation of the autonomous vehicle. For example, the Passenger Attentiveness Estimation Module 204 may determine whether the passenger wakes up, looks up sharply from his or her reading, turns around from a conversation with a second passenger, appears startled, or utters a vocal exclamation, among other reactions.
[0023] The Vehicle State Estimation Module 208 receives input from various vehicle sensors (not shown) to maintain driving-event data with respect to operation of the autonomous vehicle to determine if the vehicle has a sufficiently large change in state elements that would be noticeable to a passenger. These elements include acceleration or deceleration, swerving, lane changes, noise increase or reduction, and so on.
[0024] The Abnormal Event Detection Module 206 receives the passenger-attentiveness data from the Passenger Attentiveness Estimation Module 204 and the driving-event data from the Vehicle State Estimation Module 208 on a continual basis. The Abnormal Event Detection Module 206 may make a passenger-explanation determination by determining (i) that the driving-event data indicates that an abnormal driving event occurred at a first time, (ii) that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time, and (iii) that the second time is less than a threshold time delta after the first time. In some embodiments, the Abnormal Event Detection Module 206 may determine whether the change in the passenger's attentiveness at the second time occurred within a short period (e.g. corresponding to a human reaction time of approximately 1 second) of the occurrence of the abnormal driving event at the first time. In response to making the passenger-explanation determination, the Abnormal Event Detection Module 206 determines what instructions to send to the Presentation Controller 212 based on the nature of the abnormal event it detects and based on the available and relevant sensor data.
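The timing check described above might be sketched as follows, assuming timestamps in seconds; the one-second window approximates the human reaction time mentioned in the text, and the function name is illustrative.

```python
REACTION_WINDOW_S = 1.0  # threshold time delta, approximating human reaction time

def passenger_explanation_determination(event_time_s, spike_time_s,
                                        threshold_s=REACTION_WINDOW_S):
    """Return True when a spike in passenger attentiveness (second time)
    occurs less than a threshold time delta after an abnormal driving
    event (first time), so the spike can be attributed to the event."""
    delta = spike_time_s - event_time_s
    return 0.0 <= delta < threshold_s

# Example: vehicle brakes at t=10.0 s, passenger looks up at t=10.6 s.
assert passenger_explanation_determination(10.0, 10.6)
```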
[0025] The Presentation Controller 212 provides a contemporaneous set of driving-event data to the display 214 to display the appropriate sensor data to the passenger. The Presentation Controller 212 preferably displays the contemporaneous set of driving-event data for a sufficient period of time such that the passenger can process the displayed data. The Presentation Controller 212 also ensures that the display is sufficiently current to remain relevant. In an exemplary embodiment, the Presentation Controller 212 presents the data for about 10 seconds. In some embodiments, the contemporaneous set of driving-event data corresponds to data recorded from shortly before the abnormal driving event up to the time of the spike in passenger attentiveness.
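A sketch of how the contemporaneous set might be selected from a timestamped sensor log follows; the LogEntry structure and the two-second pre-event margin are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class LogEntry:
    timestamp_s: float  # when the reading was recorded
    sensor: str         # e.g. "front_camera", "proximity"
    value: object       # frame, distance reading, etc.

def contemporaneous_subset(log, event_time_s, spike_time_s, pre_event_s=2.0):
    """Select entries recorded from shortly before the abnormal driving
    event up to the spike in passenger attentiveness."""
    start = event_time_s - pre_event_s
    return [e for e in log if start <= e.timestamp_s <= spike_time_s]

DISPLAY_DURATION_S = 10.0  # per the text, presented for about 10 seconds
```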
[0026] Identifying an abnormal driving event that would be noticeable to a passenger (e.g., a sufficiently high acceleration or deceleration, swerve, sharp turn, and sudden lane change) may be performed using a set of selected relevant parameters of the vehicle state and, for each selected parameter, a rate-of-change threshold. In an embodiment, for deceleration, the relevant parameter is the speed of the vehicle and the rate-of-change threshold may be 25 ft/s².
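This per-parameter check might be sketched as follows. Only the 25 ft/s² speed threshold comes from the text; the other parameters and thresholds are illustrative assumptions.

```python
# Rate-of-change thresholds per vehicle-state parameter. Only the speed
# threshold (25 ft/s^2) is given in the text; the rest are assumed.
RATE_THRESHOLDS = {
    "speed_ftps": 25.0,     # acceleration/deceleration, ft/s^2 (from text)
    "heading_deg": 15.0,    # deg/s, assumed proxy for swerves and sharp turns
    "lane_offset_ft": 6.0,  # ft/s, assumed proxy for sudden lane changes
}

def is_abnormal_change(parameter, prev_value, curr_value, dt_s):
    """Flag a noticeable driving event when a state parameter changes
    faster than its rate-of-change threshold."""
    rate = abs(curr_value - prev_value) / dt_s
    return rate > RATE_THRESHOLDS[parameter]

# Braking from 60 ft/s to 30 ft/s over one second is 30 ft/s^2 > 25 ft/s^2:
assert is_abnormal_change("speed_ftps", 60.0, 30.0, 1.0)
```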
[0027] A change of sufficiently large magnitude in the passenger's attentiveness with respect to the operation of the autonomous vehicle may be identified from indicators including posture, facial indications of emotion, gaze direction, and physiological state (e.g., heart rate). In particular, the surprise (or startle) response of a passenger may be determined automatically using techniques such as those employed by, for example, the FACET 2.1 software development kit from Emotient for detecting emotion using video images of a subject, as described in the Emotient press release "Emotient Launches New Software Development Kit for Real-Time Emotion Recognition," PR Newswire, Dec. 12, 2013.
[0028] In some embodiments, the determination of a change in passenger attentiveness is represented using Russell's model. Passenger attentiveness may be determined based on a passenger's attitude, which in Russell's model is represented as a pair of values along two dimensions (arousal and valence).
[0029] In some embodiments, determining that a passenger experiences a spike in attentiveness includes computing a score according to Russell's model. In some embodiments, the computed score is the sum of the following scores and, if the score is 1 or greater, that is considered to be of a sufficient magnitude:
a. The net change in the arousal value (Y-axis in Russell's model), positive or negative.
b. The net change in the valence value (X-axis in Russell's model), positive or negative.
c. Change in the passenger's posture, where sleeping or relaxed is 0 and concentrating and sitting up is 1.
d. Change in the passenger's gaze, where not gazing at the vehicle or road is 0, looking at the vehicle is 0.5, and gazing at the road in front is 1.
e. If there is a sudden head movement, then 0.5.
f. If there is a loud exclamation by the passenger (e.g., "Whoa"), then 0.5.
[0030] The computed score may be used to identify a spike in passenger attentiveness, and thus to determine whether to display information regarding an abnormal driving event. For example, if the change in the passenger's attentiveness is of magnitude 1 or greater and occurs within 1 second of an abnormal driving event, then a contemporaneous set of information gathered from the vehicle sensors regarding the abnormal driving event may be displayed to the passenger.
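The rubric of paragraphs [0029]-[0030] translates directly into the following sketch; only the function and parameter names are assumptions.

```python
# Attentiveness-spike score per the rubric above: sum the component scores
# and treat a total of 1 or greater as a spike of sufficient magnitude.
def attentiveness_score(d_arousal, d_valence, posture_delta, gaze_score,
                        sudden_head_movement=False, loud_exclamation=False):
    """d_arousal/d_valence: net changes on Russell's Y/X axes (sign ignored);
    posture_delta: 0 (sleeping/relaxed) to 1 (sitting up, concentrating);
    gaze_score: 0 (away), 0.5 (at vehicle), 1 (at road ahead)."""
    score = abs(d_arousal) + abs(d_valence) + posture_delta + gaze_score
    if sudden_head_movement:
        score += 0.5
    if loud_exclamation:
        score += 0.5
    return score

def is_attentiveness_spike(score, threshold=1.0):
    return score >= threshold
```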
[0031] If the system were to display all available information all the time, it would risk overwhelming the passenger, who would simply tune it out. Therefore, taking the passenger's interest into account may be helpful. In addition, by detecting the passenger's interest dynamically, embodiments disclosed herein operate to identify an abnormal event and to provide information that reassures the passenger that the abnormal event was justified.
[0032] Advantages of the system may include (a) reducing a passenger's cognitive overload, (b) providing a passenger with the most relevant information at the right time, and (c) enhancing a passenger's trust in an autonomous vehicle.
[0033] In some embodiments, an autonomous vehicle is equipped with a heads-up display. The heads-up display may, for example, provide an overlay view indicating where the autonomous vehicle is headed; the overlay may highlight the road ahead of the vehicle. The heads-up display may also show an aggregate safety score, e.g. a score from 0-100. A high score (e.g. near 100) may indicate that the autonomous vehicle has a high level of confidence (e.g. its sensor readings are reliable, consistent with one another, and do not indicate any elevated risks) and that the passenger has no need for concern.
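The disclosure does not specify how the aggregate safety score is computed; one plausible, entirely hypothetical aggregation of per-sensor confidence values is sketched below.

```python
# Hypothetical 0-100 safety score: the most pessimistic sensor confidence
# dominates, so any unreliable or risk-indicating sensor lowers the score.
def safety_score(sensor_confidences):
    """sensor_confidences: iterable of values in [0, 1]; returns an int 0-100."""
    values = list(sensor_confidences)
    return round(100 * min(values)) if values else 0
```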
[0034] FIG. 3 illustrates an exemplary map display 300, which may for example be provided on a dashboard display or other interior display of an autonomous vehicle. The map display of FIG. 3 corresponds to an ordinary driving condition, e.g. where no abnormal driving event has occurred. The dashboard map display may highlight the important roads (e.g. the roads along a route being navigated by the autonomous vehicle, shown as vehicle 302). In response to making a passenger-explanation determination, the system may determine contemporaneous subsets of driving-event data for display on the heads-up display. Several examples of identifying the contemporaneous subsets are given below.

[0035] In an exemplary scenario, the autonomous vehicle may have to brake suddenly, and the passenger looks up. In at least one embodiment, the system makes a passenger-explanation determination by determining both (i) that the passenger has a spike in attentiveness with respect to the driving of the autonomous vehicle and (ii) that an abnormal driving event occurred to which the passenger's attentiveness may be attributed. The system changes the dashboard display to show a map with an indication of where the abnormal driving event occurred, as well as an image or video (or other key data) describing what led to that event, as shown in FIGs. 4-6.
[0036] FIG. 4 illustrates an exemplary map display 400 under conditions in which an abnormal driving event has been detected. In the example of FIG. 4, an icon 404 (e.g. an exclamation point) may be used to indicate the location on the map at which the abnormal driving event occurred. The map display further provides a contemporaneous subset of the driving-event data indicating the cause of the event. For example, in the example of FIG. 4, the autonomous vehicle 402 may have swerved suddenly, and the map display includes an image 406 of an obstacle that caused the vehicle to swerve (e.g. a deer). In such an embodiment, the image 406 may be obtained via sensors and/or cameras disposed on the vehicle.
[0037] FIG. 5 illustrates another exemplary map display 500 under conditions in which an abnormal driving event has been detected. In the example of FIG. 5, an icon 504 may be used to indicate the location on the map at which the abnormal event occurred. The map display further includes a contemporaneous subset of the driving-event data indicating the cause of the event. In the example of FIG. 5, the autonomous vehicle 502 may have begun operating at an abnormally low speed (e.g. at a speed less than 80% of the posted speed limit) or may be maintaining an abnormally large gap with the vehicle ahead (e.g. a 4-second gap rather than a 2-second gap), and the map display may indicate 506 that the cause of the abnormally low speed is icy road conditions. In such an embodiment, the vehicle may have knowledge of speed limits for a certain road and may determine that it is traveling at less than 80% of the speed limit. Alternatively, the vehicle may be equipped with depth sensors to determine that it is maintaining an unusually large distance to the vehicle ahead.
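The two abnormality checks in this example may be sketched as follows; the 80% fraction and the 4-second gap come from the text, while the function and parameter names are assumptions.

```python
# Illustrative checks for the FIG. 5 scenario.
def abnormally_slow(speed_mph, posted_limit_mph, fraction=0.8):
    """True when traveling at less than 80% of the posted speed limit."""
    return speed_mph < fraction * posted_limit_mph

def abnormally_large_gap(gap_s, abnormal_gap_s=4.0):
    """gap_s: time headway to the lead vehicle, e.g., from depth sensors."""
    return gap_s >= abnormal_gap_s
```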
[0038] FIG. 6 illustrates a further exemplary map display 600 under conditions in which an abnormal driving event has been detected. In the example of FIG. 6, an icon 604 may be used to indicate the location on the map at which the abnormal event occurred. The map display further includes a contemporaneous subset of the driving-event data indicating the cause of the event. For example, in the example of FIG. 6, the autonomous vehicle 602 may have braked suddenly because the vehicle detected a risk that another vehicle would enter the intersection. In this example, the contemporaneous subset of driving-event data takes the form of a video 606 that has been recorded using a forward-facing camera of the vehicle. A passenger in the autonomous vehicle may have the opportunity to select playback of the video 606. In some embodiments, playback may be initiated by pressing a triangular "play" icon on a touch screen. The video associated with the abnormal driving event may be automatically limited to, for example, a period running from two seconds before the abnormal event to two seconds after the abnormal event. Other time periods may be used as well.
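A minimal sketch of the clip selection, assuming the exemplary two-second margins given above; the function name is an assumption.

```python
# Hypothetical playback bounds for the forward-camera video of an abnormal
# driving event; margins follow the two-seconds-before/after example.
def clip_bounds(t_event, before_s=2.0, after_s=2.0):
    """Return (start, end) timestamps for the video offered to the passenger."""
    return (t_event - before_s, t_event + after_s)
```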
[0039] In at least one embodiment, the system responds to an emotional reaction of any passengers in the vehicle.
[0040] In one embodiment, the system provides a contemporaneous subset of driving-event data using data from sensors other than video cameras where appropriate. For example, the system may show a sensor reading indicating black ice, a hazard that may not be captured by a camera.
[0041] In one embodiment, the system automatically adjusts to passenger responses. For example, if passenger interest is not detected in response to an abnormal driving event, then a threshold for considering an event of that type to be "abnormal" may be raised. For example, if the passenger does not exhibit a response when the vehicle decelerates at a rate of 0.2g, then a threshold for abnormal deceleration may be raised to 2.1g, for example. Similarly, if the passenger does not exhibit a response when the vehicle turns with a lateral acceleration of 0.1g, then a threshold for abnormal turning may be raised to 1.1g, for example.
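The adaptive adjustment may be sketched as follows; the multiplicative step is an assumption, since the text gives only before-and-after example values.

```python
# Hypothetical adaptation: raise the "abnormal" threshold for an event type
# when the passenger shows no response to an event of that type.
def relax_threshold(thresholds, event_type, factor=1.5):
    """thresholds: mutable mapping, e.g., {"deceleration_g": 0.2}."""
    thresholds[event_type] *= factor

# Example: no reaction to a 0.2 g deceleration raises the deceleration
# threshold, so comparable decelerations are no longer flagged as abnormal.
```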
Exemplary Embodiment Use Case: Passenger Reading.
[0042] In an exemplary use case, a passenger is riding in an autonomous vehicle. (S)he is comfortable and begins to read notes for an upcoming presentation on a handheld device. Suddenly, the autonomous vehicle brakes. The passenger looks up with a surprised look but sees nothing. In this scenario, the passenger would be confused as to what happened. (S)he would wonder whether the autonomous vehicle braked for an actual reason or whether there was a subtle malfunction.
[0043] In at least one embodiment, the system recognizes the passenger's sudden change in posture, e.g., that his/her head has moved up. The system also recognizes that his/her face shows a different expression from his/her previous, more neutral expression, indicating that the passenger may have been startled by some event. The system determines that the autonomous vehicle demonstrated an abnormal behavior in the moments leading up to the passenger's visible reaction: the vehicle had braked suddenly one second before the passenger looked up. The system identifies a contemporaneous set of data from the maintained driving-event data that had been recorded in a time interval before the braking event. This contemporaneous set of data includes a video stream showing an object in front, along with proximity-sensor readings showing the distance to that object. The system presents the video stream and a graph of the proximity-sensor readings on a map-like aerial view of where the vehicle braked. The video stream shows a deer darting across the roadway at high speed. The deer may have been gone by the time the passenger looked up, but the video makes clear to the passenger why the autonomous vehicle had to brake suddenly. The system thereby helps enhance a passenger's trust in the decision making of the autonomous vehicle.
[0044] FIG. 7 is a flowchart of a method 700, in accordance with some embodiments. As shown, method 700 includes maintaining driving-event data with respect to operation of an autonomous vehicle (AV) at step 702. Further, at step 704, passenger-attentiveness data is maintained reflecting a degree of passenger attentiveness with respect to the operation of the AV. A passenger-explanation determination is made at least in part by determining that the driving-event data indicates that an abnormal driving event occurred at a first time at step 706 and determining that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time at step 708. At step 710, the method determines that the second time is less than a threshold time delta after the first time. In response to making the passenger-explanation determination, a contemporaneous subset of the maintained driving-event data is presented at step 712 via a user interface of the AV, the contemporaneous subset being associated with the abnormal driving event.
[0045] In some embodiments, the occurrence of the abnormal driving event is determined by identifying at least one change-in-state indication from the driving-event data. In such embodiments, the change-in-state indication represents a change in vehicle velocity, a change in vehicle direction, a change in vehicle audible output, or a combination of these. In some embodiments, the method determines whether the change-in-state indication exceeds a corresponding change-rate threshold in order to identify the at least one change-in-state indication.
[0046] In some embodiments, the contemporaneous subset of the maintained driving-event data comprises driving-event data corresponding to a time frame from before the first time up to the second time.
[0047] In some embodiments, determining the spike in passenger attentiveness includes identifying a change-in-state of the degree of passenger attentiveness above a change-rate threshold. In such embodiments, the change-in-state of the degree of passenger attentiveness includes one or more of posture, facial indications of emotion, gaze direction, voice, and physiological state.

[0048] In some embodiments, the passenger-attentiveness data indicating a spike in passenger attentiveness is determined in response to determining that an abnormal driving event occurred. Alternatively, the abnormal driving event is determined in response to determining that the passenger-attentiveness data indicates a spike in passenger attentiveness.
[0049] Note that various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as memory commonly referred to as RAM, ROM, etc.
[0050] FIG. 8 is a system diagram of an exemplary WTRU 802, which may be employed as a vehicular computing system in embodiments described herein. As shown in FIG. 8, the WTRU 802 may include a processor 818, a communication interface 819 including a transceiver 820, a transmit/receive element 822, a speaker/microphone 824, a keypad 826, a display/touchpad 828, a non-removable memory 830, a removable memory 832, a power source 834, a global positioning system (GPS) chipset 836, and sensors 838. It will be appreciated that the WTRU 802 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
[0051] The processor 818 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 818 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 802 to operate in a wireless environment. The processor 818 may be coupled to the transceiver 820, which may be coupled to the transmit/receive element 822. While FIG. 8 depicts the processor 818 and the transceiver 820 as separate components, it will be appreciated that the processor 818 and the transceiver 820 may be integrated together in an electronic package or chip.

[0052] The transmit/receive element 822 may be configured to transmit signals to, or receive signals from, a base station over the air interface 816. For example, in one embodiment, the transmit/receive element 822 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 822 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples. In yet another embodiment, the transmit/receive element 822 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 822 may be configured to transmit and/or receive any combination of wireless signals.
[0053] In addition, although the transmit/receive element 822 is depicted in FIG. 8 as a single element, the WTRU 802 may include any number of transmit/receive elements 822. More specifically, the WTRU 802 may employ MIMO technology. Thus, in one embodiment, the WTRU 802 may include two or more transmit/receive elements 822 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 816.
[0054] The transceiver 820 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 822 and to demodulate the signals that are received by the transmit/receive element 822. As noted above, the WTRU 802 may have multi-mode capabilities. Thus, the transceiver 820 may include multiple transceivers for enabling the WTRU 802 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
[0055] The processor 818 of the WTRU 802 may be coupled to, and may receive user input data from, the speaker/microphone 824, the keypad 826, and/or the display/touchpad 828 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 818 may also output user data to the speaker/microphone 824, the keypad 826, and/or the display/touchpad 828. In addition, the processor 818 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 830 and/or the removable memory 832. The non-removable memory 830 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 832 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 818 may access information from, and store data in, memory that is not physically located on the WTRU 802, such as on a server or a home computer (not shown).
[0056] The processor 818 may receive power from the power source 834, and may be configured to distribute and/or control the power to the other components in the WTRU 802. The power source 834 may be any suitable device for powering the WTRU 802. As examples, the power source 834 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, and the like.
[0057] The processor 818 may also be coupled to the GPS chipset 836, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 802. In addition to, or in lieu of, the information from the GPS chipset 836, the WTRU 802 may receive location information over the air interface 816 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 802 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
[0058] The processor 818 may further be coupled to other peripherals 838, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 838 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
[0059] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Claims

1. A method comprising:
maintaining driving-event data with respect to operation of an autonomous vehicle (AV);
maintaining passenger-attentiveness data reflecting a degree of passenger attentiveness with respect to the operation of the AV;
making a passenger-explanation determination at least in part by determining:
that the driving-event data indicates that an abnormal driving event occurred at a first time;
that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time; and
that the second time is less than a threshold time delta after the first time; and
responsive to making the passenger-explanation determination, presenting via a user interface of the AV a contemporaneous subset of the maintained driving-event data, the contemporaneous subset being associated with the abnormal driving event.
2. The method of claim 1, wherein determining that the abnormal driving event occurred comprises identifying at least one change-in-state indication from the driving-event data.
3. The method of claim 2, wherein the change-in-state indication represents one of: a change in vehicle velocity, a change in vehicle direction, and a change in vehicle audible output.
4. The method of claim 2, wherein identifying the at least one change-in-state indication comprises determining if the change-in-state indication exceeds a corresponding change-rate threshold.
5. The method as in any of claims 1-4, wherein the contemporaneous subset of the maintained driving-event data comprises driving-event data corresponding to a time frame prior to the first time up to the second time.
6. The method as in any of claims 1-5, wherein determining the spike in passenger attentiveness comprises identifying a change-in-state of the degree of passenger attentiveness above a change-rate threshold.
7. The method of claim 6, wherein the change-in-state of the degree of passenger attentiveness comprises one of: posture, facial indications of emotion, gaze direction, voice, and physiological state.
8. The method as in any of claims 1-7, wherein the passenger-attentiveness data indicating a spike in passenger attentiveness is determined in response to determining an abnormal driving event occurred.
9. The method as in any of claims 1-7, wherein the abnormal driving event is determined in response to determining that passenger-attentiveness data indicates a spike in passenger attentiveness.
10. The method as in any of claims 1-9, wherein the driving-event data and the passenger-attentiveness data are obtained via vehicle sensors and passenger sensors, respectively.
11. An apparatus comprising:
a vehicle state estimation module configured to maintain driving-event data with respect to operation of an autonomous vehicle (AV);
a passenger attentiveness estimation module configured to maintain passenger-attentiveness data reflecting a degree of passenger attentiveness with respect to the operation of the AV;
an abnormal event detection module configured to make a passenger-explanation determination at least in part by determining:
that the driving-event data indicates that an abnormal driving event occurred at a first time;
that the passenger-attentiveness data indicates that a spike in passenger attentiveness occurred at a second time; and
that the second time is less than a threshold time delta after the first time; and
a presentation controller configured to present, via a user interface of the AV, a contemporaneous subset of the maintained driving-event data to a display in response to the abnormal event detection module making the passenger-explanation determination, the contemporaneous subset being associated with the abnormal driving event.
12. The apparatus of claim 11, wherein the abnormal event detection module is configured to identify at least one change-in-state indication from the driving-event data to determine that the abnormal driving event occurred.
13. The apparatus of claim 12, wherein the change-in-state indication represents one of: a change in vehicle velocity, a change in vehicle direction, and a change in vehicle audible output.
14. The apparatus of claim 12, wherein the abnormal event detection module is configured to determine if the change-in-state indication exceeds a corresponding change-rate threshold to identify the at least one change-in-state indication.
15. The apparatus as in any of claims 11-14, wherein the contemporaneous subset of the maintained driving-event data corresponds to driving-event data obtained prior to the first time up to the second time.
16. The apparatus as in any of claims 11-15, wherein the spike in passenger attentiveness corresponds to a change-in-state of the degree of passenger attentiveness above a change-rate threshold.
17. The apparatus of claim 16, wherein the change-in-state of the degree of passenger attentiveness comprises one of: posture, facial indications of emotion, gaze direction, voice, and physiological state.
18. The apparatus as in any of claims 11-17, wherein the abnormal event detection module is configured to determine the spike in passenger attentiveness in response to determining an abnormal driving event occurred.
19. The apparatus as in any of claims 11-17, wherein the abnormal event detection module is configured to determine that the abnormal driving event occurred in response to determining that the passenger-attentiveness data indicates a spike in passenger attentiveness.
20. The apparatus as in any of claims 11-19, wherein the driving-event data and the passenger-attentiveness data are obtained via vehicle sensors and passenger sensors, respectively.
PCT/US2017/026406 2016-04-27 2017-04-06 System and method for identifying and responding to passenger interest in autonomous vehicle events WO2017189203A1 (en)

Applications Claiming Priority (2)

US201662328080P, priority date 2016-04-27, filing date 2016-04-27
US62/328,080, priority date 2016-04-27

Publications (1)

Publication Number: WO2017189203A1

Family

ID=58578990

Family Applications (1)

PCT/US2017/026406 (WO2017189203A1, en), priority date 2016-04-27, filing date 2017-04-06: System and method for identifying and responding to passenger interest in autonomous vehicle events

Country Status (1)

WO: WO2017189203A1 (en)


Non-Patent Citations (3)

"Emotient Launches New Software Development Kit for Real-Time Emotion Recognition," PR Newswire, Dec. 12, 2013.
James A. Russell, "A Circumplex Model of Affect," Journal of Personality and Social Psychology, vol. 39, no. 6, 1980, pp. 1161-1178.
Olderbak et al., "Psychometric challenges and proposed solutions when scoring facial emotion expression codes," Behavioral Research Methods, vol. 46, no. 4, 2014, pp. 992-1006.

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019122993A1 (en) * 2017-12-18 2019-06-27 PlusAI Corp Method and system for adaptive motion planning based on passenger reaction to vehicle motion in autonomous driving vehicles
US10994741B2 (en) 2017-12-18 2021-05-04 Plusai Limited Method and system for human-like vehicle control prediction in autonomous driving vehicles
US11130497B2 (en) 2017-12-18 2021-09-28 Plusai Limited Method and system for ensemble vehicle control prediction in autonomous driving vehicles
US11273836B2 (en) 2017-12-18 2022-03-15 Plusai, Inc. Method and system for human-like driving lane planning in autonomous driving vehicles
US11299166B2 (en) 2017-12-18 2022-04-12 Plusai, Inc. Method and system for personalized driving lane planning in autonomous driving vehicles
US11643086B2 (en) 2017-12-18 2023-05-09 Plusai, Inc. Method and system for human-like vehicle control prediction in autonomous driving vehicles
US11650586B2 (en) 2017-12-18 2023-05-16 Plusai, Inc. Method and system for adaptive motion planning based on passenger reaction to vehicle motion in autonomous driving vehicles
US12060066B2 (en) 2017-12-18 2024-08-13 Plusai, Inc. Method and system for human-like driving lane planning in autonomous driving vehicles
US12071142B2 (en) 2017-12-18 2024-08-27 Plusai, Inc. Method and system for personalized driving lane planning in autonomous driving vehicles
US11919531B2 (en) * 2018-01-31 2024-03-05 Direct Current Capital LLC Method for customizing motion characteristics of an autonomous vehicle for a user
CN112141116A (en) * 2019-06-26 2020-12-29 现代自动车株式会社 Method and apparatus for controlling moving body using error monitoring
US11986309B2 (en) 2021-04-09 2024-05-21 Toyota Motor Engineering & Manufacturing North America, Inc. Passenger identification and profile mapping via brainwave monitoring


Legal Events

NENP: Non-entry into the national phase (Ref country code: DE)
121: Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17718457; Country of ref document: EP; Kind code of ref document: A1)
122: Ep: PCT application non-entry in European phase (Ref document number: 17718457; Country of ref document: EP; Kind code of ref document: A1)