WO2016115053A1 - Cognitive load driving assistant - Google Patents
- Publication number
- WO2016115053A1 (PCT/US2016/012908)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cognitive load
- driving
- driver
- cognitive
- current
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/11—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
- A61B3/112—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring diameter of pupils
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4058—Detecting, measuring or recording for evaluating the nervous system for evaluating the central nervous system
- A61B5/4064—Evaluating the brain
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6893—Cars
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
- B60K28/066—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver actuating a signalling device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/65—Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/85—Arrangements for transferring vehicle- or driver-related data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3605—Destination input or retrieval
- G01C21/3617—Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/096805—Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
- G08G1/096827—Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route where the route is computed onboard
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/096833—Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
- A61B5/02438—Measuring pulse rate or heart rate with portable devices, e.g. worn by the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/164—Infotainment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/166—Navigation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/167—Vehicle dynamics information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/168—Target or limit values
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/178—Warnings
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/195—Blocking or enabling display functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/197—Blocking or enabling of input functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/40—Hardware adaptations for dashboards or instruments
- B60K2360/48—Sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/589—Wireless data transfers
- B60K2360/5915—Inter vehicle communication
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/592—Data transfer involving external databases
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/21—Voice
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/22—Psychological state; Stress level or workload
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/096855—Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
Definitions
- the various embodiments relate generally to automotive systems and, more specifically, to a cognitive load driving assistant.
- drivers fail to focus an appropriate amount of attention on the task of driving. For example, drivers may not adjust their focus to adequately address complex driving situations attributable to traffic, the manner in which others are driving, pedestrians, road conditions, weather conditions, volume of traffic, and the like. Further, drivers typically engage in multiple, secondary in-vehicle activities that divert their attention from the primary task of driving. Such secondary in-vehicle activities may include listening to loud music, participating in conversations, texting, soothing a crying child, and so forth. "Distracted" driving attributable to complex driving situations and secondary in-vehicle activities increases the likelihood of collisions and accidents.
- a driver who is driving on a winding road at night while talking with a passenger and operating an entertainment system is more likely to become involved in an automobile accident than a driver who is focused solely on the task of driving along a straight road during the day.
- in recent years, the prevalence of in-vehicle technologies has increased.
- Some examples of prevalent in-vehicle technologies are navigation systems and entertainment systems.
- the main types of driver distractions are visual distractions, manual distractions, and cognitive distractions.
- texting is associated with a visual distraction that causes the driver to take his or her eyes off the road, a manual distraction that causes the driver to take his or her hands off the steering wheel, and a cognitive distraction that causes the driver to take his or her mind off the task of driving.
- One embodiment sets forth a computer-readable storage medium including instructions that, when executed by a processor, cause the processor to perform the steps of computing a current cognitive load associated with a driver while the driver is operating a vehicle based on data received via one or more sensors; determining that the current cognitive load exceeds a baseline cognitive load; and in response, causing one or more actions to occur that are intended to reduce the current cognitive load associated with the driver.
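The claimed steps can be sketched in code. In the following illustration, the particular sensor features (pupil diameter, heart rate, skin conductance, all of which appear in the classification codes above), the normalization ranges, and the equal weighting are assumptions for the sake of the sketch, not details taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """Physiological readings of the kind the classifications reference
    (A61B3/112 pupil diameter, A61B5/024 heart rate, A61B5/0531 skin impedance)."""
    pupil_diameter_mm: float
    heart_rate_bpm: float
    skin_conductance_us: float

def cognitive_load(sample: SensorSample) -> float:
    """Combine normalized sensor readings into a single load score in ~[0, 1].
    The normalization ranges and equal weights are illustrative only."""
    pupil = (sample.pupil_diameter_mm - 2.0) / 6.0   # ~2-8 mm typical range
    heart = (sample.heart_rate_bpm - 50.0) / 100.0   # ~50-150 bpm typical range
    skin = sample.skin_conductance_us / 20.0         # ~0-20 microsiemens
    return (pupil + heart + skin) / 3.0

def exceeds_baseline(sample: SensorSample, baseline: float) -> bool:
    """Return True when the current load exceeds the baseline cognitive load,
    i.e., when load-reducing actions should be triggered."""
    return cognitive_load(sample) > baseline
```

A calm sample (moderate pupil diameter, resting heart rate, low skin conductance) scores below a stressed sample, so only the latter trips the baseline comparison.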
- At least one advantage of the disclosed techniques is that conveying cognitive load levels to drivers and/or adjusting vehicle behavior based on cognitive load levels may increase driver safety.
- a cognitive load driving assistant may take action to reduce the complexity of the driving situation and/or the secondary tasks that the driver is performing, thereby allowing the driver to devote appropriate mental resources to the primary driving task.
- Figure 1 illustrates a passenger compartment of a vehicle that is configured to implement one or more aspects of the various embodiments
- Figure 2 is a more detailed illustration of the head unit of Figure 1, according to various embodiments
- Figure 3 is a more detailed illustration of the cognitive load driving assistant of Figure 2, according to various embodiments.
- Figure 4 illustrates the relationship between the current driving context and the current cognitive load of Figure 3, according to various embodiments
- Figure 5 is a flow diagram of method steps for managing cognitive load while driving, according to various embodiments.
- Figure 1 illustrates a passenger compartment 100 of a vehicle that is configured to implement one or more aspects of the various embodiments.
- the passenger compartment 100 includes, without limitation, a windshield 110 and a head unit 130 positioned proximate to a dashboard 120.
- the passenger compartment 100 may include any number of additional components that implement any technically feasible functionality.
- the passenger compartment 100 may include a rear- view camera.
- the head unit 130 is located in the center of the dashboard 120.
- the head unit 130 may be mounted at any location within the passenger compartment 100 in any technically feasible fashion that does not block the windshield 110.
- the head unit 130 may include any number and type of instrumentation and applications, and may provide any number of input and output mechanisms.
- the head unit 130 typically enables the driver and/or passengers to control entertainment functionality.
- the head unit 130 may include navigation functionality and/or advanced driver assistance functionality designed to increase driver safety, automate driving tasks, and the like.
- the head unit 130 may support any number of input and output data types and formats as known in the art.
- the head unit 130 may include built-in Bluetooth for hands-free calling and audio streaming, universal serial bus (USB) connections, speech recognition, rear-view camera inputs, video outputs for any number and type of displays, and any number of audio outputs.
- any number of sensors, displays, receivers, transmitters, etc. may be integrated into the head unit 130 or may be implemented externally to the head unit 130. External devices may communicate with the head unit 130 in any technically feasible fashion.
- While driving, the driver of the vehicle is exposed to a variety of stimuli that are related to either the primary driving task or any number of secondary tasks. For example, and without limitation, the driver could see lane markers 142, a pedestrian 144, a cyclist 146, and a police car 148 via the windshield 110. In response, the driver could steer the vehicle to track the lane markers 142 while avoiding the pedestrian 144 and the cyclist 146, and apply the brake pedal to allow the police car 148 to cross the road in front of the vehicle. Further, and without limitation, the driver could concurrently participate in a conversation 152, listen to music 154, and attempt to soothe a crying baby 156.
- the head unit 130 includes functionality to enable the driver to efficiently perform both the primary driving task and certain secondary tasks as well as functionality designed to increase driver safety while performing such tasks.
- Figure 2 is a more detailed illustration of the head unit 130 of Figure 1, according to various embodiments.
- the head unit 130 includes, without limitation, a processor 270 and a system memory 240.
- the processor 270 and the system memory 240 may be implemented in any technically feasible fashion.
- any combination of the processor 270 and the system memory 240 may be implemented as a stand-alone chip or as part of a more comprehensive solution that is implemented as an application-specific integrated circuit (ASIC) or a system-on-a-chip (SoC).
- the processor 270 generally comprises a programmable processor that executes program instructions to manipulate input data.
- the processor 270 may include any number of processing cores, memories, and other modules for facilitating program execution.
- the processor 270 may receive input from drivers and/or passengers of the vehicle via any number of user input devices 212 and generate pixels for display on the display device 214.
- the user input devices 212 may include various types of input devices, such as buttons, a microphone, cameras, a touch-based input device integrated with a display device 214 (i.e., a touch screen), and other input devices for providing input data to the head unit 130.
- the system memory 240 generally comprises storage chips such as random access memory (RAM) chips that store application programs and data for processing by the processor 270.
- the system memory 240 includes non-volatile memory such as optical drives, magnetic drives, flash drives, or other storage.
- a storage 220 may supplement or replace the system memory 240.
- the storage 220 may include any number and type of external memories that are accessible to the processor 270.
- the storage 220 may include a Secure Digital Card, an external Flash memory, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
- the system memory 240 includes, without limitation, an entertainment subsystem 244, a navigation subsystem 246, and an advanced driver assistance system (ADAS) 250.
- the entertainment subsystem 244 includes software that controls any number and type of entertainment components, such as an AM/FM radio, a satellite radio, an audio and video computer files player (e.g., MP3 audio files player), an optical media player (e.g., compact disc (CD) player), and so forth.
- any number of entertainment components may be included in the head unit 130 and any number of entertainment components may be implemented as stand-alone devices.
- the navigation subsystem 246 includes any number and type of applications that enable a driver to efficiently navigate the vehicle.
- the navigation subsystem 246 may include maps, direction routing software, and the like.
- the ADAS 250 includes functionality that is designed to increase driver safety and/or automate driving tasks.
- the ADAS 250 may provide hill descent control, automatic parking, and the like.
- the functionality included in the ADAS 250 may supplement, enhance, and/or automate functionality provided by other components included in the vehicle to decrease the likelihood of accidents or collisions in challenging conditions and/or driving scenarios.
- the ADAS 250 includes, without limitation, a cognitive load driving assistant 260.
- the cognitive load driving assistant 260 continually estimates the current cognitive load of the driver and determines whether the current cognitive load indicates an abnormally stressful driving environment and/or an abnormal number of distractions. If the cognitive load driving assistant 260 determines that the current cognitive load indicates an abnormally stressful driving environment and/or an abnormal number of distractions, then the cognitive load driving assistant 260 attempts to indirectly or directly modify the driving environment to reduce the cognitive load of the driver. For example, and without limitation, the cognitive load driving assistant 260 could notify the driver of an atypically high cognitive load and suggest alternate driving routes that are less congested than the current driving route.
- the cognitive load driving assistant 260 may process any type of input data and implement any technically feasible algorithm to estimate current cognitive load and/or determine whether the current cognitive load negatively impacts the driver's ability to safely operate the vehicle.
- the head unit 130, including the cognitive load driving assistant 260, receives data via any number of driver-facing sensors 232 and non-driver-facing sensors 234.
- the driver-facing sensors 232 may include devices capable of detecting and relaying physiological data associated with the driver. More specifically, the driver-facing sensors 232 may measure physiological changes in the body that are related to cognitive load.
- the non-driver-facing sensors 234 may include any devices capable of detecting and relaying data that does not reflect the physiology of the driver but is related to the driving environment.
- the driver-facing sensors 232 and the non-driver-facing sensors 234 may include any type of sensors designed to measure any characteristic and may be implemented in any technically feasible fashion.
- the driver-facing sensors 232 may, without limitation, track specific features of the driver, such as hands, fingers, head, eye gaze, feet, facial expression, voice tone, and the like.
- the driver-facing sensors 232 could include sensors that measure brain activity, heart rate, skin conductance, steering-wheel grip force, muscle activity, skin/body temperature, and so forth.
- the driver-facing sensors 232 may include, without limitation, microphones that detect conversational context, conversational turn taking, voice tone and affect, other auditory distractions, and the like.
- the driver-facing sensors 232 could detect that the driver is engaged in conversation with a passenger, the driver is currently speaking, the driver's voice tone indicates that the driver is drowsy, and two other passengers are engaged in a second conversation.
- the driver-facing sensors 232 may include visual imagers that detect head position and orientation, facial features, hands movements, etc.
- the driver-facing sensors 232 may include depth sensors that detect finger and hand gestures, body posture, and so forth, and/or eye gaze and pupil size tracking sensors.
- the non-driver-facing sensors 234 may track any features of the vehicle and/or environment surrounding the vehicle that are relevant to the driver.
- the non-driver-facing sensors 234 may track vehicle control elements, such as the position of the throttle, the position of the clutch, gear selection, the location of the brake pedal, the angle of the steering wheel, and so forth.
- the non-driver-facing sensors 234 may include any number of sensors for tracking vehicle speed, position, orientation, and dynamics, such as inertial and magnetic sensors. Further, the non-driver-facing sensors 234 may include devices that detect and/or track stationary and/or moving objects surrounding the vehicle.
- Such detection sensors may include, without limitation, a front-mounted visible light imager, an infrared imager, a radio detection and ranging (RADAR) sensor, a light detection and ranging (LIDAR) sensor, a dedicated short range communication (DSRC) sensor, thermal and motion sensors, depth sensors, sonar and acoustic sensors, and the like.
- the non-driver-facing sensors 234 may include remote sensors that provide information regarding local weather, traffic, etc.
- the driver-facing sensors 232 and the non-driver-facing sensors 234 may be deployed in any technically feasible fashion.
- the driver-facing sensors 232 and the non-driver-facing sensors 234 may include any number and combination of vehicle-integrated sensors, vehicle-integrated imagers, wearable devices (affixed to or worn by the driver), and remote sensors.
- the driver-facing sensors 232 could include steering wheel-mounted sensors that measure heart rate, skin conductance, and grip force.
- the non-driver-facing sensors 234 could include a front-mounted visible light imager, an infrared imager, and a LIDAR sensor.
- the cognitive load driving assistant 260 may receive additional input data, referred to herein as advanced driver assistance system (ADAS) data.
- ADAS data may include, without limitation, data received from a global navigation satellite system (GNSS) receiver 236, data received from the navigation subsystem 246, and data received from the entertainment subsystem 244.
- the global navigation satellite system (GNSS) receiver 236 determines global position of the vehicle.
- the GNSS receiver 236 operates based on the global positioning system (GPS) of man-made Earth satellites, various electromagnetic spectrum signals (such as cellular tower signals, wireless internet signals, and the like), other signals or measurements, or any combination of the above.
- the cognitive load driving assistant 260 accesses global positioning data from the GNSS receiver 236 in order to determine a current location of the vehicle. Further, in some embodiments, the cognitive load driving assistant 260 accesses data provided by the navigation subsystem 246 in order to determine a likely future location of the vehicle. In some embodiments, the cognitive load driving assistant 260 accesses data provided by the entertainment subsystem 244 to assess the impact of secondary tasks, such as listening to music, on the cognitive load of the driver.
- In yet other embodiments, the cognitive load driving assistant 260 may receive and transmit additional ADAS data including, without limitation, automotive vehicle-to-everything (V2X) data 238.
- the vehicle-to-everything (V2X) data 238 may include vehicle-to-vehicle (V2V) data, vehicle-to-infrastructure (V2I) data, and so forth.
- the V2X data 238 enables the vehicle to communicate with other objects that include V2X capabilities. For example, the vehicle may communicate with other vehicles, smartphones, traffic lights, laptops, road-side V2X units, and so forth.
- After receiving the input data, the cognitive load driving assistant 260 computes any number of cognitive metrics that relate to the current cognitive load of the driver. Subsequently, the cognitive load driving assistant 260 determines whether the cognitive metrics indicate that the driver may be unable to devote a typical and/or safe amount of mental resources to the primary task of driving. In general, the cognitive load driving assistant 260 may compute any number of cognitive metrics and assess whether the cognitive metrics indicate an elevated current cognitive load in any technically feasible fashion. For example, and without limitation, for a subset of the driver-facing sensors 232, the cognitive load driving assistant 260 could compute a current value for a cognitive metric and compare the current value to historical values for the cognitive metric. Substantially in parallel, for each of the remaining driver-facing sensors 232, the cognitive load driving assistant 260 could compare current sensor data to historical sensor data. The cognitive load driving assistant 260 could then determine whether the results of the various comparisons indicate an elevated current cognitive load.
- the cognitive load driving assistant 260 could compute a weighted average of the deviations of the values of any number of cognitive metrics and any number of driver-facing sensors 232 from historical values to determine an average deviation. If the average deviation exceeds a certain preset limit, then the cognitive load driving assistant 260 could determine that the current cognitive load is elevated. In another example, the cognitive load driving assistant 260 could compare the value of a primary cognitive load metric to historical values of the primary cognitive load metric to determine whether the current cognitive load may be elevated. Additionally, the cognitive load driving assistant 260 could compare the values of any number of driver-facing sensors 232 to historical values to provide a confidence measurement.
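- The weighted-average deviation check described above can be sketched as follows; the metric names, weights, and the preset limit are illustrative assumptions, not values specified by this disclosure.

```python
# Hypothetical sketch of the deviation-based elevation check: compare
# current metric values to their historical means and flag an elevated
# cognitive load when the weighted average relative deviation exceeds
# a preset limit. All names, weights, and thresholds are illustrative.

def is_load_elevated(current, historical_means, weights, limit=0.25):
    """Return True when the weighted average relative deviation of the
    current metric values from their historical means exceeds `limit`."""
    total_weight = sum(weights[name] for name in current)
    weighted_dev = sum(
        weights[name] * abs(current[name] - historical_means[name])
        / max(abs(historical_means[name]), 1e-9)
        for name in current
    )
    return (weighted_dev / total_weight) > limit

# Example: pupil diameter and heart rate both well above their norms.
current = {"pupil_mm": 6.1, "heart_rate_bpm": 96.0}
means = {"pupil_mm": 4.0, "heart_rate_bpm": 70.0}
weights = {"pupil_mm": 0.6, "heart_rate_bpm": 0.4}
print(is_load_elevated(current, means, weights))  # True
```

When the current values match the historical means, the deviation is zero and the check returns False, matching the "no elevation" branch of the comparison described above.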
- the cognitive load driving assistant 260 may compute a current cognitive load based on any number, including one, of cognitive metrics and sensor data. Further, the cognitive load driving assistant 260 may determine historical values for cognitive metrics, cognitive loads, and/or sensor data in any technically feasible fashion. For example, and without limitation, in some embodiments the cognitive load driving assistant 260 may store the current cognitive load and other relevant data, referred to herein as a "driving context" in any available memory (e.g., the system memory 240). The driving context may include any number and type of data such as time of day, the location of the vehicle, detailed sensor readings, and so forth.
- the cognitive load driving assistant 260 may retrieve previously stored cognitive loads and driving contexts to determine historical cognitive loads at any level of situational granularity. For example and without limitation, in some embodiments, the cognitive load driving assistant 260 may compute an average cognitive load based on all historical cognitive loads. In other embodiments, and without limitation, the cognitive load driving assistant 260 may compute an average cognitive load based on the historical cognitive loads in similar driving contexts (e.g., the same time of day and/or location).
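- The context-filtered averaging described above (historical cognitive loads at the same time of day and/or location) might look like the following sketch; the record layout, the crude distance test, and the radius are hypothetical.

```python
# Illustrative sketch of averaging historical cognitive loads recorded
# in "similar" driving contexts: same hour of day and nearby location.
# The record structure and the flat-earth distance test are assumptions.

def baseline_for_context(history, hour, location, radius_km=2.0):
    """Average the historical loads recorded at the same hour of day
    and within `radius_km` of `location` (lat/lon in degrees)."""
    def close(a, b):
        # Crude flat-earth distance, adequate for a few-kilometer radius.
        dlat = (a[0] - b[0]) * 111.0
        dlon = (a[1] - b[1]) * 111.0
        return (dlat * dlat + dlon * dlon) ** 0.5 <= radius_km

    similar = [r["load"] for r in history
               if r["hour"] == hour and close(r["location"], location)]
    return sum(similar) / len(similar) if similar else None

history = [
    {"load": 0.40, "hour": 8, "location": (37.77, -122.41)},
    {"load": 0.60, "hour": 8, "location": (37.77, -122.42)},
    {"load": 0.90, "hour": 17, "location": (37.77, -122.41)},  # other hour
]
print(baseline_for_context(history, 8, (37.77, -122.41)))  # 0.5
```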
- the cognitive load driving assistant 260 may transmit and/or receive cognitive loads and, optionally, driving contexts to and from a cognitive load database 282 that is included in a cloud 280 (e.g., encapsulated shared resources, software, data, etc.).
- the cognitive load driving assistant 260 and other cognitive load driving assistants included in other vehicles may then retrieve information from the cognitive load database 282.
- the cognitive load driving assistant 260 may analyze such data as part of evaluating the current cognitive load, detecting situations that involve high cognitive loads, and so forth.
- the cognitive load driving assistant 260 may transmit and/or receive cognitive loads and, optionally, driving contexts with other cognitive load driving assistants 260 as V2X data 238.
- the cognitive load driving assistant 260 may be configured to transmit and store data relevant to the cognitive load of the driver in any technically feasible fashion.
- the cognitive load driving assistant 260 may be configured to receive and process data relevant to the cognitive loads of other drivers as well as any additional factors that may influence the cognitive load of the other drivers in any technically feasible fashion.
- the cognitive load driving assistant 260 may perform any number of actions designed to increase the safety of the driver.
- relevant data may include, without limitation, the current location of the vehicle, the time of day, data provided by the navigation subsystem 246 and the entertainment subsystem 244, cognitive loads of drivers along the planned driving route, and so forth.
- the actions may directly or indirectly modify the driving task and any secondary tasks that may distract the driver.
- the cognitive load driving assistant 260 could provide feedback to the driver via the display device 214.
- the feedback could include the current cognitive load, historical cognitive loads, and suggestions for reducing the complexity of the primary driving task, such as easier (less congested) driving routes or lanes.
- the cognitive load driving assistant 260 may reduce human machine interface (HMI) complexity to reduce distractions.
- the cognitive load driving assistant 260 could block incoming cellular phone calls, lower the volume of music, block non-critical alerts (e.g., low windshield washer fluid alert, etc.), and the like.
- the cognitive load driving assistant 260 may perform actions designed to preemptively increase driving safety. For example, and without limitation, suppose that the cognitive load driving assistant 260 detects elevated cognitive loads associated with other drivers in the proximity of the vehicle or along the driving route specified by the navigation subsystem 246. To increase the vigilance of the driver, the cognitive load driving assistant 260 may alert the driver to expect potentially hazardous situations (e.g., accidents, dangerous curves, etc.) and/or distracted drivers.
- the cognitive load driving assistant 260 may work in conjunction with the navigation subsystem 246 and/or other elements included in the ADAS 250 to increase driving safety based on one or more predictive heuristics.
- the cognitive load driving assistant 260 could configure the navigation subsystem 246 to avoid locations associated with elevated cognitive loads. For example, and without limitation, if elevated historical cognitive loads are associated with a particular exit to an airport, then the cognitive load driving assistant 260 could configure the navigation subsystem 246 to preferentially select an alternative exit to the airport.
- Upon detecting elevated cognitive loads of the driver or nearby drivers, the cognitive load driving assistant 260 could modify one or more ADAS parameters to increase the conservatism of the ADAS 250. For example, and without limitation, the cognitive load driving assistant 260 could configure preemptive braking to activate at an earlier time or could decrease the baseline at which the ADAS 250 notifies the driver of a lane departure from the current driving lane.
- the cognitive load driving assistant 260 may configure the vehicle to provide feedback to the driver in any technically feasible fashion.
- the cognitive load driving assistant 260 may configure the vehicle to provide any combination of visual feedback, auditory feedback, haptic vibrational feedback, tactile feedback, force feedback, proprioceptive sensory feedback, and so forth.
- the cognitive load driving assistant 260 may configure any features of the vehicle in any technically feasible fashion.
- the cognitive load driving assistant 260 may configure the entertainment subsystem 244, the navigation subsystem 246, applications included in the ADAS 250, and any control mechanisms provided by the vehicle via any number of control signals or via any type of interface.
- the cognitive load driving assistant 260 receives cognitive load data and/or related data from other vehicles (e.g., via the cognitive load database 282, the V2X data 238, etc.).
- the cognitive load driving assistant 260 may leverage such shared data in any technically feasible fashion to optimize driving safety either at the current time or at a future time.
- the cognitive load driving assistant 260 could compare the current cognitive load to a baseline cognitive load based on collective cognitive loads of many drivers normalized for time, location, and other factors.
- the cognitive load driving assistant 260 attempts to maintain the current cognitive load below the threshold represented by the baseline cognitive load.
- the cognitive load driving assistant 260 may examine the average cognitive load of drivers in close proximity to the vehicle or along a driving route associated with the vehicle to detect a preponderance of elevated cognitive loads that indicates a complex situation, such as an accident. Upon detecting such an area of elevated cognitive loads, the cognitive load driving assistant 260 may generate a sensory warning designed to cause the driver to become more vigilant, generate a new driving route that avoids areas of elevated cognitive load, and so forth. In yet another example, and without limitation, the cognitive load driving assistant 260 may generate a "heat map" based on collective cognitive loads. The cognitive load driving assistant 260 may then suggest altering the driving environment based on the heat map. In particular, the cognitive load driving assistant 260 may recommend lane changes to lanes associated with lower cognitive loads, interact with the navigation subsystem 246 to optimize the driving route, and the like.
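- The heat-map-driven lane recommendation could be sketched as below; the report format and lane identifiers are illustrative, not part of this disclosure.

```python
# A minimal sketch of the "heat map" idea: bucket cognitive loads
# shared by nearby vehicles per lane and recommend the lane with the
# lowest average load. Lane ids and load values are illustrative.
from collections import defaultdict

def recommend_lane(reports):
    """reports: iterable of (lane_id, cognitive_load) pairs shared by
    nearby vehicles. Returns the lane with the lowest mean load."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lane, load in reports:
        sums[lane] += load
        counts[lane] += 1
    return min(sums, key=lambda lane: sums[lane] / counts[lane])

reports = [(1, 0.8), (1, 0.7), (2, 0.3), (2, 0.4), (3, 0.6)]
print(recommend_lane(reports))  # 2
```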
- the cognitive load driving assistant 260 may be configured to process any type of input data and/or compute any number of metrics related to cognitive load. Further, the cognitive load driving assistant 260 may be configured to increase driving safety and/or improve the driving experience based on the processed data and metrics in any technically feasible fashion.
- the cognitive load driving assistant 260 is described in the context of the head unit 130 herein, the functionality included in cognitive load driving assistant 260 may be implemented in any technically feasible fashion and in any combination of software and hardware.
- each of the processor 270 and the system memory 240 may be embedded in or mounted on a laptop, a tablet, a smartphone, a smartwatch, a smart wearable, or the like that implements the cognitive load driving assistant 260.
- the cognitive load driving assistant 260 may be implemented as a stand-alone unit that supplements the functionality of existing vehicle safety systems. Such a stand-alone unit may be implemented as a software application that executes on any processor.
- FIG. 3 is a more detailed illustration of the cognitive load driving assistant 260 of Figure 2, according to various embodiments.
- the cognitive load driving assistant 260 includes, without limitation, a pupillometry engine 320, a body state engine 330, a cognitive load analyzer 340, a current driving context 370, and a cognitive load feedback engine 380.
- any number of components may provide the functionality included in the cognitive load driving assistant 260 and each of the components may be implemented in software, hardware, or any combination of software and hardware.
- the pupillometry engine 320 receives pupil data from a pupil sensor 302 that measures the sizes of the driver's pupils via eye tracking tools.
- the pupillometry engine 320 computes a pupil-based metric that reflects the cognitive load of the driver.
- the pupillometry engine 320 may compute the pupil-based metric in any technically feasible fashion.
- the pupillometry engine 320 may analyze the pupil data to identify specific rapid changes in pupil size that are associated with increased cognitive load.
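- One possible way to detect the rapid pupil-size changes mentioned above is a simple rate-of-change detector; the sampling interval and rate threshold are assumptions for illustration.

```python
# Hypothetical detector for rapid pupil-size changes: flag sample
# pairs whose rate of change exceeds a threshold. The 0.1 s sampling
# interval and 2.0 mm/s threshold are illustrative assumptions.

def rapid_dilation_events(diameters_mm, dt_s=0.1, rate_mm_per_s=2.0):
    """Count consecutive-sample pairs whose rate of pupil-size change
    exceeds `rate_mm_per_s`."""
    events = 0
    for prev, cur in zip(diameters_mm, diameters_mm[1:]):
        if abs(cur - prev) / dt_s > rate_mm_per_s:
            events += 1
    return events

samples = [4.0, 4.05, 4.4, 4.45, 4.5]  # one jump of 0.35 mm in 0.1 s
print(rapid_dilation_events(samples))  # 1
```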
- the body state engine 330 receives sensor data from a heart rate sensor 304, a galvanic skin response (GSR) sensor 306, and a blood pressure (BP) sensor 308. Based on the sensor data, the body state engine 330 computes a body-based metric that reflects the cognitive load of the driver. The body state engine 330 may compute the body-based metric in any technically feasible fashion. For example, and without limitation, the body state engine 330 may evaluate the heart rate in conjunction with the skin conductance to determine a level of psychophysiological arousal. Further, the body state engine 330 may evaluate the BP to estimate an amount of blood flow in the front part of the brain.
- the body state engine 330 may evaluate any type of sensor data in any combination to compute any number of metrics that reflect the cognitive load of the driver.
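- A body-based metric of the kind described could, for example, normalize each signal against an assumed resting value and average the excesses; the resting values and the simple averaging are illustrative assumptions, not part of this disclosure.

```python
# Sketch of one possible body-based metric: normalize heart rate,
# skin conductance, and blood pressure against assumed resting values
# and average the normalized excesses into a rough 0..1 arousal score.

def body_based_metric(hr_bpm, gsr_microsiemens, bp_systolic,
                      rest=(65.0, 2.0, 115.0)):
    """Return a 0..1 arousal score from three physiological signals;
    `rest` holds assumed resting (heart rate, GSR, systolic BP)."""
    rest_hr, rest_gsr, rest_bp = rest
    excess = [
        max(0.0, (hr_bpm - rest_hr) / rest_hr),
        max(0.0, (gsr_microsiemens - rest_gsr) / rest_gsr),
        max(0.0, (bp_systolic - rest_bp) / rest_bp),
    ]
    return min(1.0, sum(excess) / len(excess))

print(round(body_based_metric(91.0, 3.0, 126.5), 2))  # 0.33
```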
- the cognitive load analyzer 340 receives the pupil-based metric and the body-based metric and computes a current cognitive load 350 that approximates the cognitive load of the driver.
- the cognitive load analyzer 340 may compute the current cognitive load 350 in any technically feasible fashion.
- the cognitive load analyzer 340 may compute the current cognitive load 350 as a weighted average of the pupil-based metric and the body-based metric.
- the cognitive load analyzer 340 may perform any number of comparison operations between the current value of any number of metrics and any number and type of corresponding baseline values to determine the current cognitive load 350.
- the cognitive load analyzer 340 may determine that the value of a particular metric is erroneous based on the values of other metrics. In some embodiments, the cognitive load analyzer 340 may compute the current cognitive load 350 based on a subset of metrics and compute a confidence value based on a different subset of metrics.
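- The combination of the two metrics and the subset-based confidence value might be sketched as follows; the weights and the agreement measure are hypothetical.

```python
# Illustrative combination step: the current cognitive load as a
# weighted average of the pupil-based and body-based metrics, plus a
# confidence value from how well a second subset of metrics agrees.
# The 0.7 weight and the agreement measure are assumptions.

def combine_metrics(pupil_metric, body_metric, w_pupil=0.7):
    """Weighted average of the two primary metrics (both in 0..1)."""
    return w_pupil * pupil_metric + (1.0 - w_pupil) * body_metric

def confidence(primary_load, secondary_metrics):
    """1.0 when the secondary metrics agree with the primary load;
    lower as their average diverges from it."""
    avg = sum(secondary_metrics) / len(secondary_metrics)
    return max(0.0, 1.0 - abs(primary_load - avg))

load = combine_metrics(0.8, 0.6)      # ~0.74
conf = confidence(load, [0.7, 0.75])  # ~0.985
print(round(load, 2), round(conf, 3))  # 0.74 0.985
```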
- In addition to evaluating data received via the driver-facing sensors 232, the cognitive load driving assistant 260 also generates a current driving context 370 that includes data received via the non-driver-facing sensors 234, data received via the GNSS receiver 236, and the V2X data 238.
- the current driving context 370 describes the current driving environment.
- the current driving context 370 includes, without limitation, driving task parameters 372, secondary task parameters 378, vehicle parameters 374, and environmental parameters 376.
- the driving task parameters 372 directly influence a driving task load that represents the mental resources required to perform the primary driving task.
- the secondary task parameters 378 directly influence a secondary task load that represents the mental resources required to perform secondary tasks, such as operating the entertainment subsystem 244 or talking on a cellular phone.
- the vehicle parameters 374 and the environmental parameters 376 reflect circumstances that impact the mental resources required to perform the driving task and/or the secondary tasks.
- the vehicle parameters 374 and the environmental parameters 376 could include the location of the vehicle, the condition of the road, the weather, the lighting conditions, and so forth.
- the cognitive load feedback engine 380 receives the current cognitive load 350 and the current driving context 370 and generates, without limitation, feedback signals 388, driving adjustment signals 382, entertainment subsystem adjustment signals 384, and navigation subsystem adjustment signals 386.
- the cognitive load feedback engine 380 evaluates the current cognitive load 350 relative to a baseline cognitive load to determine whether the current cognitive load 350 is elevated.
- the cognitive load feedback engine 380 may determine the baseline cognitive load in any technically feasible fashion.
- the baseline cognitive load could be a predetermined constant value.
- the cognitive load feedback engine 380 may dynamically compute the baseline cognitive load based on any number and type of historical data associated with any number of drivers and any number of driving contexts.
- the cognitive load feedback engine 380 may endeavor to reduce the current cognitive load 350.
- the cognitive load feedback engine 380 may examine the current driving context 370 to determine how to optimize the driving environment to reduce the driving task load and/or the secondary tasks loads.
- the cognitive load feedback engine 380 may generate any number of control signals in any technically feasible fashion that is consistent with the capabilities and interfaces implemented in the vehicle. Such control signals may provide, without limitation, any combination of visual feedback, auditory feedback, haptic vibrational feedback, tactile feedback, force feedback, proprioceptive sensory feedback, and so forth.
- the cognitive load feedback engine 380 could transmit the feedback signals 388 that configure the display device 214 to provide visual feedback regarding the current cognitive load 350, historical cognitive loads, and recommendations for reducing the driving task and/or secondary task loads. If the vehicle is equipped with advanced driving features, then the cognitive load feedback engine 380 could increase the conservatism of the vehicle via the driving adjustment signals 382, such as decreasing a baseline at which the ADAS 250 notifies the driver of a lane departure. In some embodiments, the cognitive load feedback engine 380 may configure the entertainment subsystem 244 via the entertainment subsystem adjustment signals 384 to reduce distractions associated with an in-vehicle audio system.
- the cognitive load feedback engine 380 may configure the navigation subsystem 246 via the navigation subsystem adjustment signals 386 to replace a current driving route with a new driving route that is less congested, thereby lowering the mental resources required to perform the primary driving task.
- Figure 4 illustrates the relationship between the current driving context 370 and the current cognitive load 350 of Figure 3, according to various embodiments.
- the current driving context 370 includes the driving task parameters 372, the secondary task parameters 378, the vehicle parameters 374, and the environmental parameters 376.
- the driving task parameters 372 directly influence a driving task load 450 that represents the mental resources required to perform the primary driving task.
- the secondary task parameters 378 directly influence a secondary task load 460 that represents the mental resources required to perform secondary tasks, such as talking on a cell phone.
- the driving task parameters 372, secondary task parameters 378, vehicle parameters 374, and environmental parameters 376 contribute to the current cognitive load 350.
- the current cognitive load 350 increases (depicted as an increasing cognitive load 472) within an overall cognitive load 470.
- the overall cognitive load 470 represents the total cognitive load of the driver and, within the overall cognitive load 470, a baseline cognitive load 474 reflects the typical cognitive loads of the driver.
- the cognitive load feedback engine 380 analyzes the current driving context 370 and transmits the navigation subsystem adjustment signal 386 "reroute via less congested roads" to the navigation subsystem 246, and the entertainment subsystem adjustment signal 384 "mute the audio system" to the entertainment subsystem 244. Subsequently, as a result of the reduction in the driving task load 450 and the secondary task load 460 attributable to, respectively, the navigation subsystem adjustment signal 386 and the entertainment subsystem adjustment signal 384, the current cognitive load 350 decreases and no longer exceeds the baseline cognitive load 474.
- the cognitive load feedback engine 380 attempts to adjust the current driving context 370 to either directly or indirectly reduce the current cognitive load 350. Accordingly, the level of driver distraction is reduced and the safety of the driver and surrounding drivers is increased.
- FIG. 5 is a flow diagram of method steps for managing cognitive load while driving, according to various embodiments. Although the method steps are described in conjunction with the systems of Figures 1-4, persons skilled in the art will understand that any system configured to implement the method steps, in any order, falls within the scope of the various embodiments.
- a method 500 begins at step 504, where the cognitive load driving assistant 260 included in a vehicle receives sensor data via the driver-facing sensors 232 and the non-driver-facing sensors 234.
- the driver-facing sensors 232 may include any number of sensors that monitor characteristics of the driver.
- the driver-facing sensors 232 may include the pupil sensor 302, the heart rate sensor 304, the galvanic skin response (GSR) sensor 306, the blood pressure (BP) sensor 308, and the like.
- the non-driver-facing sensors 234 monitor data that is not directly related to the driver, such as environmental data and vehicle data.
- the cognitive load driving assistant 260 computes the current cognitive load 350 based on the driver-facing sensor data.
- the cognitive load driving assistant 260 computes the current driving context 370 based on the non-driver-facing sensor data in conjunction with other relevant environmental and vehicle data.
- the additional data may include any type of data received in any technically feasible fashion.
- the additional data could include a location of the vehicle based on data received via the GNSS receiver 236 and locations of other vehicles based on V2X data 238.
- the cognitive load driving assistant 260 typically performs steps 506 and 508 substantially in parallel.
- the cognitive load driving assistant 260 transmits the current cognitive load 350 and the current driving context 370 to the cognitive load database 282 included in the cloud 280. Sharing cognitive data in this manner enables other cognitive load driving assistants 260 included in other vehicles to alert other drivers when the current cognitive load 350 indicates that the driver of the vehicle may pose a safety risk.
- the cognitive load feedback engine 380 computes the baseline cognitive load 474 based on historical cognitive load data in conjunction with historical driving contexts.
- the historical cognitive load data and the historical driving contexts may be stored in any memory, in any technically feasible fashion, and include any amount of data associated with any number of drivers.
- the historical cognitive load data could be stored in the cognitive load database 282 and include data for many drivers.
- the cognitive load feedback engine 380 may compute the baseline cognitive load 474 in any technically feasible fashion.
- the cognitive load feedback engine 380 could compute the baseline cognitive load 474 as the average of all historical cognitive loads associated with the driver.
- the cognitive load feedback engine 380 compares the current cognitive load 350 to the baseline cognitive load 474.
- If, at step 514, the cognitive load feedback engine 380 determines that the current cognitive load 350 is not greater than the baseline cognitive load 474, then the method 500 returns to step 504, where the cognitive load driving assistant 260 receives new sensor data. If, however, at step 514, the cognitive load feedback engine 380 determines that the current cognitive load 350 is greater than the baseline cognitive load 474, then the method 500 proceeds directly to step 516.
- the cognitive load feedback engine 380 provides feedback to the driver indicating the elevated current cognitive load 350.
- the cognitive load feedback engine 380 may provide the feedback in any technically feasible fashion and may include any additional data for reference.
- the cognitive load feedback engine 380 could display an "elevated cognitive load" warning via the dashboard-mounted display device 214.
- the warning could include the current cognitive load 350 and an indication of how the current cognitive load 350 relates to the baseline cognitive load 474.
- the cognitive load feedback engine 380 could audibly warn the driver that the current cognitive load 350 indicates a dangerous driving situation.
- the cognitive load feedback engine 380 performs corrective actions designed to reduce the driving task load 450 and/or the secondary task load 460 based on the current driving context 370 and/or the historical driving contexts. For example, and without limitation, the cognitive load feedback engine 380 could determine that the current driving route is challenging and, in response, interact with the navigation subsystem 246 to suggest a less congested route for the vehicle. In another example, and without limitation, the cognitive load feedback engine 380 could determine that the number of secondary tasks that the driver is performing significantly exceeds the number of secondary tasks that the driver typically performs and, in response, interact with the entertainment subsystem 244 to mute the speakers.
- the method 500 then returns to step 504 where the cognitive load driving assistant 260 receives new sensor data.
- the cognitive load driving assistant 260 continues to cycle through steps 504-518, assessing the current cognitive load 350 to detect and attempt to minimize situations associated with elevated cognitive loads until the vehicle or the cognitive load driving assistant 260 is turned off.
- a cognitive driving assistant analyzes driver-facing sensor data and provides feedback regarding elevated driver cognitive loads to enable drivers to recognize and react to dangerous driving environments.
- the cognitive driving assistant processes driver-facing sensor data to compute a current cognitive load.
- the cognitive driving assistant processes non-driver-facing sensor data along with other relevant data, such as GNSS data, to generate a current driving context.
- the current driving context includes driving parameters, vehicle parameters, environmental parameters, and secondary task parameters.
- a cognitive load feedback engine analyzes the current cognitive load of the driver with respect to historical cognitive loads of the driver in similar driving contexts. For example, if a current time included in the current driving context indicates night time lighting conditions, then the cognitive load feedback engine could compare the current cognitive load of the driver to historical cognitive loads in other driving contexts that indicate night time lighting conditions. If the cognitive load feedback engine determines that the current cognitive load is greater than the "baseline" cognitive load in similar driving contexts, then the cognitive load feedback engine initiates corrective action.
- the corrective action may include any type of passive feedback, such as an audible warning, or any type of active control, such as disabling a ringer of a cellular phone.
- the cognitive load feedback engine transmits the current cognitive load and/or the current driving context to a cognitive load database stored in a public cloud.
- Sharing this data enables cognitive load feedback engines operating in other vehicles to preemptively identify dangerous driving situations. For example, if the current cognitive load of the driver is elevated, then a cognitive load feedback engine in a second vehicle located in the immediate vicinity of the vehicle could notify the driver of the second vehicle that a distracted driver is nearby.
- At least one advantage of the disclosed approach is that because the cognitive load feedback engine enables drivers to adjust driving and/or secondary task behavior based on cognitive loads, driver safety may be increased.
- aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module," or "system."
- aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
Abstract
In one embodiment, a cognitive load driving assistant increases driving safety based on cognitive loads. In operation, the cognitive load driving assistant computes a current cognitive load of a driver based on sensor data. If the current cognitive load exceeds a threshold cognitive load, then the cognitive load driving assistant modifies the driving environment to reduce the cognitive load required to perform the primary driving task and/or secondary task(s), such as texting via a cellular phone. The cognitive load driving assistant may modify the driving environment indirectly via sensory feedback to the driver or directly through reducing the complexity of the primary driving task and/or secondary tasks. In particular, if the driver is exhibiting elevated cognitive loads typically associated with distracted driving, then the cognitive load driving assistant modifies the driving environment to allow the driver to devote appropriate mental resources to the primary driving task, thereby increasing driving safety.
Description
COGNITIVE LOAD DRIVING ASSISTANT
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the priority benefit of the United States Provisional Patent Application having Serial Number 62/102,434 (Attorney Docket Number HRMN/0148USL) and filed on January 12, 2015. The subject matter of this related application is hereby incorporated herein by reference.
BACKGROUND
Field of the Various Embodiments
[0002] The various embodiments relate generally to automotive systems and, more specifically, to a cognitive load driving assistant.
Description of the Related Art
[0003] Oftentimes, drivers fail to focus an appropriate amount of attention on the task of driving. For example, drivers may not adjust their focus to adequately address complex driving situations attributable to traffic, the manner in which others are driving, pedestrians, road conditions, weather conditions, volume of traffic, and the like. Further, drivers typically engage in multiple, secondary in-vehicle activities that divert their attention from the primary task of driving. Such secondary in-vehicle activities may include listening to loud music, participating in conversations, texting, soothing a crying child, and so forth.
[0004] "Distracted" driving attributable to complex driving situations and secondary in-vehicle activities increases the likelihood of collisions and accidents. For example, a driver who is driving on a winding road at night while talking with a passenger and operating an entertainment system is more likely to become involved in an automobile accident than a driver who is focused solely on the task of driving along a straight road during the day. Moreover, because using in-vehicle technologies while driving has become widespread, the frequency of injuries from accidents caused by distracted driving has increased. Some examples of prevalent in-vehicle technologies are navigation systems and entertainment systems.
[0005] In general, the three primary types of driver distractions are visual distractions, manual distractions, and cognitive distractions. Many adverse driving conditions and in-vehicle activities lead to multiple types of driver distractions. For example, texting is associated with a visual distraction that causes the driver to take his or her eyes off the road, a manual
distraction that causes the driver to take his or her hands off the steering wheel, and a cognitive distraction that causes the driver to take his or her mind off the task of driving.
[0006] Because the impact of cognitive distractions on a driver is more difficult to assess than the impact of visual distractions and manual distractions, most drivers are oblivious to the amount of mental resources required to perform activities and tasks. As a result, drivers typically fail to modify the driving environment or their actions to reduce their cognitive load when their level of driver distraction becomes dangerously high.
[0007] As the foregoing illustrates, more effective techniques that enable drivers to better understand their levels of cognitive load while driving would be useful.
SUMMARY
[0008] One embodiment sets forth a computer-readable storage medium including instructions that, when executed by a processor, cause the processor to perform the steps of computing a current cognitive load associated with a driver while the driver is operating a vehicle based on data received via one or more sensors; determining that the current cognitive load exceeds a baseline cognitive load; and in response, causing one or more actions to occur that are intended to reduce the current cognitive load associated with the driver.
[0009] Further embodiments provide, among other things, a method and a system configured to implement the computer-readable storage medium set forth above.
[0010] At least one advantage of the disclosed techniques is that conveying cognitive load levels to drivers and/or adjusting vehicle behavior based on cognitive load levels may increase driver safety. In particular, if the driver is exhibiting elevated cognitive loads typically associated with distracted driving, then a cognitive load driving assistant may take action to reduce the complexity of the driving situation and/or the secondary tasks that the driver is performing, thereby allowing the driver to devote appropriate mental resources to the primary driving task.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] So that the manner in which the above recited features of the various embodiments can be understood in detail, a more particular description, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments and are
therefore not to be considered limiting in scope, for the various embodiments may admit to other equally effective embodiments.
[0012] Figure 1 illustrates a passenger compartment of a vehicle that is configured to implement one or more aspects of the various embodiments;
[0013] Figure 2 is a more detailed illustration of the head unit of Figure 1, according to various embodiments;
[0014] Figure 3 is a more detailed illustration of the cognitive load driving assistant of Figure 2, according to various embodiments;
[0015] Figure 4 illustrates the relationship between the current driving context and the current cognitive load of Figure 3, according to various embodiments;
[0016] Figure 5 is a flow diagram of method steps for managing cognitive load while driving, according to various embodiments.
DETAILED DESCRIPTION
[0017] In the following description, numerous specific details are set forth to provide a more thorough understanding of the various embodiments. However, it will be apparent to one of skill in the art that the various embodiments may be practiced without one or more of these specific details.
Vehicle Overview
[0018] Figure 1 illustrates a passenger compartment 100 of a vehicle that is configured to implement one or more aspects of the various embodiments. As shown, the passenger compartment 100 includes, without limitation, a windshield 110 and a head unit 130 positioned proximate to a dashboard 120. In various embodiments, the passenger compartment 100 may include any number of additional components that implement any technically feasible functionality. For example and without limitation, in some embodiments the passenger compartment 100 may include a rear-view camera.
[0019] As shown, the head unit 130 is located in the center of the dashboard 120. In various embodiments, the head unit 130 may be mounted at any location within the passenger compartment 100 in any technically feasible fashion that does not block the windshield 110. The head unit 130 may include any number and type of instrumentation and applications, and may provide any number of input and output mechanisms. For example, and without
limitation, the head unit 130 typically enables the driver and/or passengers to control entertainment functionality. In some embodiments, the head unit 130 may include navigation functionality and/or advanced driver assistance functionality designed to increase driver safety, automate driving tasks, and the like. [0020] The head unit 130 may support any number of input and output data types and formats as known in the art. For example, and without limitation, in some embodiments, the head unit 130 may include built-in Bluetooth for hands-free calling and audio streaming, universal serial bus (USB) connections, speech recognition, rear-view camera inputs, video outputs for any number and type of displays, and any number of audio outputs. In general, any number of sensors, displays, receivers, transmitters, etc. may be integrated into the head unit 130 or may be implemented externally to the head unit 130. External devices may communicate with the head unit 130 in any technically feasible fashion.
[0021] While driving, the driver of the vehicle is exposed to a variety of stimuli that are related to the primary driving task and/or any number of secondary tasks. For example, and without limitation, the driver could see lane markers 142, a pedestrian 144, a cyclist 146, and a police car 148 via the windshield 110. In response, the driver could steer the vehicle to track the lane markers 142 while avoiding the pedestrian 144 and the cyclist 146 and apply the brake pedal to allow the police car 148 to cross the road in front of the vehicle. Further, and without limitation, the driver could concurrently participate in a conversation 152, listen to music 154, and attempt to soothe a crying baby 156. Challenging driving environments and secondary activities typically increase the cognitive load of the driver and may contribute to an unsafe driving environment for the driver and for objects (other vehicles, the pedestrian 144, etc.) in the proximity of the vehicle. In general, the head unit 130 includes functionality to enable the driver to efficiently perform both the primary driving task and certain secondary tasks as well as functionality designed to increase driver safety while performing such tasks.
[0022] Figure 2 is a more detailed illustration of the head unit 130 of Figure 1, according to various embodiments. As shown, the head unit 130 includes, without limitation, a processor 270 and a system memory 240. The processor 270 and the system memory 240 may be implemented in any technically feasible fashion. For example, and without limitation, in various embodiments, any combination of the processor 270 and the system memory 240 may be implemented as a stand-alone chip or as part of a more comprehensive solution that is implemented as an application-specific integrated circuit (ASIC) or a system-on-a-chip (SoC).
[0023] The processor 270 generally comprises a programmable processor that executes program instructions to manipulate input data. The processor 270 may include any number of processing cores, memories, and other modules for facilitating program execution. The processor 270 may receive input from drivers and/or passengers of the vehicle via any number of user input devices 212 and generate pixels for display on the display device 214. The user input devices 212 may include various types of input devices, such as buttons, a microphone, cameras, a touch-based input device integrated with a display device 214 (i.e., a touch screen), and other input devices for providing input data to the head unit 130.
[0024] The system memory 240 generally comprises storage chips such as random access memory (RAM) chips that store application programs and data for processing by the processor 270. In various embodiments, the system memory 240 includes non-volatile memory such as optical drives, magnetic drives, flash drives, or other storage. In some embodiments, a storage 220 may supplement or replace the system memory 240. The storage 220 may include any number and type of external memories that are accessible to the processor 270. For example, and without limitation, the storage 220 may include a Secure Digital Card, an external Flash memory, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
[0025] As shown, the system memory 240 includes, without limitation, an entertainment subsystem 244, a navigation subsystem 246, and an advanced driver assistance system (ADAS) 250. The entertainment subsystem 244 includes software that controls any number and type of entertainment components, such as an AM/FM radio, a satellite radio, an audio and video computer files player (e.g., MP3 audio files player), an optical media player (e.g., compact disc (CD) player), and so forth. In some embodiments, any number of entertainment components may be included in the head unit 130 and any number of entertainment components may be implemented as stand-alone devices. The navigation subsystem 246 includes any number and type of applications that enable a driver to efficiently navigate the vehicle. For example, the navigation subsystem 246 may include maps, direction routing software, and the like.
[0026] The ADAS 250 includes functionality that is designed to increase driver safety and/or automate driving tasks. For example, and without limitation, in various embodiments, the ADAS 250 may provide hill descent control, automatic parking, and the like. Notably, the functionality included in the ADAS 250 may supplement, enhance, and/or automate
functionality provided by other components included in the vehicle to decrease the likelihood of accidents or collisions in challenging conditions and/or driving scenarios.
[0027] In general, challenging driving environments and distractions may strain the ability of the driver to devote adequate attention to the primary driving task. For example, suppose that the driver is driving the vehicle during low light conditions along a congested, winding road while texting on a cell phone. In such a scenario, the driver may not devote enough mental resources to the primary driving task to operate the vehicle in a safe manner. However, many drivers do not recognize when their cognitive loads increase past a comfortable level and they begin to exhibit unsafe driving behaviors associated with distracted driving. For this reason, the ADAS 250 includes, without limitation, a cognitive load driving assistant 260.
Dynamically Modifying the Driving Environment Based on Cognitive Load
[0028] In general, the cognitive load driving assistant 260 continually estimates the current cognitive load of the driver and determines whether the current cognitive load indicates an abnormally stressful driving environment and/or an abnormal number of distractions. If the cognitive load driving assistant 260 determines that the current cognitive load indicates an abnormally stressful driving environment and/or an abnormal number of distractions, then the cognitive load driving assistant 260 attempts to indirectly or directly modify the driving environment to reduce the cognitive load of the driver. For example, and without limitation, the cognitive load driving assistant 260 could notify the driver of an atypically high cognitive load and suggest alternate driving routes that are less congested than the current driving route.
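The control flow described above (estimate the current load, compare it to a baseline, act when elevated) can be sketched as follows. This is a minimal illustration only: the sensor names, the normalization of readings to [0, 1], and the 15% margin are assumptions, none of which are prescribed by the disclosure.

```python
# Illustrative sketch only: sensor names, the normalization of readings to
# [0, 1], and the 15% relative margin are assumptions, not part of the
# disclosed implementation.
from statistics import mean

def estimate_cognitive_load(samples):
    """Collapse normalized driver-facing sensor readings into one score."""
    return mean(samples.values())

def monitoring_step(samples, history, margin=0.15):
    """Compare the current load against a running baseline; warn if elevated."""
    current = estimate_cognitive_load(samples)
    baseline = mean(history) if history else current
    history.append(current)
    if current > baseline * (1.0 + margin):
        return "elevated cognitive load: consider reducing secondary tasks"
    return None

history = []
print(monitoring_step({"pupil": 0.4, "heart_rate": 0.5}, history))  # None
print(monitoring_step({"pupil": 0.9, "heart_rate": 0.8}, history))  # warning string
```

A production assistant would fuse many more signals and persist the history, but the shape of the loop matches the behavior described in paragraph [0028].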
[0029] The cognitive load driving assistant 260 may process any type of input data and implement any technically feasible algorithm to estimate current cognitive load and/or determine whether the current cognitive load negatively impacts the driver's ability to safely operate the vehicle. As shown, and without limitation, the head unit 130, including the cognitive load driving assistant 260, receives data via any number of driver-facing sensors 232 and non-driver-facing sensors 234. The driver-facing sensors 232 may include devices capable of detecting and relaying physiological data associated with the driver. More specifically, the driver-facing sensors 232 may measure physiological changes in the body related to cognitive load. In a complementary fashion, the non-driver-facing sensors 234 may include any devices capable of detecting and relaying data that does not reflect the physiology of the driver but is related to the driving environment.
[0030] In general, the driver-facing sensors 232 and the non-driver-facing sensors 234 may include any type of sensors designed to measure any characteristic and may be implemented in any technically feasible fashion. In particular, the driver-facing sensors 232 may, without limitation, track specific features of the driver, such as hands, fingers, head, eye gaze, feet, facial expression, voice tone, and the like. For example, and without limitation, the driver-facing sensors 232 could include sensors that measure brain activity, heart rate, skin conductance, steering-wheel grip force, muscle activity, skin/body temperature, and so forth. Further, the driver-facing sensors 232 may include, without limitation, microphones that detect conversational context, conversational turn taking, voice tone and affect, other auditory distractions, and the like. For example, and without limitation, the driver-facing sensors 232 could detect that the driver is engaged in conversation with a passenger, the driver is currently speaking, the driver's voice tone indicates that the driver is drowsy, and two other passengers are engaged in a second conversation. In some embodiments, and without limitation, the driver-facing sensors 232 may include visual imagers that detect head position and orientation, facial features, hand movements, etc. In some embodiments, and without limitation, the driver-facing sensors 232 may include depth sensors that detect finger and hand gestures, body posture, and so forth, and/or eye gaze and pupil size tracking sensors.
[0031] In a complementary fashion, and without limitation, the non-driver-facing sensors 234 may track any features of the vehicle and/or environment surrounding the vehicle that are relevant to the driver. For example, and without limitation, the non-driver-facing sensors 234 may track vehicle control elements, such as the position of the throttle, the position of the clutch, gear selection, the location of the brake pedal, the angle of the steering wheel, and so forth. The non-driver-facing sensors 234 may include any number of sensors for tracking vehicle speed, position, orientation, and dynamics, such as inertial and magnetic sensors. Further, the non-driver-facing sensors 234 may include devices that detect and/or track stationary and/or moving objects surrounding the vehicle. Such detection sensors may include, without limitation, a front-mounted visible light imager, an infrared imager, a radio detection and ranging (RADAR) sensor, a light detection and ranging (LIDAR) sensor, a dedicated short range communication (DSRC) sensor, thermal and motion sensors, depth sensors, sonar and acoustic sensors, and the like. In some embodiments, and without limitation, the non-driver-facing sensors 234 may include remote sensors that provide information regarding local weather, traffic, etc.
[0032] The driver-facing sensors 232 and the non-driver-facing sensors 234 may be deployed in any technically feasible fashion. For example, and without limitation, the driver-facing sensors 232 and the non-driver-facing sensors 234 may include any number and combination of vehicle-integrated sensors, vehicle-integrated imagers, wearable devices (affixed to or worn by the driver), and remote sensors. In one example, and without limitation, the driver-facing sensors 232 could include steering wheel-mounted sensors that measure heart rate, skin conductance, and grip force, while the non-driver-facing sensors 234 could include a front-mounted visible light imager, an infrared imager, and a LIDAR sensor.
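As one illustration of how the heterogeneous readings described above might be grouped before processing, the sketch below bundles driver-facing and non-driver-facing readings into a single synchronized snapshot. The field names and units are hypothetical; the disclosure does not prescribe a data layout.

```python
# Hypothetical data layout; field names and units are assumptions.
from dataclasses import dataclass, field
import time

@dataclass
class DriverFacingSample:
    """Physiological readings from driver-facing sensors."""
    heart_rate_bpm: float
    skin_conductance_us: float
    grip_force_n: float
    pupil_diameter_mm: float

@dataclass
class VehicleSample:
    """Vehicle-control readings from non-driver-facing sensors."""
    speed_mps: float
    steering_angle_deg: float
    throttle_pct: float

@dataclass
class SensorFrame:
    """One synchronized snapshot of driver and vehicle state."""
    driver: DriverFacingSample
    vehicle: VehicleSample
    timestamp: float = field(default_factory=time.time)

frame = SensorFrame(
    DriverFacingSample(heart_rate_bpm=72.0, skin_conductance_us=4.1,
                       grip_force_n=18.5, pupil_diameter_mm=3.2),
    VehicleSample(speed_mps=22.0, steering_angle_deg=-3.5, throttle_pct=14.0),
)
```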
[0033] In some embodiments, the cognitive load driving assistant 260 may receive additional input data, referred to herein as advanced driver assistance system (ADAS) data. Such ADAS data may include, without limitation, data received from a global navigation satellite system (GNSS) receiver 236, data received from the navigation subsystem 246, and data received from the entertainment subsystem 244. The global navigation satellite system (GNSS) receiver 236 determines the global position of the vehicle. The GNSS receiver 236 operates based on one or more of the global positioning system of manmade Earth satellites, various electromagnetic spectrum signals (such as cellular tower signals, wireless internet signals, and the like), or other signals or measurements, and/or on a combination of the above items. In various embodiments, the cognitive load driving assistant 260 accesses global positioning data from the GNSS receiver 236 in order to determine a current location of the vehicle. Further, in some embodiments, the cognitive load driving assistant 260 accesses data provided by the navigation subsystem 246 in order to determine a likely future location of the vehicle. In some embodiments, the cognitive load driving assistant 260 accesses data provided by the entertainment subsystem 244 to assess the impact of secondary tasks, such as listening to music, on the cognitive load of the driver.
[0034] In yet other embodiments, the cognitive load driving assistant 260 may receive and transmit additional ADAS data including, without limitation, automotive vehicle-to-everything (V2X) data 238. The vehicle-to-everything (V2X) data 238 may include vehicle-to-vehicle (V2V) data, vehicle-to-infrastructure (V2I) data, and so forth. The V2X data 238 enables the vehicle to communicate with other objects that include V2X capabilities. For example, the vehicle may communicate with other vehicles, smartphones, traffic lights, laptops, road-side V2X units, and so forth.
[0035] After receiving the input data, the cognitive load driving assistant 260 computes any number of cognitive metrics that relate to the current cognitive load of the driver.
Subsequently, the cognitive load driving assistant 260 determines whether the cognitive metrics indicate that the driver may be unable to devote a typical and/or safe amount of mental resources to the primary task of driving. In general, the cognitive load driving assistant 260 may compute any number of cognitive metrics and assess whether the cognitive metrics indicate an elevated current cognitive load in any technically feasible fashion. For example, and without limitation, for a subset of the driver-facing sensors 232, the cognitive load driving assistant 260 could compute a current value for a cognitive metric and compare the current value to historical values for the cognitive metric. Substantially in parallel, for each of the remaining driver-facing sensors 232, the cognitive load driving assistant 260 could compare current sensor data to historical sensor data. The cognitive load driving assistant 260 could then determine whether the results of the various comparisons indicate an elevated current cognitive load.
[0036] For example, and without limitation, the cognitive load driving assistant 260 could compute a weighted average of the deviations of the values of any number of cognitive metrics and any number of driver-facing sensors 232 from historical values to determine an average deviation. If the average deviation exceeds a certain preset limit, then the cognitive load driving assistant 260 could determine that the current cognitive load is elevated. In another example, the cognitive load driving assistant 260 could compare the value of a primary cognitive load metric to historical values of the primary cognitive load metric to determine whether the current cognitive load may be elevated. Additionally, the cognitive load driving assistant 260 could compare the values of any number of driver-facing sensors 232 to historical values to provide a confidence measurement.
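The weighted-average-of-deviations scheme described in this paragraph might look like the following sketch. The metric names, weights, and the 0.2 preset limit are placeholders chosen for illustration; the disclosure leaves all of these open.

```python
# Hypothetical metrics, weights, and preset limit; all are illustrative.
def average_deviation(current, historical_means, weights):
    """Weighted mean of each metric's relative deviation from its history."""
    total = sum(
        weights[k] * abs(current[k] - historical_means[k]) / historical_means[k]
        for k in current
    )
    return total / sum(weights[k] for k in current)

def is_elevated(current, historical_means, weights, limit=0.2):
    """True when the average deviation exceeds the preset limit."""
    return average_deviation(current, historical_means, weights) > limit

historical = {"heart_rate": 70.0, "skin_conductance": 4.0}
weights = {"heart_rate": 2.0, "skin_conductance": 1.0}
print(is_elevated({"heart_rate": 95.0, "skin_conductance": 6.0},
                  historical, weights))  # True
```

Normalizing each deviation by its historical mean keeps metrics with different scales (beats per minute versus microsiemens) comparable before weighting.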
[0037] In general, the cognitive load driving assistant 260 may compute a current cognitive load based on any number, including one, of cognitive metrics and sensor data. Further, the cognitive load driving assistant 260 may determine historical values for cognitive metrics, cognitive loads, and/or sensor data in any technically feasible fashion. For example, and without limitation, in some embodiments the cognitive load driving assistant 260 may store the current cognitive load and other relevant data, referred to herein as a "driving context" in any available memory (e.g., the system memory 240). The driving context may include any number and type of data such as time of day, the location of the vehicle, detailed sensor readings, and so forth. Subsequently, the cognitive load driving assistant 260 may retrieve previously stored cognitive loads and driving contexts to determine historical cognitive loads at any level of situational granularity. For example and without limitation, in some
embodiments, the cognitive load driving assistant 260 may compute an average cognitive load based on all historical cognitive loads. In other embodiments, and without limitation, the cognitive load driving assistant 260 may compute an average cognitive load based on the historical cognitive loads in similar driving contexts (e.g., the same time of day and/or location).
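One way to realize the context-keyed history described above is a small store indexed by driving context. This is a minimal sketch under assumed conventions: the choice of (hour, road segment) as the context key and the averaging rule are invented for illustration.

```python
# Illustrative sketch of storing cognitive loads alongside driving contexts
# and averaging over "similar" contexts. The (hour, segment) keying scheme
# is an assumption, not part of the original disclosure.
from collections import defaultdict

class CognitiveLoadHistory:
    def __init__(self):
        # Maps a context key (hour, segment) to the loads recorded there.
        self._records = defaultdict(list)

    def store(self, load, hour, segment):
        self._records[(hour, segment)].append(load)

    def average(self, hour=None, segment=None):
        """Average load over all contexts, or only over matching ones."""
        loads = [load
                 for (h, s), values in self._records.items()
                 if (hour is None or h == hour) and (segment is None or s == segment)
                 for load in values]
        return sum(loads) / len(loads) if loads else None
```

Calling `average()` with no arguments corresponds to the "all historical cognitive loads" embodiment, while passing an hour and segment corresponds to the "similar driving contexts" embodiment.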
[0038] In some embodiments and without limitation, the cognitive load driving assistant 260 may transmit and/or receive cognitive loads and, optionally, driving contexts to a cognitive load database 282 that is included in a cloud 280 (e.g., encapsulated shared resources, software, data, etc.). The cognitive load driving assistant 260 and other cognitive load driving assistants included in other vehicles may then retrieve information from the cognitive load database 282. The cognitive load driving assistant 260 may analyze such data as part of evaluating the current cognitive load, detecting situations that involve high cognitive loads, and so forth.
[0039] In some embodiments, the cognitive load driving assistant 260 may transmit and/or receive cognitive loads and, optionally, driving contexts with other cognitive load driving assistants 260 as V2X data 238. In general, the cognitive load driving assistant 260 may be configured to transmit and store data relevant to the cognitive load of the driver in any technically feasible fashion. Similarly, the cognitive load driving assistant 260 may be configured to receive and process data relevant to the cognitive loads of other drivers as well as any additional factors that may influence the cognitive load of the other drivers in any technically feasible fashion.
[0040] After determining the current cognitive load of the driver and assessing other relevant data, the cognitive load driving assistant 260 may perform any number of actions designed to increase the safety of the driver. As previously detailed, such relevant data may include, without limitation, the current location of the vehicle, the time of day, data provided by the navigation subsystem 246 and the entertainment subsystem 244, cognitive loads of drivers along the planned driving route, and so forth. The actions may directly or indirectly modify the driving task and any secondary tasks that may distract the driver.
[0041] For example, and without limitation, the cognitive load driving assistant 260 could provide feedback to the driver via the display device 214. The feedback could include the current cognitive load, historical cognitive loads, and suggestions for reducing the complexity of the primary driving task, such as easier (less congested) driving routes or lanes. In some
embodiments, and without limitation, the cognitive load driving assistant 260 may reduce human machine interface (HMI) complexity to reduce distractions. For example, and without limitation, the cognitive load driving assistant 260 could block incoming cellular phone calls, lower the volume of music, block non-critical alerts (e.g., low windshield washer fluid alert, etc.), and the like.
[0042] In some embodiments, the cognitive load driving assistant 260 may perform actions designed to preemptively increase driving safety. For example, and without limitation, suppose that the cognitive load driving assistant 260 detects elevated cognitive loads associated with other drivers in the proximity of the vehicle or along the driving route specified by the navigation subsystem 246. To increase the vigilance of the driver, the cognitive load driving assistant 260 may alert the driver to expect potentially hazardous situations (e.g., accidents, dangerous curves, etc.) and/or distracted drivers.
[0043] In some embodiments and without limitation, the cognitive load driving assistant 260 may work in conjunction with the navigation subsystem 246 and/or other elements included in the ADAS 250 to increase driving safety based on one or more predictive heuristics. In some embodiments, the cognitive load driving assistant 260 could configure the navigation subsystem 246 to avoid locations associated with elevated cognitive loads. For example, and without limitation, if elevated historical cognitive loads are associated with a particular exit to an airport, then the cognitive load driving assistant 260 could configure the navigation subsystem 246 to preferentially select an alternative exit to the airport. In other embodiments, upon detecting elevated cognitive loads of the driver or nearby drivers, the cognitive load driving assistant 260 could modify one or more ADAS parameters to increase the conservatism of the ADAS 250. For example, and without limitation, the cognitive load driving assistant 260 could configure preemptive braking to activate at an earlier time or could decrease the baseline at which the ADAS 250 notifies the driver of a lane departure from the current driving lane.
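The ADAS-conservatism adjustment described above might be expressed as a simple parameter rescaling. The parameter names (`braking_ttc_s`, `lane_departure_m`) and the scaling factor are hypothetical stand-ins invented for this sketch; a real ADAS 250 would expose its own interface.

```python
# Hypothetical sketch of tightening ADAS parameters when elevated cognitive
# loads are detected. Parameter names and the factor are invented for
# illustration and do not correspond to any real ADAS interface.
def tighten_adas(params, factor=1.25):
    """Return a more conservative copy of the ADAS parameter set."""
    adjusted = dict(params)
    # Brake earlier: trigger preemptive braking at a larger time-to-collision.
    adjusted["braking_ttc_s"] = params["braking_ttc_s"] * factor
    # Warn sooner: tolerate a smaller drift from the center of the lane.
    adjusted["lane_departure_m"] = params["lane_departure_m"] / factor
    return adjusted
```

Returning a copy rather than mutating the input lets the assistant restore the original, less conservative parameters once the elevated cognitive load subsides.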
[0044] The cognitive load driving assistant 260 may configure the vehicle to provide feedback to the driver in any technically feasible fashion. For example, and without limitation, the cognitive load driving assistant 260 may configure the vehicle to provide any combination of visual feedback, auditory feedback, haptic vibrational feedback, tactile feedback, force feedback, proprioceptive sensory feedback, and so forth. Further, the cognitive load driving assistant 260 may configure any features of the vehicle in any technically feasible fashion. For example, the cognitive load driving assistant 260 may
configure the entertainment subsystem 244, the navigation subsystem 246, applications included in the ADAS 250, and any control mechanisms provided by the vehicle via any number of control signals or via any type of interface.
[0045] As described above, in some embodiments the cognitive load driving assistant 260 receives cognitive load data and/or related data from other vehicles (e.g., via the cognitive load database 282, the V2X data 238, etc.). In operation, the cognitive load driving assistant 260 may leverage such shared data in any technically feasible fashion to optimize driving safety either at the current time or at a future time. For example, and without limitation, instead of comparing the current cognitive load to a personalized average cognitive load, the cognitive load driving assistant 260 could compare the current cognitive load to a baseline cognitive load based on collective cognitive loads of many drivers normalized for time, location, and other factors. In general, the cognitive load driving assistant 260 attempts to maintain the current cognitive load below the threshold represented by the baseline cognitive load.
[0046] In another example, and without limitation, the cognitive load driving assistant 260 may examine the average cognitive load of drivers in close proximity to the vehicle or along a driving route associated with the vehicle to detect a preponderance of elevated cognitive loads that indicates a complex situation, such as an accident. Upon detecting such an area of elevated cognitive loads, the cognitive load driving assistant 260 may generate a sensory warning designed to cause the driver to become more vigilant, generate a new driving route that avoids areas of elevated cognitive load, and so forth. In yet another example, and without limitation, the cognitive load driving assistant 260 may generate a "heat map" based on collective cognitive loads. The cognitive load driving assistant 260 may then suggest altering the driving environment based on the heat map. In particular, the cognitive load driving assistant 260 may recommend lane changes to lanes associated with lower cognitive loads; interact with the navigation subsystem 246 to optimize the driving route, and the like.
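Detecting a "preponderance of elevated cognitive loads" along a route, as described above, could be sketched as follows. The route-segment keying and the majority-fraction rule are assumptions made for illustration.

```python
# Illustrative sketch of flagging route segments where most drivers report
# elevated cognitive loads (e.g., near an accident). The segment keys and
# the fraction threshold are assumptions, not part of the disclosure.
def elevated_segments(route, shared_loads, baseline, fraction=0.5):
    """Return the route segments where at least `fraction` of reported loads exceed the baseline."""
    flagged = []
    for segment in route:
        loads = shared_loads.get(segment, [])
        if loads and sum(load > baseline for load in loads) / len(loads) >= fraction:
            flagged.append(segment)
    return flagged
```

The flagged segments could then drive the behaviors described above: a sensory warning when the vehicle approaches one, or a rerouting request to the navigation subsystem 246 that avoids them.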
[0047] In general, the cognitive load driving assistant 260 may be configured to process any type of input data and/or compute any number of metrics related to cognitive load. Further, the cognitive load driving assistant 260 may be configured to increase driving safety and/or improve the driving experience based on the processed data and metrics in any technically feasible fashion. Although the cognitive load driving assistant 260 is described in the context of the head unit 130 herein, the functionality included in cognitive load driving assistant 260 may be implemented in any technically feasible fashion and in any combination of software and hardware. For example, and without limitation, each of the processor 270 and the system
memory 240 may be embedded in or mounted on a laptop, a tablet, a smartphone, a smartwatch, a smart wearable, or the like that implements the cognitive load driving assistant 260. In other embodiments, and without limitation, the cognitive load driving assistant 260 may be implemented as a stand-alone unit that supplements the functionality of existing vehicle safety systems. Such a stand-alone unit may be implemented as a software application that executes on any processor.
[0048] Figure 3 is a more detailed illustration of the cognitive load driving assistant 260 of Figure 2, according to various embodiments. As shown, the cognitive load driving assistant 260 includes, without limitation, a pupillometry engine 320, a body state engine 330, a cognitive load analyzer 340, a current driving context 370, and a cognitive load feedback engine 380. In alternate embodiments and without limitation, any number of components may provide the functionality included in the cognitive load driving assistant 260, and each of the components may be implemented in software, hardware, or any combination of software and hardware. [0049] In operation, the pupillometry engine 320 receives pupil data from a pupil sensor 302 that measures the sizes of the driver's pupils via eye tracking tools. Based on the pupil data, the pupillometry engine 320 computes a pupil-based metric that reflects the cognitive load of the driver. The pupillometry engine 320 may compute the pupil-based metric in any technically feasible fashion. For example, and without limitation, the pupillometry engine 320 may analyze the pupil data to identify specific rapid changes in pupil size that are associated with increased cognitive load.
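One minimal way to identify "rapid changes in pupil size," as the paragraph above describes, is to score the fraction of samples whose rate of change exceeds a threshold. The sampling interval and the rate threshold below are assumed values chosen only for illustration.

```python
# Minimal sketch of a pupil-based metric: the fraction of successive pupil
# diameter samples (in mm) whose rate of change is "rapid". The sampling
# interval dt and the mm/s threshold are illustrative assumptions.
def pupil_metric(samples, dt=0.1, rate_threshold=0.5):
    """Fraction of sample-to-sample transitions exceeding the rate threshold."""
    if len(samples) < 2:
        return 0.0
    rapid = sum(abs(b - a) / dt > rate_threshold
                for a, b in zip(samples, samples[1:]))
    return rapid / (len(samples) - 1)
```

A higher value indicates more frequent rapid dilations or constrictions, which the pupillometry engine 320 would treat as evidence of increased cognitive load.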
[0050] Operating substantially in parallel to the pupillometry engine 320, the body state engine 330 receives sensor data from a heart rate sensor 304, a galvanic skin response (GSR) sensor 306, and a blood pressure (BP) sensor 308. Based on the sensor data, the body state engine 330 computes a body-based metric that reflects the cognitive load of the driver. The body state engine 330 may compute the body-based metric in any technically feasible fashion. For example, and without limitation, the body state engine 330 may evaluate the heart rate in conjunction with the galvanic skin response to determine a level of psychophysiological arousal. Further, the body state engine 330 may evaluate the BP to estimate an amount of blood flow in the front part of the brain. In general, the body state engine 330 may evaluate any type of sensor data in any combination to compute any number of metrics that reflect the cognitive load of the driver.
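A body-based metric fusing the three signals above might normalize each signal's elevation above a resting value and average the results. The resting values, normalization spans, and equal weighting below are illustrative assumptions, not physiological constants from the disclosure.

```python
# Hedged sketch of a body-based metric combining heart rate (bpm), galvanic
# skin response (microsiemens), and systolic blood pressure (mmHg). The
# resting baselines, spans, and equal weights are illustrative assumptions.
def body_metric(heart_rate, gsr, systolic_bp,
                rest=(60.0, 2.0, 115.0), span=(60.0, 10.0, 40.0)):
    """Average of each signal's normalized elevation above its resting value."""
    readings = (heart_rate, gsr, systolic_bp)
    scores = [max(0.0, (reading - base) / width)
              for reading, base, width in zip(readings, rest, span)]
    return sum(scores) / len(scores)
```

At the assumed resting values the metric is 0.0; readings one full span above rest on every channel yield 1.0, giving the cognitive load analyzer 340 a roughly unit-scaled input.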
[0051] As shown, the cognitive load analyzer 340 receives the pupil-based metric and the body-based metric and computes a current cognitive load 350 that approximates the cognitive load of the driver. The cognitive load analyzer 340 may compute the current cognitive load 350 in any technically feasible fashion. For example, and without limitation, the cognitive load analyzer 340 may compute the current cognitive load 350 as a weighted average of the pupil-based metric and the body-based metric. In various embodiments, the cognitive load analyzer 340 may perform any number of comparison operations between the current value of any number of metrics and any number and type of corresponding baseline values to determine the current cognitive load 350. Further, the cognitive load analyzer 340 may determine that the value of a particular metric is erroneous based on the values of other metrics. In some embodiments, the cognitive load analyzer 340 may compute the current cognitive load 350 based on a subset of metrics and compute a confidence value based on a different subset of metrics.
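The weighted-average fusion with a confidence value, as described above, could be sketched as follows. The weights and the agreement-based confidence rule are assumptions made for illustration.

```python
# Illustrative fusion of the pupil-based and body-based metrics into a
# current cognitive load plus a confidence value. The weights and the
# disagreement-based confidence rule are assumptions.
def current_cognitive_load(pupil_metric, body_metric, w_pupil=0.6, w_body=0.4):
    """Return (load, confidence); confidence drops as the two metrics disagree."""
    load = w_pupil * pupil_metric + w_body * body_metric
    confidence = 1.0 - min(1.0, abs(pupil_metric - body_metric))
    return load, confidence
```

A low confidence value is one way to operationalize the paragraph's observation that one metric's value may be judged erroneous in light of the others.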
[0052] While the cognitive load driving assistant 260 evaluates data received via the driver-facing sensors 232, the cognitive load driving assistant 260 also generates a current driving context 370 that includes data received via the non-driver-facing sensors 234, data received via the GNSS receiver 236, and the V2X data 238. The current driving context 370 describes the current driving environment. As shown, the current driving context 370 includes, without limitation, driving task parameters 372, secondary task parameters 378, vehicle parameters 374, and environmental parameters 376. In general, the driving task parameters 372 directly influence a driving task load that represents the mental resources required to perform the primary driving task. By contrast, the secondary task parameters 378 directly influence a secondary task load that represents the mental resources required to perform secondary tasks, such as operating the entertainment subsystem 244 or talking on a cellular phone. The vehicle parameters 374 and the environmental parameters 376 reflect circumstances that impact the mental resources required to perform the driving task and/or the secondary tasks. For example, and without limitation, the vehicle parameters 374 and the environmental parameters 376 could include the location of the vehicle, the condition of the road, the weather, the lighting conditions, and so forth. [0053] As shown, the cognitive load feedback engine 380 receives the current cognitive load 350 and the current driving context 370 and generates, without limitation, feedback signals 388, driving adjustment signals 382, entertainment subsystem adjustment signals 384, and navigation subsystem adjustment signals 386. In operation, the cognitive load feedback
engine 380 evaluates the current cognitive load 350 relative to a baseline cognitive load to determine whether the current cognitive load 350 is elevated. The cognitive load feedback engine 380 may determine the baseline cognitive load in any technically feasible fashion. For example, and without limitation, the baseline cognitive load could be a predetermined constant value. In some embodiments, the cognitive load feedback engine 380 may dynamically compute the baseline cognitive load based on any number and type of historical data associated with any number of drivers and any number of driving contexts.
[0054] If the cognitive load feedback engine 380 determines that the current cognitive load 350 is elevated relative to the baseline cognitive load, then the cognitive load feedback engine 380 may endeavor to reduce the current cognitive load 350. Notably, the cognitive load feedback engine 380 may examine the current driving context 370 to determine how to optimize the driving environment to reduce the driving task load and/or the secondary tasks loads. In general, the cognitive load feedback engine 380 may generate any number of control signals in any technically feasible fashion that is consistent with the capabilities and interfaces implemented in the vehicle. Such control signals may provide, without limitation, any combination of visual feedback, auditory feedback, haptic vibrational feedback, tactile feedback, force feedback, proprioceptive sensory feedback, and so forth.
[0055] For example, and without limitation, the cognitive load feedback engine 380 could transmit the feedback signals 388 that configure the display device 214 to provide visual feedback regarding the current cognitive load 350, historical cognitive loads, and recommendations for reducing the driving task load and/or secondary task loads. If the vehicle is equipped with advanced driving features, then the cognitive load feedback engine 380 could increase the conservatism of the vehicle via the driving adjustment signals 382, such as decreasing a baseline at which the ADAS 250 notifies the driver of a lane departure. In some embodiments, the cognitive load feedback engine 380 may configure the entertainment subsystem 244 via the entertainment subsystem adjustment signals 384 to reduce distractions associated with an in-vehicle audio system. In yet other embodiments, the cognitive load feedback engine 380 may configure the navigation subsystem 246 via the navigation subsystem adjustment signals 386 to replace a current driving route with a new driving route that is less congested, thereby lowering the mental resources required to perform the primary driving task.
[0056] Figure 4 illustrates the relationship between the current driving context 370 and the current cognitive load 350 of Figure 3, according to various embodiments. As shown, the
current driving context 370 includes the driving task parameters 372, the secondary task parameters 378, the vehicle parameters 374, and the environmental parameters 376. In general, the driving task parameters 372 directly influence a driving task load 450 that represents the mental resources required to perform the primary driving task. By contrast, the secondary task parameters 378 directly influence a secondary task load 460 that represents the mental resources required to perform secondary tasks, such as talking on a cell phone.
[0057] Together, the driving task parameters 372, secondary task parameters 378, vehicle parameters 374, and environmental parameters 376 contribute to the current cognitive load 350. In particular, as the driving task load 450 and/or the secondary task load 460 increases, the current cognitive load 350 increases (depicted as an increasing cognitive load 472) within an overall cognitive load 470. The overall cognitive load 470 represents the total cognitive load of the driver and, within the overall cognitive load 470, a baseline cognitive load 474 reflects the typical cognitive loads of the driver.
[0058] As shown, initially the current cognitive load 350 exceeds the baseline cognitive load 474. In response, the cognitive load feedback engine 380 analyzes the current driving context 370 and transmits the navigation subsystem adjustment signal 386 "reroute via less congested roads" to the navigation subsystem 246, and the entertainment subsystem adjustment signal 384 "mute the audio system" to the entertainment subsystem 244. Subsequently, as a result of the reduction in the driving task load 450 and the secondary task load 460 attributable to, respectively, the navigation subsystem adjustment signal 386 and the entertainment subsystem adjustment signal 384, the current cognitive load 350 decreases and no longer exceeds the baseline cognitive load 474.
[0059] As the foregoing example illustrates, in general, if the current cognitive load 350 exceeds the baseline cognitive load 474, then the cognitive load feedback engine 380 attempts to adjust the current driving context 370 to either directly or indirectly reduce the current cognitive load 350. Accordingly, the level of driver distraction is reduced and the safety of the driver and surrounding drivers is increased.
[0060] Figure 5 is a flow diagram of method steps for managing cognitive load while driving, according to various embodiments. Although the method steps are described in conjunction with the systems of Figures 1-4, persons skilled in the art will understand that any system configured to implement the method steps, in any order, falls within the scope of the various embodiments.
[0061] As shown, a method 500 begins at step 504, where the cognitive load driving assistant 260 included in a vehicle receives sensor data via the driver-facing sensors 232 and the non-driver-facing sensors 234. The driver-facing sensors 232 may include any number of sensors that monitor characteristics of the driver. For example and without limitation, the driver-facing sensors 232 may include the pupil sensor 302, the heart rate sensor 304, the galvanic skin response (GSR) sensor 306, the blood pressure (BP) sensor 308, and the like. By contrast, the non-driver-facing sensors 234 monitor data that is not directly related to the driver, such as environmental data and vehicle data.
[0062] At step 506, the cognitive load driving assistant 260 computes the current cognitive load 350 based on the driver-facing sensor data. At step 508, the cognitive load driving assistant 260 computes the current driving context 370 based on the non-driver-facing sensor data in conjunction with other relevant environmental and vehicle data. The additional data may include any type of data received in any technically feasible fashion. For example, and without limitation, the additional data could include a location of the vehicle based on data received via the GNSS receiver 236 and locations of other vehicles based on V2X data 238. As persons skilled in the art will recognize, the cognitive load driving assistant 260 typically performs steps 506 and steps 508 substantially in parallel.
[0063] At step 510, the cognitive load driving assistant 260 transmits the current cognitive load 350 and the current driving context 370 to the cognitive load database 282 included in the cloud 280. Sharing cognitive data in this manner enables other cognitive load driving assistants 260 included in other vehicles to alert other drivers when the current cognitive load 350 indicates that the driver of the vehicle may pose a safety risk.
[0064] At step 512, the cognitive load feedback engine 380 computes the baseline cognitive load 474 based on historical cognitive load data in conjunction with historical driving contexts. The historical cognitive load data and the historical driving contexts may be stored in any memory, in any technically feasible fashion, and include any amount of data associated with any number of drivers. For example, and without limitation, the historical cognitive load data could be stored in the cognitive load database 282 and include data for many drivers. The cognitive load feedback engine 380 may compute the baseline cognitive load 474 in any technically feasible fashion. For example, and without limitation, the cognitive load feedback engine 380 could compute the baseline cognitive load 474 as the average of all historical cognitive loads associated with the driver.
[0065] At step 514, the cognitive load feedback engine 380 compares the current cognitive load 350 to the baseline cognitive load 474. If, at step 514, the cognitive load feedback engine 380 determines that the current cognitive load 350 is not greater than the baseline cognitive load 474, then the method 500 returns to step 504 where the cognitive load driving assistant 260 receives new sensor data. If, however, at step 514, the cognitive load feedback engine 380 determines that the current cognitive load 350 is greater than the baseline cognitive load 474, then the method 500 proceeds directly to step 516.
[0066] At step 516, the cognitive load feedback engine 380 provides feedback to the driver indicating the elevated current cognitive load 350. The cognitive load feedback engine 380 may provide the feedback in any technically feasible fashion and may include any additional data for reference. For example, and without limitation, the cognitive load feedback engine 380 could display an "elevated cognitive load" warning via the dashboard-mounted display device 214. The warning could include the current cognitive load 350 and an indication of how the current cognitive load 350 relates to the baseline cognitive load 474. In another example, and without limitation, the cognitive load feedback engine 380 could audibly warn the driver that the current cognitive load 350 indicates a dangerous driving situation.
[0067] At step 518, the cognitive load feedback engine 380 performs corrective actions designed to reduce the driving task load 450 and/or the secondary task load 460 based on the current driving context 370 and/or the historical driving contexts. For example, and without limitation, the cognitive load feedback engine 380 could determine that the current driving route is challenging and, in response, interact with the navigation subsystem 246 to suggest a less congested route for the vehicle. In another example, and without limitation, the cognitive load feedback engine 380 could determine that the number of secondary tasks that the driver is performing significantly exceeds the number of secondary tasks that the driver typically performs and, in response, interact with the entertainment subsystem 244 to mute the speakers.
[0068] The method 500 then returns to step 504 where the cognitive load driving assistant 260 receives new sensor data. The cognitive load driving assistant 260 continues to cycle through steps 504-518, assessing the current cognitive load 350 to detect and attempt to minimize situations associated with elevated cognitive loads until the vehicle or the cognitive load driving assistant 260 is turned off.
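The sense-compare-act cycle of method 500 (steps 504 through 518) can be compressed into a single loop body. The callables below are stand-ins for the subsystems described above; their names and signatures are invented for this sketch.

```python
# Compressed sketch of one iteration of method 500. The five callables are
# hypothetical stand-ins for the sensors, analyzer, and feedback engine
# described in the specification.
def assistant_step(sense, compute_load, baseline, warn, reduce_load):
    sensor_data = sense()             # step 504: read driver/non-driver sensors
    load = compute_load(sensor_data)  # steps 506/508: current cognitive load
    if load > baseline:               # step 514: compare to the baseline
        warn(load)                    # step 516: feedback to the driver
        reduce_load()                 # step 518: corrective actions
    return load
```

The surrounding system would call `assistant_step` repeatedly, mirroring the cycle through steps 504-518 until the vehicle or the assistant is turned off.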
[0069] In one embodiment, a cognitive driving assistant analyzes driver-facing sensor data and provides feedback regarding elevated driver cognitive loads to enable drivers to recognize
and react to dangerous driving environments. In operation, the cognitive driving assistant processes driver-facing sensor data to compute a current cognitive load. Substantially in parallel, the cognitive driving assistant processes non-driver-facing sensor data along with other relevant data, such as GNSS data, to generate a current driving context. The current driving context includes driving parameters, vehicle parameters, environmental parameters, and secondary task parameters.
[0070] Because the impacts of different "distractions," such as talking on a cellular phone, vary between individual drivers, a cognitive load feedback engine analyzes the current cognitive load of the driver with respect to historical cognitive loads of the driver in similar driving contexts. For example, if a current time included in the current driving context indicates night time lighting conditions, then the cognitive load feedback engine could compare the current cognitive load of the driver to historical cognitive loads in other driving contexts that indicate night time lighting conditions. If the cognitive load feedback engine determines that the current cognitive load is greater than the "baseline" cognitive load in similar driving contexts, then the cognitive load feedback engine initiates corrective action. The corrective action may include any type of passive feedback, such as an audible warning, or any type of active control, such as disabling a ringer of a cellular phone.
[0071] In some embodiments, the cognitive load feedback engine transmits the current cognitive load and/or the current driving context to a cognitive load database stored in a public cloud. Such information enables other cognitive load feedback engines operating in other vehicles to preemptively identify dangerous driving situations. For example, if the current cognitive load of the driver is elevated, then a cognitive load feedback engine in a second vehicle located in the immediate vicinity of the vehicle could notify the driver of the second vehicle that a distracted driver is nearby. [0072] At least one advantage of the disclosed approach is that because the cognitive load feedback engine enables drivers to adjust driving and/or secondary task behavior based on cognitive loads, driver safety may be increased. In particular, educating drivers on their cognitive load levels and/or the cognitive load levels of nearby drivers provides drivers with an opportunity to increase their concentration on the primary driving task during challenging driving situations and/or reduce their concentration on secondary tasks. Consequently, driver safety may be increased for the driver as well as nearby drivers.
[0073] The descriptions of the various embodiments have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. [0074] Aspects of the present embodiments may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[0075] Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0076] Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to
produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such processors may be, without limitation, general purpose processors, special-purpose processors, application-specific processors, or field-programmable processors or gate arrays.
[0077] The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[0078] While the preceding is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.
Claims
1. A computer-readable storage medium including instructions that, when executed by a processor, cause the processor to perform the steps of:
computing a current cognitive load associated with a driver while the driver is operating a vehicle based on data received via one or more sensors;
determining that the current cognitive load exceeds a threshold cognitive load; and
in response, causing one or more actions to occur that are intended to reduce the current cognitive load associated with the driver.
2. The computer-readable storage medium of claim 1, wherein causing one or more actions to occur comprises configuring a display screen to display the current cognitive load.
3. The computer-readable storage medium of claim 1, wherein causing one or more actions to occur comprises configuring an entertainment subsystem to reduce the volume of an audio signal.
4. The computer-readable storage medium of claim 1, wherein causing one or more actions to occur comprises causing a navigation subsystem to generate a modified driving route that has less cognitive complexity than a current driving route.
5. The computer-readable storage medium of claim 1, wherein the one or more sensors measure one or more physiological changes in a body of the driver.
6. The computer-readable storage medium of claim 5, wherein the one or more physiological changes are associated with at least one of a brain activity, a heart rate, a skin conductance, and sizes of pupils.
7. The computer-readable storage medium of claim 5, wherein computing the current cognitive load comprises:
performing a multiplication operation between a first measurement received via a first sensor included in the one or more sensors and a first weight to generate a first weighted measurement;
performing a multiplication operation between a second measurement received via a second sensor included in the one or more sensors and a second weight to generate a second weighted measurement; and
computing an average of the first weighted measurement and the second weighted measurement.
8. The computer-readable storage medium of claim 1, further comprising computing a confidence factor associated with the current cognitive load based on data received via one or more other sensors.
9. The computer-readable storage medium of claim 8, wherein at least one of the one or more sensors measures sizes of pupils and the one or more other sensors measure at least one of a brain activity, a heart rate, and a skin conductance.
10. The computer-readable storage medium of claim 1, further comprising computing the threshold cognitive load based on a plurality of previous cognitive loads computed for the driver.
11. The computer-readable storage medium of claim 1, further comprising computing the threshold cognitive load based on a plurality of current cognitive loads computed for a plurality of other drivers.
12. A method for sharing cognitive load while driving, the method comprising:
computing a first cognitive load associated with a first driver based on data received via one or more sensors, wherein the first driver and the one or more sensors are associated with a first vehicle; and
sharing the first cognitive load with a second driver that is associated with a second vehicle.
13. The method of claim 12, wherein sharing the first cognitive load comprises storing the first cognitive load in a cognitive load database included in a cloud that is accessible to the second vehicle.
14. The method of claim 12, wherein sharing the first cognitive load comprises transmitting vehicle-to-everything data that includes the first cognitive load to the second vehicle.
15. The method of claim 12, further comprising:
receiving a second cognitive load that is associated with the second driver; and
in response, modifying a characteristic of the first vehicle.
16. A system configured to manage cognitive load while driving, the system comprising:
a memory storing a cognitive load driving assistant; and
a processor that is coupled to the memory and, when executing the cognitive load driving assistant, is configured to:
compute a current cognitive load associated with a driver, while the driver is operating a vehicle, based on data received via one or more sensors;
determine that the current cognitive load exceeds a threshold cognitive load; and
take one or more actions that are intended to reduce the current cognitive load below the threshold cognitive load.
17. The system of claim 16, wherein taking one or more actions comprises configuring a display screen to display the current cognitive load.
18. The system of claim 16, wherein taking one or more actions comprises configuring an advanced driver assistance system (ADAS) to increase the conservatism of one or more safety features associated with the vehicle.
19. The system of claim 16, wherein the processor is further configured to:
compute a current driving context based on data received via one or more other sensors;
select a previous driving context from a cognitive load database based on the current driving context; and
assign the threshold cognitive load based on a previous cognitive load associated with the previous driving context.
20. The system of claim 16, wherein the processor is further configured to compute a confidence factor associated with the current cognitive load based on data received via one or more other sensors.
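The monitoring loop recited in claim 1 can be sketched in code. This is a minimal illustration only: the sensor names, the normalized load scale, the averaging rule, and the threshold value are invented assumptions, not details taken from the specification.

```python
# Minimal sketch of the claim 1 loop: compute a current cognitive load
# from sensor data, compare it against a threshold, and trigger
# mitigating actions when the threshold is exceeded. All names and
# values here are hypothetical illustrations.

def compute_cognitive_load(samples: dict) -> float:
    """Combine normalized sensor readings into a single load score."""
    return sum(samples.values()) / len(samples)

def mitigate_if_overloaded(load: float, threshold: float, actions: list) -> bool:
    """Run each mitigating action when the current load exceeds the threshold."""
    if load > threshold:
        for action in actions:
            action()  # e.g. lower audio volume, simplify the route
        return True
    return False

current_load = compute_cognitive_load({"pupil_dilation": 0.9, "heart_rate": 0.8})
overloaded = mitigate_if_overloaded(current_load, threshold=0.7, actions=[])
```

In practice the claimed actions (claims 2 through 4) would replace the empty `actions` list with callbacks into the display, entertainment, and navigation subsystems.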
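Claim 7's computation, weighting each of two sensor measurements and averaging the weighted results, can be sketched as below. The example weights and measurement values are invented for illustration.

```python
# Sketch of the computation recited in claim 7: multiply each sensor
# measurement by its weight, then average the two weighted measurements.
# The weights and measurements below are hypothetical.

def weighted_average_load(measurement1: float, weight1: float,
                          measurement2: float, weight2: float) -> float:
    weighted1 = measurement1 * weight1  # first weighted measurement
    weighted2 = measurement2 * weight2  # second weighted measurement
    return (weighted1 + weighted2) / 2.0

# e.g. a pupil-size measurement weighted 0.6 and a heart-rate
# measurement weighted 0.4
load = weighted_average_load(0.8, 0.6, 0.5, 0.4)
```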
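The sharing mechanism of method claims 12 through 15 can be sketched with a simple shared store standing in for the cloud cognitive load database. The store class, vehicle identifiers, and the example reaction are all assumptions made for illustration.

```python
# Sketch of claims 12-15: one vehicle shares its driver's cognitive
# load through a store (standing in for the cloud cognitive load
# database of claim 13); a second vehicle reads it and modifies a
# characteristic in response (claim 15). All specifics are invented.

class CognitiveLoadStore:
    """Minimal in-memory stand-in for a cloud-hosted cognitive load database."""
    def __init__(self):
        self._loads = {}

    def share(self, vehicle_id: str, load: float) -> None:
        self._loads[vehicle_id] = load

    def loads_from_others(self, vehicle_id: str) -> dict:
        """Cognitive loads shared by every vehicle except the caller."""
        return {v: l for v, l in self._loads.items() if v != vehicle_id}

store = CognitiveLoadStore()
store.share("vehicle_1", 0.82)                 # first driver's load is shared
nearby = store.loads_from_others("vehicle_2")  # second vehicle receives it
# e.g. the second vehicle modifies a characteristic in response
increase_following_distance = any(load > 0.7 for load in nearby.values())
```

Claim 14's vehicle-to-everything variant would replace the shared store with a direct transmission between the two vehicles.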
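Claim 19's threshold assignment, matching the current driving context against stored contexts and reusing the cognitive load recorded for the closest match, can be sketched as follows. The context features ("speed", "traffic"), the distance metric, and the database records are invented examples.

```python
# Sketch of claim 19: select the stored driving context most similar to
# the current one and assign its recorded cognitive load as the
# threshold. Features, metric, and records are hypothetical.

def assign_threshold(current_context: dict, database: list) -> float:
    """Return the cognitive load of the most similar previous context."""
    def distance(record):
        return sum((record["context"][key] - current_context[key]) ** 2
                   for key in current_context)
    closest = min(database, key=distance)
    return closest["load"]

previous_contexts = [
    {"context": {"speed": 30.0, "traffic": 0.2}, "load": 0.55},  # city driving
    {"context": {"speed": 90.0, "traffic": 0.8}, "load": 0.75},  # busy highway
]
threshold = assign_threshold({"speed": 85.0, "traffic": 0.7}, previous_contexts)
```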
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP21193247.0A EP3936363B1 (en) | 2015-01-12 | 2016-01-11 | Cognitive load driving assistant |
| US15/541,458 US10399575B2 (en) | 2015-01-12 | 2016-01-11 | Cognitive load driving assistant |
| EP16706029.2A EP3245093B1 (en) | 2015-01-12 | 2016-01-11 | Cognitive load driving assistant |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201562102434P | 2015-01-12 | 2015-01-12 | |
| US62/102,434 | 2015-01-12 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016115053A1 true WO2016115053A1 (en) | 2016-07-21 |
Family
ID=55410186
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2016/012908 Ceased WO2016115053A1 (en) | 2015-01-12 | 2016-01-11 | Cognitive load driving assistant |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US10399575B2 (en) |
| EP (2) | EP3936363B1 (en) |
| WO (1) | WO2016115053A1 (en) |
Families Citing this family (51)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6447481B2 (en) * | 2015-04-03 | 2019-01-09 | 株式会社デンソー | Startup proposal apparatus and startup proposal method |
| US10282666B1 (en) * | 2015-11-10 | 2019-05-07 | Google Llc | Coherency detection and information management system |
| DE112016006769B4 (en) * | 2016-05-20 | 2024-03-28 | Ford Global Technologies, Llc | Method for sign language input into a user interface of a vehicle and vehicle |
| US10474946B2 (en) * | 2016-06-24 | 2019-11-12 | Microsoft Technology Licensing, Llc | Situation aware personal assistant |
| US9919648B1 (en) | 2016-09-27 | 2018-03-20 | Robert D. Pedersen | Motor vehicle artificial intelligence expert system dangerous driving warning and control system and method |
| JP6365621B2 (en) * | 2016-10-19 | 2018-08-01 | マツダ株式会社 | Driving assistance device |
| US10474145B2 (en) * | 2016-11-08 | 2019-11-12 | Qualcomm Incorporated | System and method of depth sensor activation |
| US10334103B2 (en) * | 2017-01-25 | 2019-06-25 | International Business Machines Corporation | Message translation for cognitive assistance |
| CN115137336A (en) * | 2017-02-28 | 2022-10-04 | 松下知识产权经营株式会社 | Processing method, system and storage medium |
| US10572745B2 (en) * | 2017-11-11 | 2020-02-25 | Bendix Commercial Vehicle Systems Llc | System and methods of monitoring driver behavior for vehicular fleet management in a fleet of vehicles using driver-facing imaging device |
| US10850746B2 (en) | 2018-07-24 | 2020-12-01 | Harman International Industries, Incorporated | Coordinating delivery of notifications to the driver of a vehicle to reduce distractions |
| US11029171B2 (en) | 2018-08-28 | 2021-06-08 | Here Global B.V. | User familiarization with a novel route for reducing cognitive load associated with navigation |
| US11047697B2 (en) | 2018-08-28 | 2021-06-29 | Here Global B.V. | User familiarization with a novel route for reducing cognitive load associated with navigation |
| US10907986B2 (en) * | 2018-08-28 | 2021-02-02 | Here Global B.V. | User familiarization with a novel route for reducing cognitive load associated with navigation |
| US20200133308A1 (en) * | 2018-10-18 | 2020-04-30 | Cartica Ai Ltd | Vehicle to vehicle (v2v) communication less truck platooning |
| CN111080047A (en) * | 2018-10-22 | 2020-04-28 | 北京嘀嘀无限科技发展有限公司 | Method and device for judging completion condition of driving task and computer readable medium |
| US11488399B2 (en) | 2018-12-19 | 2022-11-01 | Magna Electronics Inc. | Vehicle driver monitoring system for determining driver workload |
| WO2020126020A1 (en) | 2018-12-20 | 2020-06-25 | Veoneer Sweden Ab | Triggering autonomous control based on driver cognitive load |
| JP2020111202A (en) * | 2019-01-11 | 2020-07-27 | 株式会社リコー | Display control device, display device, display system, moving body, program, image generation method |
| WO2020182281A1 (en) | 2019-03-08 | 2020-09-17 | Toyota Motor Europe | Electronic device, system and method for determining the perceptual capacity of an individual human |
| WO2020228947A1 (en) | 2019-05-15 | 2020-11-19 | Toyota Motor Europe | Electronic device, system and method for predicting the performance of an individual human during a visual perception task |
| CN110209394A (en) * | 2019-05-30 | 2019-09-06 | 西安交通大学城市学院 | A kind of individualized intelligent media interface method for building up of cognitive load driving |
| JP2021060927A (en) * | 2019-10-09 | 2021-04-15 | トヨタ自動車株式会社 | Information output control method and information output control system |
| US11308698B2 (en) * | 2019-12-05 | 2022-04-19 | Facebook Technologies, Llc. | Using deep learning to determine gaze |
| WO2021124140A1 (en) * | 2019-12-17 | 2021-06-24 | Indian Institute Of Science | System and method for monitoring cognitive load of a driver of a vehicle |
| US12210965B2 (en) | 2020-01-27 | 2025-01-28 | Honda Motor Co., Ltd. | Interpretable autonomous driving system and method thereof |
| GB2592034B (en) * | 2020-02-13 | 2022-02-09 | Ford Global Tech Llc | A cognitive overload sensing system and method for a vehicle |
| JP7354888B2 (en) * | 2020-03-17 | 2023-10-03 | トヨタ自動車株式会社 | Information processing device, program, and information processing method |
| US11735206B2 (en) | 2020-03-27 | 2023-08-22 | Harman International Industries, Incorporated | Emotionally responsive virtual personal assistant |
| US11702103B2 (en) * | 2020-04-02 | 2023-07-18 | Harman International Industries, Incorporated | Affective-cognitive load based digital assistant |
| US11878714B2 (en) * | 2020-04-06 | 2024-01-23 | Harman International Industries, Incorporated | Techniques for customizing self-driving models |
| US11211095B1 (en) * | 2020-06-19 | 2021-12-28 | Harman International Industries, Incorporated | Modifying media content playback based on user mental state |
| KR20220014945A (en) * | 2020-07-29 | 2022-02-08 | 현대모비스 주식회사 | System and method for monitering driver |
| CN115996666A (en) | 2020-09-11 | 2023-04-21 | 哈曼贝克自动系统股份有限公司 | System and method for determining cognitive demands |
| US11539762B2 (en) * | 2020-10-06 | 2022-12-27 | Harman International Industries, Incorporated | Conferencing based on driver state and context |
| US11548515B2 (en) | 2020-12-22 | 2023-01-10 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for managing driver habits |
| US12005922B2 (en) * | 2020-12-31 | 2024-06-11 | Honda Motor Co., Ltd. | Toward simulation of driver behavior in driving automation |
| US11691636B2 (en) * | 2021-01-05 | 2023-07-04 | Audi Ag | Systems and methods for generating a context-dependent experience for a driver |
| US12330658B2 (en) | 2021-04-20 | 2025-06-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Identifying an origin of abnormal driving behavior for improved vehicle operation |
| KR20240006527A (en) | 2021-05-11 | 2024-01-15 | 하만 베커 오토모티브 시스템즈 게엠베하 | Fused context sensors to improve biosignal extraction |
| US20220363264A1 (en) * | 2021-05-14 | 2022-11-17 | International Business Machines Corporation | Assessing driver cognitive state |
| CN118488909A (en) * | 2021-12-30 | 2024-08-13 | 哈曼贝克自动系统股份有限公司 | Method and system for driver monitoring using in-cabin contextual awareness |
| WO2023143721A1 (en) * | 2022-01-27 | 2023-08-03 | Lotus Tech Innovation Centre Gmbh | Car driver health monitoring method and car for implementing the method |
| DE102022102504B9 (en) * | 2022-02-03 | 2023-07-20 | Audi Aktiengesellschaft | Method for operating an interface device in a vehicle, and interface device and vehicle |
| FR3135242A1 (en) * | 2022-05-06 | 2023-11-10 | Psa Automobiles Sa | Method and device for controlling a display device on board a vehicle |
| US20240035841A1 (en) * | 2022-07-29 | 2024-02-01 | Nissan North America, Inc. | Navigation display system |
| US12030505B2 (en) * | 2022-11-10 | 2024-07-09 | GM Global Technology Operations LLC | Vehicle occupant mental wellbeing assessment and countermeasure deployment |
| EP4382335A1 (en) * | 2022-12-05 | 2024-06-12 | TomTom International B.V. | Method, apparatus and computer program for controlling output of information |
| US12403931B2 (en) | 2022-12-13 | 2025-09-02 | Magna Electronics Inc. | Vehicular driver monitoring system |
| US20250100348A1 (en) * | 2023-09-21 | 2025-03-27 | Mitsubishi Electric Automotive America, Inc. | Method for mitigating driver distraction |
| DE102024002293A1 (en) | 2024-07-15 | 2024-09-05 | Mercedes-Benz Group AG | Method for reducing the cognitive load of a driver of a motor vehicle, computer program product and motor vehicle |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1374773A1 (en) * | 2002-06-27 | 2004-01-02 | Pioneer Corporation | System for informing of driver's mental condition |
| EP1625040A1 (en) * | 2003-05-16 | 2006-02-15 | DaimlerChrysler AG | Method and device for influencing the demands on a driver in a motor vehicle |
| WO2007106722A2 (en) * | 2006-03-14 | 2007-09-20 | Temic Automotive Of North America, Inc. | System and method for determining a workload level of a driver |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7349782B2 (en) * | 2004-02-29 | 2008-03-25 | International Business Machines Corporation | Driver safety manager |
| US7938785B2 (en) * | 2007-12-27 | 2011-05-10 | Teledyne Scientific & Imaging, Llc | Fusion-based spatio-temporal feature detection for robust classification of instantaneous changes in pupil response as a correlate of cognitive response |
| JP5443141B2 (en) * | 2009-12-02 | 2014-03-19 | 株式会社デンソーアイティーラボラトリ | Workload indicator device, workload display method and program |
| JP5115611B2 (en) * | 2010-09-24 | 2013-01-09 | 株式会社デンソー | Call system, in-vehicle device, and exchange |
| US9055905B2 (en) * | 2011-03-18 | 2015-06-16 | Battelle Memorial Institute | Apparatuses and methods of determining if a person operating equipment is experiencing an elevated cognitive load |
| US20160007935A1 (en) * | 2014-03-19 | 2016-01-14 | Massachusetts Institute Of Technology | Methods and apparatus for measuring physiological parameters |
- 2016-01-11 EP EP21193247.0A patent/EP3936363B1/en active Active
- 2016-01-11 US US15/541,458 patent/US10399575B2/en active Active
- 2016-01-11 EP EP16706029.2A patent/EP3245093B1/en active Active
- 2016-01-11 WO PCT/US2016/012908 patent/WO2016115053A1/en not_active Ceased
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3187358A3 (en) * | 2015-12-29 | 2017-10-18 | Thunder Power New Energy Vehicle Development Company Limited | Onboard system for mitigating distraction risk |
| US9993191B2 (en) | 2015-12-29 | 2018-06-12 | Thunder Power New Energy Vehicle Development Company Limited | Onboard system for mitigating distraction risk |
| EP3444820A1 (en) * | 2017-08-17 | 2019-02-20 | Dolby International AB | Speech/dialog enhancement controlled by pupillometry |
| US12205581B2 (en) | 2017-08-17 | 2025-01-21 | Dolby International Ab | Speech/dialog enhancement controlled by pupillometry |
| WO2019102290A1 (en) * | 2017-11-27 | 2019-05-31 | Blubrake S.R.L. | Adaptive brake assist system for a cyclist on a bicycle by an aptic feedback |
| CN111741878A (en) * | 2017-11-27 | 2020-10-02 | 布鲁布拉克有限责任公司 | Adaptive Brake Assist for Riders on Bikes with Haptic Feedback |
| US11260836B2 (en) | 2017-11-27 | 2022-03-01 | Blubrake S.R.L. | Adaptive brake assist system for a cyclist on a bicycle by an haptic feedback |
| FR3089323A1 (en) * | 2018-12-04 | 2020-06-05 | Psa Automobiles Sa | Method for determining a mode of interaction of a virtual personal assistant on board a land motor vehicle |
| US12420707B2 (en) * | 2022-06-24 | 2025-09-23 | Magna Electronics Inc. | Vehicular control system with cross traffic alert and collision avoidance |
Also Published As
| Publication number | Publication date |
|---|---|
| US20180009442A1 (en) | 2018-01-11 |
| EP3245093A1 (en) | 2017-11-22 |
| EP3245093B1 (en) | 2021-09-01 |
| EP3936363B1 (en) | 2025-04-30 |
| US10399575B2 (en) | 2019-09-03 |
| EP3936363A1 (en) | 2022-01-12 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10399575B2 (en) | Cognitive load driving assistant | |
| US10493914B2 (en) | System and method for vehicle collision mitigation with vulnerable road user context sensing | |
| JP7455456B2 (en) | Method and system for enhanced warning based on driver status in hybrid driving | |
| US12130622B2 (en) | Manual control re-engagement in an autonomous vehicle | |
| EP3889740B1 (en) | Affective-cognitive load based digital assistant | |
| KR20200108827A (en) | Method and system for switching driving mode based on self-awareness parameters in hybrid driving | |
| US20180000398A1 (en) | Wearable device and system for monitoring physical behavior of a vehicle operator | |
| US20150302718A1 (en) | Systems and methods for interpreting driver physiological data based on vehicle events | |
| CN107336710A (en) | Drive consciousness estimating device | |
| WO2018190152A1 (en) | Information processing device, information processing method, and program | |
| US11912267B2 (en) | Collision avoidance system for vehicle interactions | |
| US10717443B2 (en) | Occupant awareness monitoring for autonomous vehicles | |
| WO2019122968A1 (en) | Method and system for risk control in switching driving mode | |
| EP3729399B1 (en) | Method and system for adapting augmented switching warning | |
| JP2018097485A (en) | Driving support apparatus, driving support method, driving support program, and driving support system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16706029; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 15541458; Country of ref document: US |
| | REEP | Request for entry into the european phase | Ref document number: 2016706029; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |