US20210009149A1 - Distractedness sensing system - Google Patents
- Publication number
- US20210009149A1 (application US 17/033,383)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- criterion
- controller
- driver
- distraction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/085—Changing the parameters of the control units, e.g. changing limit values, working points by control input
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4058—Detecting, measuring or recording for evaluating the nervous system for evaluating the central nervous system
- A61B5/4064—Evaluating the brain
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/002—Seats provided with an occupancy detection means mounted therein or thereon
- B60N2/0021—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
- B60N2/0022—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for sensing anthropometric parameters, e.g. heart rate or body temperature
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/002—Seats provided with an occupancy detection means mounted therein or thereon
- B60N2/0021—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
- B60N2/0023—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for detection of driver fatigue
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/002—Seats provided with an occupancy detection means mounted therein or thereon
- B60N2/0021—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
- B60N2/0024—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for identifying, categorising or investigation of the occupant or object on the seat
- B60N2/0029—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for identifying, categorising or investigation of the occupant or object on the seat for detecting the motion of the occupant
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/002—Seats provided with an occupancy detection means mounted therein or thereon
- B60N2/0021—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
- B60N2/003—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement characterised by the sensor mounting location in or on the seat
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W50/16—Tactile feedback to the driver, e.g. vibration or force feedback to the driver on the steering wheel or the accelerator pedal
-
- G06K9/00845—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/002—Seats provided with an occupancy detection means mounted therein or thereon
- B60N2/0021—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement
- B60N2/0024—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for identifying, categorising or investigation of the occupant or object on the seat
- B60N2/0027—Seats provided with an occupancy detection means mounted therein or thereon characterised by the type of sensor or measurement for identifying, categorising or investigation of the occupant or object on the seat for detecting the position of the occupant or of occupant's body part
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/90—Details or parts not otherwise provided for
- B60N2002/981—Warning systems, e.g. the seat or seat parts vibrates to warn the passenger when facing a danger
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2210/00—Sensor types, e.g. for passenger detection systems or for controlling seats
- B60N2210/10—Field detection presence sensors
- B60N2210/16—Electromagnetic waves
- B60N2210/22—Optical; Photoelectric; Lidar [Light Detection and Ranging]
- B60N2210/24—Cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0881—Seat occupation; Driver or passenger presence
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/26—Incapacity
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
Definitions
- the present disclosure relates to systems with integrated sensors to provide sensed information about a person's distracted state.
- the system may include an electro-dermal potential (EDP) sensing system at least partially integrated into the vehicle cabin (in a single location or a combination of locations, e.g., the vehicle seat, headliner, structural pillars, instrument panels, and steering wheel) to sense a person and configured to output an electro-dermal potential signal; at least one additional sensor to sense additional data that can be used to determine distractedness; and a controller to receive the electro-dermal potential signal from the electro-dermal potential sensing system and the data from the additional sensor, and to determine a distraction state of the person using both the electro-dermal potential signal and the vehicle-related data, which reduces the likelihood of a false distraction state compared with using only one of the vehicle-related data or the electro-dermal potential signal. The controller is to output a control signal when the distraction state is identified or exceeds a distraction threshold.
- the system can also determine when there is or is not a false distraction.
- the controller is configured to determine a false distraction state based on both the vehicle-related data and the electro-dermal potential signal.
- the control signal is to adjust operation of an adaptive braking system in the vehicle.
- the system includes a seat configured to support the person as an occupant and to be mounted in a vehicle; and wherein the electro-dermal potential sensing system includes a contactless sensor mounted in the seat adjacent a head of the occupant.
- the seat may be configured to support an occupant and to be mounted in a vehicle.
- An electro-dermal potential sensing system is at least partially integrated into the seat to sense physiological properties of an occupant, e.g., a driver, and configured to output an electro-dermal potential signal.
- the sensed physiological properties of the occupant can include brain cortical activity.
- a controller is positioned in the vehicle to receive the electro-dermal potential signal from the electro-dermal potential sensing system to determine a distraction state of the driver.
- the controller also determines a false distraction state using the distraction state and other sensor signals in the vehicle. The controller is to output a control signal when the distraction state exceeds a distraction threshold and a false distraction has not been determined.
- the control signal is to adjust operation of a collision avoidance system or an adaptive braking system in the vehicle.
- the electro-dermal potential system includes a plurality of contactless sensors mounted in the vehicle cabin.
- the seat includes a headrest.
- the plurality of contactless sensors includes one or more headrest sensors mounted in the headrest to measure electro-dermal potential at a head of the driver.
- the seat includes a driver warning device to indicate to the driver that the control signal is output from the controller.
- the controller measures driver distraction based on individual frequency components, or ratios thereof, in the electro-dermal potential signal.
- the controller uses the electro-dermal potential signal as an input to determine driver distraction and when distraction is detected outputs the control signal to increase a time to impact variable in an object avoidance calculation.
- the sensor signals include a video output from a cabin camera to detect the driver.
- the controller can use the video output and the electro-dermal potential signal to determine the distraction state of the driver.
- the sensor signals include a navigational position signal from a navigational position sensor to detect position of the vehicle.
- the controller can use the navigational position signal and the electro-dermal potential signal to determine if there is a false distraction state of the driver.
- the sensor signals include an external camera signal from an outward facing imager to produce video external to the vehicle.
- the controller can use the external camera signal and the electro-dermal potential signal to determine the false distraction state of the driver.
- the sensor signals include an internal video signal, an external camera signal, a navigational position signal, and a vehicle speed signal.
- the controller can use the internal video signal, the external camera signal, the navigational position signal, the vehicle speed signal and the electro-dermal potential signal to determine a possible false distraction state of the driver or to correct the distraction state by any number of countermeasures.
- a vehicle system includes a vehicle safety sensor system configured to sense external objects around the vehicle and output an external sensor signal.
- the vehicle system may also include a seat configured to support an occupant and to be mounted in a vehicle and an electro-dermal potential system at least partially integrated into the seat and configured to output an electro-dermal potential signal.
- the electro-dermal potential system includes a plurality of contactless sensors mounted in the vehicle.
- a controller is to receive the electro-dermal potential signal from the electro-dermal potential system and the external sensor signal and to output a control signal, using the electro-dermal potential signal and the external sensor signal, to adjust operation of the vehicle safety sensor system in the vehicle.
- the vehicle safety sensor system includes a detection and ranging system with a range setting to sense objects outside the vehicle, including a position and a range of an external object, e.g., a natural obstacle, another vehicle, an animal, or a person, and the external sensor signal includes the position and range of the external object.
- the controller outputs a range extension signal when the controller determines that the driver is distracted or lacking in focus on driving a vehicle using the electro-dermal potential signal, and wherein the vehicle safety system extends the range setting when the controller outputs the range extension signal.
- the vehicle safety sensor system includes a light sensor, a LIDAR, a camera, or combinations thereof.
- the vehicle safety sensor system includes a radio frequency sensor, RADAR or both.
- the vehicle system includes a collision avoidance system having a trigger time based on the control signal from the controller.
- the collision avoidance system is configured to trigger an avoidance action based on the trigger time.
- the collision avoidance system has a first trigger time when distraction is not detected and a second trigger time when distraction is detected.
- the second trigger time being less than the first trigger time.
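The two-trigger-time scheme above can be sketched in a few lines. This is an illustrative sketch, not the patent's implementation; the function names and the specific time values are assumptions chosen only to show that the distracted-driver trigger time is shorter, so avoidance fires earlier.

```python
# Illustrative values (assumed): the second trigger time is less than the first.
FIRST_TRIGGER_TIME_S = 2.0   # used when no distraction is detected
SECOND_TRIGGER_TIME_S = 1.2  # used when distraction is detected

def select_trigger_time(distraction_detected: bool) -> float:
    """Return the time-to-trigger for the avoidance action."""
    return SECOND_TRIGGER_TIME_S if distraction_detected else FIRST_TRIGGER_TIME_S

def should_trigger_avoidance(time_to_impact_s: float, distraction_detected: bool) -> bool:
    """Trigger braking/steering once predicted time-to-impact falls below the trigger time."""
    return time_to_impact_s <= select_trigger_time(distraction_detected)
```

Note that a *shorter* trigger time here means the system waits until the situation is more imminent before overriding the driver; other embodiments in the text instead lengthen reaction margins (e.g., increasing a time-to-impact variable) when distraction is detected.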
- a vehicle system uses at least two sensors sensing two different criteria that, when processed by a controller, produce an indication of distractedness or focus of the occupant or driver.
- a first sensor senses a first criterion relating to distracted driving and controlled by the driver.
- a second sensor senses a second criterion relating to distracted driving and representing an environmental condition not controlled by the driver.
- a controller receives the first criterion and the second criterion and determines a relative relationship between the first criterion and the second criterion with the relative relationship exceeding a distractedness threshold to indicate distracted driving.
- the first criterion is vehicle speed and the second criterion is traffic congestion, the speeds of other vehicles adjacent the vehicle, or a combination of both.
- the controller compares the vehicle speed relative to the second criterion, and when the vehicle speed slows relative to the second criterion, the controller will indicate distractedness of the driver.
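The speed-versus-traffic comparison can be sketched as follows. This is a minimal illustration of the relative-relationship idea only: the threshold value, function names, and the use of a simple mean of adjacent speeds are assumptions, not values from the patent.

```python
# Flag distraction when own speed drops more than 25% below surrounding
# traffic speed (threshold is an illustrative assumption).
DISTRACTEDNESS_THRESHOLD = 0.25

def is_distracted(own_speed_kph: float, adjacent_speeds_kph: list[float]) -> bool:
    """Compare the driver-controlled criterion (own speed) with the
    environmental criterion (mean speed of adjacent vehicles)."""
    if not adjacent_speeds_kph:
        return False  # no environmental reference available
    traffic_speed = sum(adjacent_speeds_kph) / len(adjacent_speeds_kph)
    if traffic_speed <= 0:
        return False  # stopped traffic: slow own speed is expected, not distraction
    shortfall = (traffic_speed - own_speed_kph) / traffic_speed
    return shortfall > DISTRACTEDNESS_THRESHOLD
```

The guard on stopped traffic reflects the document's emphasis on suppressing false indications: slowing in congestion is environmental, not a driver lapse.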
- FIG. 1 is a schematic view of a vehicle according to an example embodiment.
- FIG. 2 is a schematic view of a vehicle seat with sensors therein according to an example embodiment.
- FIG. 3A is a functional block diagram of a vehicle system according to an example embodiment.
- FIG. 3B is a functional block diagram of a vehicle system according to an example embodiment.
- FIG. 4 is a chart of false distraction detection according to an example embodiment.
- the present disclosure is generally directed to vehicle mounted sensors that can be embedded at least partially in the vehicle cabin or in any part of the foam, trim, headrest, frame or a combination thereof of a vehicle seat.
- the sensors can also be positioned in the headliner, the instrument panel, structural pillars, the steering wheel, or combinations thereof. At least one of the sensors determines the electro-dermal potential originating primarily from brain cortical activity.
- EDP sensing can be contact or non-contact (e.g., field sensing) and can also sense muscle activity and skin characteristics. This can reveal high-level central nervous system (CNS) functions such as distraction or drowsiness.
- the systems described herein employ real-time processing of the electrical potential fluctuations, e.g., comparing various frequency bands of the sensed signal with respect to each other. These can act as the primary brain activity quantitative classifiers.
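The band-comparison idea above can be illustrated with a short sketch: compute the signal power of a sampled EDP trace in two frequency bands via a naive DFT and take their ratio as a crude classifier input. The band edges (4-8 Hz vs 8-13 Hz) and the interpretation of the ratio are assumptions for illustration; the patent does not specify bands or an algorithm.

```python
import math

def band_power(samples: list[float], fs: float, lo_hz: float, hi_hz: float) -> float:
    """Sum of DFT magnitude-squared over bins whose frequency lies in [lo_hz, hi_hz)."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):  # skip DC, positive frequencies only
        freq = k * fs / n
        if lo_hz <= freq < hi_hz:
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(samples))
            im = -sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(samples))
            power += re * re + im * im
    return power

def band_ratio(samples: list[float], fs: float) -> float:
    """Ratio of low-band (4-8 Hz) to mid-band (8-13 Hz) power (illustrative bands)."""
    low = band_power(samples, fs, 4.0, 8.0)
    mid = band_power(samples, fs, 8.0, 13.0)
    return low / mid if mid > 0 else float("inf")
```

A production system would use a windowed PSD estimate (e.g., Welch's method) rather than a raw DFT, but the classifier input is the same kind of band-to-band ratio.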
- the present systems may use the sensed signals along with other sensor information to determine false positives of distraction based on the sensed EDP signal. This system, through the acquisition of the appropriate physiological metrics and use of a software algorithm, can determine if the occupant is distracted and not attentive to the road task of the moment while correcting for false positive indications of distraction.
- a distractedness sensing system can be integrated with the seat including one or more sensors embedded in any part of the seat, e.g., the foam, the trim, the headrest or a combination thereof.
- the contactless EDP sensing system can be supplemented by appropriate physiological metrics [heart rate, heart rate variability (HRV), Cardiorespiratory Coupling/Synchrogram (CRS), breathing rate, EDP pattern shift and the like, for both standard and complex non-linear dynamics] of the seat occupant, e.g., the driver.
- a controller can receive the relevant sensed physiological metric signals and determine if the occupant is distracted and therefore whether attention and reaction time are affected.
- the controller can be adapted to individual occupants using an automated user-specific calibration.
- This system can also include cameras strategically positioned to look at the driver. Inward cameras can be used in conjunction with the seat sensors to achieve sensor fusion and increase specificity and accuracy of the distraction level detection.
- the camera generates multiple images of the occupant, which can be analyzed to determine additional occupant metrics.
- the metrics can include head position, blink rate, pupil dilation, eye position, fixation, gaze patterns, eyelid closure, head movement, facial expression, overall skeletal position, and the like.
- the camera system takes an image and image processing circuitry analyzes the image to determine the image metrics.
- the use of various metrics from different sources provides an objective quantification of distraction of the occupant.
- the distraction quantification can be combined with other data in the vehicle to prevent false indications of distraction, e.g., vehicle performance, driving environment, and the like. If the distraction quantification level exceeds a distraction threshold, then the vehicle may automatically trigger countermeasures, e.g., alerts, alarms, collision avoidance, and the like. If the distraction status of the driver is quantified, the vehicle can change reaction times of the collision avoidance system, e.g., the adaptive braking system, to optimize the response of the system itself in view of the driver condition as at least partly determined by the distraction level.
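The threshold-and-veto flow described above can be sketched as a small decision function: an EDP-derived distraction score triggers a countermeasure only when it exceeds the threshold *and* vehicle context does not mark it a false positive. All names, the threshold value, and the parked-vehicle veto are illustrative assumptions.

```python
DISTRACTION_THRESHOLD = 0.7  # illustrative; the patent defines no numeric scale

def is_false_distraction(vehicle_speed_kph: float) -> bool:
    """Context veto: with the vehicle effectively parked, an EDP distraction
    reading carries no driving risk and is treated as a false positive here."""
    return vehicle_speed_kph < 1.0

def countermeasure(edp_score: float, vehicle_speed_kph: float) -> str:
    """Return the countermeasure for a fused distraction assessment."""
    if edp_score <= DISTRACTION_THRESHOLD:
        return "none"
    if is_false_distraction(vehicle_speed_kph):
        return "none"  # suppressed as a false distraction
    return "alert"     # e.g., audio warning plus adjusted avoidance reaction time
```

In the full system the veto would draw on more context (internal and external video, navigational position, traffic state), but the gating structure is the same.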
- a vehicle system uses at least two sensors sensing two different criteria that, when processed by a controller, produce an indication of distractedness or focus of the occupant or driver.
- a first sensor senses a first criterion relating to distracted driving and controlled by the driver.
- a second sensor senses a second criterion relating to distracted driving and representing an environmental condition not controlled by the driver.
- a controller receives the first criterion and the second criterion and determines a relative relationship between the first criterion and the second criterion with the relative relationship exceeding a distractedness threshold to indicate distracted driving.
- FIG. 1 shows a vehicle 100 including a cabin 115 and an engine bay 116 , which can be forward of the cabin 115 .
- the engine bay 116 houses a motor 101 that provides motive power to the vehicle.
- a controller 102 includes an electrical signal processor adapted to execute tasks, which can be stored in a memory. The tasks can process sensed signals according to rules loaded into the controller 102 . The sensed data can be stored in memory associated with the controller 102 .
- Visual systems 103 are provided to receive instructions from the controller 102 and produce visual displays in the vehicle, e.g., on cabin display screens, the dashboard, or a mobile electronic device associated with the vehicle.
- the displays produced by the visual systems can be images sensed by an internal camera 104 , an external camera 105 , collision warnings, distraction warnings, and the like.
- the visual system 103 can process the image data from the cameras 104 , 105 before providing the image data to the controller 102 .
- the visual system 103 can process images to identify objects and the position of the driver in an example embodiment. This data can be provided to the controller 102 .
- An audio system 106 can be part of a head unit in the vehicle.
- the head unit can be an electronic processor to process audio signals or sensed signals in the vehicle.
- the audio system 106 can sense audio in the cabin 115 and output audio into the cabin, e.g., using multiple speakers.
- the audio output from the audio system 106 can be warnings as described herein based on instruction from the controller 102 .
- the audio warnings can be spoken words or tones to indicate driver distraction, change in settings, imminent danger, activation of collision warning system or combinations thereof.
- a vehicle speed sensor 107 is provided to detect the speed of the vehicle and provide a speed signal to the controller 102 .
- the vehicle speed sensor can include the throttle position sensor.
- a navigational position system 108 detects the position of the vehicle by receipt of satellite signals or ground based position signals.
- the navigational position system 108 can include a global navigation satellite system (GNSS) such as Global Positioning System (GPS), Beidou, COMPASS, Galileo, GLONASS, Indian Regional Navigational Satellite System (IRNSS), or QZSS.
- the navigational system can include a receiver that receives differential correction signals, in North America from the FAA's WAAS (Wide Area Augmentation System).
- the navigational position system 108 provides accurate position of the vehicle to the controller 102 .
- a distraction alarm 109 is positioned in the cabin 115 .
- the distraction alarm 109 can include mechanical alarms like vibration devices that can be positioned in the steering wheel or the seat.
- the distraction alarm 109 can be a signal to vibrate a mobile electronic device associated with the vehicle and a passenger in the vehicle.
- a vehicle seat 110 is positioned in the cabin 115 and is configured to support a person, e.g., a driver or a passenger.
- the seat 110 can include a plurality of sensors 150 , 155 , 156 to detect various biometric characteristics of the person.
- the sensors 150 can be contactless and can sense EDP adjacent the head of the seated person.
- the sensors 155 and 156 can detect other biometric information.
- the sensors 155 , 156 can be contactless, e.g., sensing parameters from the occupant without physically contacting the occupant. In some instances, at least one of the sensors 156 can contact the occupant.
- a brake system 111 is provided to brake the wheels of the vehicle.
- the brake system 111 can be activated by the driver and can also be activated automatically by the controller 102 , e.g., when distracted driving is detected, a crash is detected as imminent, or an imminent danger is detected as described herein.
- a laser sensing system 112 e.g., a LIDAR, is provided.
- the laser sensing system 112 emits light in pulses and detects the light returned after the light reflects off objects external to the vehicle 100 .
- the laser sensing system 112 can produce a digital three-dimensional representation of the external environment around the vehicle in the direction of the light pulses.
- the laser sensing system 112 can perform laser scanning to produce a representation around the vehicle.
- the external environment can include other vehicles, signs, animals, people, and other objects.
- the representation or individually identified objects can be provided to the controller 102 for use in the vehicle as described herein.
- a RADAR sensing system 113 is provided in the vehicle.
- the RADAR sensing system 113 emits radio frequency energy pulses and detects the returned pulses to identify objects around the vehicle or map the external environment.
- the representation or individually identified objects can be provided to the controller 102 for use in the vehicle as described herein.
- controller 102 may provide inputs to these other systems.
- FIG. 2 shows the vehicle seat 110 configured to be fixed in a cabin of a motor vehicle.
- the seat 110 is adapted to support a person on a base 201 in an upright position against a seat back 202 .
- the base 201 is fixed to the floor of the vehicle cabin, e.g., by rails.
- a headrestraint 203 may be positioned at the top of the seat back and act as a headrest.
- Each of the base 201 , seat back 202 , and headrestraint 203 includes a rigid frame, comfort layers on the frame, and an external covering.
- a plurality of sensors 150 , 155 , 156 can be supported in the seat.
- a plurality of first sensors 150 may be positioned in the headrest 203 and adapted to sense EDP signals from the occupant of the seat 110 .
- a plurality of second sensors 155 may be positioned in the seat back 202 .
- the plurality of second sensors 155 may also sense EDP signals from the seated occupant.
- the plurality of second sensors 155 may include at least one sensor that does not sense EDP signals.
- One or more third sensors 156 are positioned in the seat base 201 .
- the third sensors 156 may also sense EDP signals.
- the plurality of second sensors 155 may include at least one sensor that does not sense EDP signals and may, e.g., sense presence of a person in the seat using sensors in the seat back or seat and sense weight of the occupant of the seat using sensors in the seat base.
- the sensors 150 develop raw EDP signals, which are filtered to produce analysis signals including frequency components relevant to the EDP of the person in the seat while attenuating unrelated frequency components.
- a method is provided for monitoring a mental state of a person having a body including a head positioned at the headrestraint, adjacent the sensors in the headrestraint.
- the method also includes positioning a sensor at least proximate to portions of the skin of the body below the head to develop raw signals, and processing the raw signals to produce at least one bandpass-filtered state-indicating signal representative of raw signal magnitude within a predetermined frequency range as an indication of the mental state (e.g., distracted state) of the person.
- At least one sensor 150 is positioned to be at the posterior of the head near or at the occipital-visual cortical region. This may assist in accurately measuring brain waves, e.g., through EDP.
- visual habituation is the brain's ability to decrease its response to repeated stimuli once the information has been processed and is no longer perceived as a relevant processing demand.
- the occupant should not experience significant habituation patterns, as the visual scenery, though mundane at times, is in continuous variation and the conditions demand attention in such areas. Lack of activity related to visual processing, or habituation of visual stimuli, can serve as a subset classification of potential distraction in addition to other brain wave responses and secondary monitoring systems.
- FIG. 3A shows a schematic view of a process 300 that can be implemented to determine distractedness using sensors, e.g., in a vehicle 100 .
- the controller monitors sensed data from an array of sensors 303 associated with the vehicle.
- the sensors can be any of the sensors described herein. Examples of sensors in the sensor array 303 include the EDP sensor, internal and external imagers, laser based sensors, seat sensors, and the like.
- the sensor array can include up to N sensors, where N is any positive integer.
- the sensor array 303 can monitor a driver or an occupant of the vehicle seat.
- the monitoring can include EDP sensing using the contactless sensors 150 .
- the EDP signals are used to detect a distraction state of the driver.
- the EDP signals can be separated into various sub-signals, e.g., at different frequencies, by using filters to allow certain divisions into sub-bands. These sub-bands may overlap in frequency ranges. A general range of frequencies for each sub-band can be defined within a reasonable variance.
- a first sub-signal can be up to four hertz.
- a second sub-signal can be four hertz to seven hertz.
- a third sub-signal can be seven hertz to fourteen hertz.
- a fourth sub-signal can be fourteen hertz to about thirty hertz.
- a fifth sub-signal can be about thirty hertz to about one hundred hertz. Other sub-signals may overlap these ranges for the first through fifth sub-signals, e.g., from eight hertz to thirteen hertz.
- the relationships between these sub-signals can be used to determine whether the driver is distracted from the task of driving.
- the patterns of the sub-signals or the ratios of multiple sub-signals to each other can be used to determine if a distraction is occurring.
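The sub-band separation described above can be illustrated with a small, self-contained sketch. The band edges follow the text; the sample rate, the use of a plain DFT, and the synthetic test signal are assumptions for illustration only, not details from the disclosure.

```python
import cmath
import math

FS = 128  # assumed sample rate in Hz

# Band edges follow the sub-signal ranges in the text; the fifth band
# is capped at the Nyquist frequency for this sample rate.
BANDS = {
    "first (<4 Hz)":     (0.5, 4),
    "second (4-7 Hz)":   (4, 7),
    "third (7-14 Hz)":   (7, 14),
    "fourth (14-30 Hz)": (14, 30),
    "fifth (30+ Hz)":    (30, 64),
}

def dft_power(signal, k):
    """Signal power at DFT bin k (frequency k * FS / len(signal) Hz)."""
    n = len(signal)
    s = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
            for i, x in enumerate(signal))
    return abs(s) ** 2

def band_powers(signal, fs=FS):
    """Total power of the signal within each sub-band."""
    n = len(signal)
    out = {}
    for name, (lo, hi) in BANDS.items():
        bins = [k for k in range(n // 2 + 1) if lo <= k * fs / n < hi]
        out[name] = sum(dft_power(signal, k) for k in bins)
    return out

# One second of a synthetic signal dominated by a 10 Hz component
# (third sub-band) with a weaker 20 Hz component (fourth sub-band).
t = [i / FS for i in range(FS)]
sig = [math.sin(2 * math.pi * 10 * x) + 0.1 * math.sin(2 * math.pi * 20 * x)
       for x in t]
powers = band_powers(sig)
print(max(powers, key=powers.get))  # third (7-14 Hz)
```

Ratios between entries of `powers`, or their evolution over time, could then feed the pattern comparison the text describes.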
- the sensor array 303 can include a vehicle cabin imager, e.g., a camera, that is used to detect the driver in the vehicle seat.
- the camera data is used to detect a distraction pattern in the driver.
- the camera can detect movement or lack of movement of the driver, facial features of the driver or both.
- the camera data can be video signals sent to a data processor in the controller to determine if the driver matches a distraction pattern.
- the data processor can determine if the actions of the driver or the image of the driver matches a known distraction pattern. Examples of distraction patterns can include head position or eye position not directed forward in a manner of a driver looking out the windshield.
- the sensor array 303 can include external imagers, cameras, LIDAR (Light Detection and Ranging), RADAR (RAdio Detection and Ranging), and SONAR (SOund Navigation and Ranging). These systems can detect objects around the vehicle, e.g., other vehicles, stop signs, stop lights, and the like.
- the imagers can detect the color and shape of external objects.
- LIDAR and RADAR can detect the size and position of external objects relative to the vehicle, which may be in motion.
- the sensed data from each of the individual sensors are compared to thresholds for sensed data of the respective sensors.
- the image data from an imager is compared to prior image data, e.g., the change in pixels can be used to indicate that a threshold in the image data has been exceeded.
- the EDP data can be compared at multiple frequencies to determine if an EDP signal, e.g., a brain wave, exceeds an EDP threshold to indicate distractedness.
- the EDP data can also be compared to EDP patterns over a time period with these patterns being indicative of a person who is focused or a person who is distracted.
- a LIDAR, RADAR, or sonar sensor can detect the relative position of the vehicle compared to objects or other vehicles.
- a navigational sensor can be used to determine the location of the vehicle and provide speed data for the present vehicle and other vehicles around the present vehicle being operated by the person for whom distractedness is being determined.
- Seat sensors can determine the position of the person, metabolic and physiological parameters, biometric parameters, EDP, and other data related to the person.
- Each of the sensed data can be compared to thresholds that are stored in memory in the vehicle.
- the memory is associated with the vehicle controller.
- the process 300 relies on distractedness being determined based on at least data from two sensors.
- the primary sensor can be the EDP sensor, which can be weighted more heavily in the process to control the vehicle. If only one sensor indicates distractedness, then the process moves to step 306 and returns to monitoring at step 301 . If two or more sensors indicate distractedness via triggered thresholds, then it is determined whether the determination of distractedness is false at step 310 . In an example, the EDP data from the EDP sensor by itself may falsely indicate distractedness.
- the other data can be combined with the EDP data or sensor results to determine if the EDP is falsely indicating distractedness or that the person is focused on a different task.
- An example of a process for determining falseness is described in greater detail with regard to FIG. 4 .
- the combined data from the sensor array determines whether the driver is distracted by using the results from two or more sensors or using all of the results from each sensor. If distractedness is determined at step 315 , then the vehicle can initiate countermeasures at step 317 . If distractedness is not found, i.e., the person is focused, then the process moves to step 313 and returns to the controller monitoring the sensors 301 .
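The voting flow of process 300 can be sketched as a small decision function. The sensor names and the shape of the false-alarm flag are illustrative assumptions; only the step numbers come from the text.

```python
# Minimal sketch of process 300: at least two sensors must exceed their
# thresholds, a false-alarm check (step 310) is applied, and only then
# are countermeasures initiated (step 317).

def process_300_step(triggered_sensors, false_alarm):
    """triggered_sensors: names of sensors whose thresholds were met.
    false_alarm: result of the step-310 falseness determination.
    Returns the next action: 'monitor' or 'countermeasures'."""
    if len(triggered_sensors) < 2:
        return "monitor"          # step 306 -> back to step 301
    if false_alarm:
        return "monitor"          # step 310 rejected the indication
    return "countermeasures"      # step 317

print(process_300_step({"edp"}, False))               # monitor
print(process_300_step({"edp", "camera"}, True))      # monitor
print(process_300_step({"edp", "camera"}, False))     # countermeasures
```

The two-sensor minimum encodes the requirement that distractedness be determined from at least two data sources.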
- the countermeasures at step 317 can include a distraction warning, e.g., an audible warning through the vehicle audio system, a light warning, or mechanical warnings.
- Mechanical warnings can include vibration warnings, e.g., in the steering wheel, in the seat, in pedals, in the driver's mobile phone, or combinations thereof.
- the mechanical warnings can vibrate vehicle components that contact the occupant or would notify the occupant.
- the driver's mobile phone may be electrically connected to the vehicle through a wired connection or wireless connection, e.g., Bluetooth, WIFI or the like.
- the countermeasures can also include secondary countermeasures, e.g., activating and/or increasing the range of the vehicle anti-collision systems or adaptive cruise control.
- the secondary countermeasures are vehicle controls and processes. Primary countermeasures are those that encourage the distracted occupant to refocus and not be distracted.
- the LIDAR/RADAR detection range can be increased as a countermeasure.
- the external camera range can be increased. Increasing the range of detection allows the systems to detect objects farther away from the vehicle and allows more time to automatically process to compensate for the distracted state of the driver.
- increasing the time buffers, which represent the distance from the vehicle to an object outside the vehicle (e.g., other vehicles on the road, road hazards, and the like), increases the distance from an object at which the vehicle can automatically activate a countermeasure or detect the object.
- increasing the time buffers also increases the time to impact at which the vehicle reacts; the vehicle will then activate collision avoidance systems sooner when the driver is distracted. After step 317 , the process can return to step 301 .
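The range and buffer widening can be sketched as follows. The base values and the 1.5x scale factors are assumed placeholders for illustration, not figures from the disclosure.

```python
# Hedged sketch of widening detection range and time buffer when the
# driver is distracted; both grow so collision avoidance activates
# farther from an object, i.e., sooner.

BASE_DETECTION_RANGE_M = 100.0  # assumed baseline LIDAR/RADAR range
BASE_TIME_BUFFER_S = 2.0        # assumed baseline time buffer

def countermeasure_settings(distracted, range_scale=1.5, buffer_scale=1.5):
    """Return (detection range in m, time buffer in s) for the current
    driver state."""
    if distracted:
        return (BASE_DETECTION_RANGE_M * range_scale,
                BASE_TIME_BUFFER_S * buffer_scale)
    return (BASE_DETECTION_RANGE_M, BASE_TIME_BUFFER_S)

print(countermeasure_settings(False))  # (100.0, 2.0)
print(countermeasure_settings(True))   # (150.0, 3.0)
```

A real system would derive the scale factors from the quantified distraction level rather than a boolean, but the direction of the adjustment matches the text.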
- FIG. 3B shows process 340 that can be implemented in the vehicle 100 to sense a distracted state of the occupant of the seat.
- the driver of the vehicle has the option to manually turn off the system for detecting distractedness. If the driver chooses to turn off the system for detecting distractedness at step 345 , the system turns OFF at step 363 .
- the process 340 can also be turned OFF when the vehicle is not moving or in park for a period of time.
- the decision to keep the system for detecting distractedness ON can be based on receiving at least two or more sensor signals that can be used to determine distractedness. In an example, at least one of the two sensor signals can be the EDP signal.
- the process moves to activating the sensor array at step 303 .
- the sensor array can be the same sensor array as described with reference to FIG. 3A and may include any sensor as described herein.
- the sensor array at step 303 outputs sensed data to the controller 102 to process the sensed data at step 351 .
- the processing at step 351 can include filtering or normalizing the raw data from the sensors in the sensors array from step 303 .
- the sensed data from step 351 is compared to thresholds that are individualized for each type of sensor data. If two or more thresholds are not met at step 355 , which indicates the person is focused, then the process returns to the sensor array sensing data at step 303 .
- the present process is based on the assumption that the occupant is not distracted. It will be understood that a similar process, which assumes the driver is distracted and the system must prove the occupant is not distracted, is within the scope of the present disclosure.
- the sensing of data at step 303 can be continuous with the return indicating the present data does not indicate distractedness. If two or more different thresholds are met, then the process moves to a false alarm determination at step 357 . If a false distractedness is determined, the vehicle may not trigger countermeasures.
- the navigation positioning (e.g., GPS) data is used to confirm the distraction determination based on the EDP signal distractedness determination.
- the vehicle speed data is used to confirm the distraction determination.
- the images from the inward camera, the outward camera or both are used to confirm the distraction determination.
- a final determination of distractedness is made, which takes into account the sensor data relied on to determine distractedness and the false distracted determination to reduce the likelihood of false distractedness determinations. If it is determined at step 359 that the person is not distracted, then the process returns to step 303 . If distractedness is determined, then the vehicle can initiate countermeasures at step 360 . After countermeasures are initiated, then process returns to a determination to keep the distraction algorithm ON at step 345 .
- the countermeasures can remain ON for a set period of time or until the system determines that the occupant is no longer distracted or when the distractedness is determined to be false. In an example embodiment, the countermeasures remain ON until the present methods and systems determine that the driver is now not distracted.
- the present distractedness determination processes 300 , 340 use data from two or more different sensors to determine distractedness.
- the EDP sensor can provide the primary data but data from other vehicle sensors can be combined to more accurately determine distractedness.
- the other sensed data can be data related to the occupant within the vehicle cabin, data from outside the vehicle cabin, or both.
- the combination of the distractedness determination based on each individual sensor can be used to reduce the likelihood of false indications of distractedness.
- the step 343 of turning ON the distraction intelligence in the vehicle proceeds to the sensors 303 when YES and to step 363 when the intelligence is turned OFF.
- the keep intelligence ON step 345 can be between the initiate countermeasures step 360 and the sensors 303 .
- FIG. 4 shows a table 400 of various sample scenarios 401 - 405 to use multiple inputs from different sources to determine the state of distractedness of a person.
- Any one sensor can be used as the primary data.
- Any other sensor can be used as secondary data to correct, to validate, or to invalidate the determination of distractedness based on the primary sensed data.
- the sensed EDP data can be the primary input for determining distractedness of a person being sensed by the sensors as described herein.
- the sensors for primary data can include vehicle speed, relative vehicle speed, positional data, navigational data, outward camera data, inward facing camera, and the like.
- the addition of the secondary data can be used to correct for false indications of driver distraction based solely on the primary data.
- the inputs for secondary data can include vehicle speed, navigational positioning information (GPS in North America), an external facing camera, an inward facing camera, possibly focused on the driver, and the like.
- the data from these devices is used in controller circuitry to determine if there is a false indication of driver distraction.
- the first sensed signals from the EDP sensor are determined to be normal, i.e. within a defined tolerance or range.
- the output from the controller circuitry will indicate that the driver is focused.
- a focused status of an occupant means the occupant is not distracted from the task of driving.
- the primary sensor system first determines that the person is not distracted.
- the secondary sensor data can be added to verify driver's state of distraction.
- the vehicle speed with respect to the surrounding traffic as a secondary input is judged to be abnormal.
- the vehicle speed from the current vehicle is known from on board sensors.
- the surrounding vehicle speeds can be determined from sonar, radar, or lidar sensors, or combinations thereof, on the vehicle.
- the surrounding vehicle speeds can also be transmitted between vehicles in a vehicle communication network.
- An additional secondary sensor signal is the navigational position data with regard to a vehicle and possible traffic congestion at the vehicle location.
- Traffic congestion can be a measure of vehicle position over time relative to the normal overall traffic flow for a particular time of day. This data can include the position of the present vehicle and combined with amalgamated data from a server with regard to traffic at that location and that time of day. Traffic congestion can also be sensed by an outward facing sensor, e.g., an imaging sensor, a camera sensor, a RADAR sensor, a laser sensor, or a sonar sensor.
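The congestion measure described above can be sketched as comparing the vehicle's observed speed over a window against the normal flow for that location and time of day, which is assumed here to come from a server. The 0.5 ratio threshold is an illustrative assumption.

```python
# Hypothetical congestion check from position-over-time samples.
# positions_m: odometer-style positions along the road (meters).
# timestamps_s: matching timestamps (seconds).
# normal_flow_mps: normal traffic-flow speed for this segment and time
# of day (assumed to be supplied by amalgamated server data).

def is_congested(positions_m, timestamps_s, normal_flow_mps):
    """True when observed speed falls well below the normal flow."""
    distance = positions_m[-1] - positions_m[0]
    elapsed = timestamps_s[-1] - timestamps_s[0]
    observed_speed = distance / elapsed
    return observed_speed < 0.5 * normal_flow_mps

# Crawling at ~5 m/s where 25 m/s is normal -> congestion.
print(is_congested([0, 50, 100], [0, 10, 20], 25.0))   # True
# Moving at ~24 m/s -> free flow.
print(is_congested([0, 240, 480], [0, 10, 20], 25.0))  # False
```

An outward-facing sensor could supply an independent congestion flag, as the text notes, and the two indications could be fused.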
- the outward sensor is a camera or imaging sensor and it does not detect traffic congestion.
- the inward imager or camera senses that the occupant is normal.
- the corrected distracted analysis based on a fusion of the primary sensed data and the secondary sensed data results in the determination from the controller circuitry that the occupant is focused, i.e. not distracted.
- a countermeasure to counteract the distractedness of the occupant is not triggered.
- the sensed EDP is normal i.e. within a defined tolerance or range and the analysis based on the primary sensor data is that the occupant is focused.
- the secondary sensors can be used to check or confirm the primary sensor analysis.
- the relative vehicle speed is abnormal.
- the navigational result is no traffic congestion.
- the outward imaging sensor shows no traffic congestion.
- the inward camera senses abnormal occupant behavior, i.e. outside of a defined tolerance or range.
- the corrected analysis of the primary sensor data combined with the secondary sensor data results in a determination the occupant is distracted.
- the system further determines that the primary sensor data was analyzed and reached a false result. This can be used to teach the algorithm that its result was incorrect.
- the algorithm in the controller circuitry for analyzing the primary sensor data can change its parameters to more closely match the corrected results from the corrected analysis.
- the system can use the corrected analysis of not focused to trigger counter measures as described herein.
- the primary sensor determines that the EDP sensed data is abnormal in the sense that the occupant is not focused or is distracted.
- the EDP sensed data can be compared to focused waveforms and unfocused waveforms in the controller circuitry. When outside the focused thresholds of variability from the focused waveform, then the controller circuitry can determine that the occupant is not focused and therefore distracted.
- the secondary sensed data from the secondary sensors can be used to correct the determination that the occupant is distracted.
- the secondary sensors provide additional data to be used with the primary abnormal determination from the primary sensor.
- the relative vehicle speed is abnormal.
- the navigational system sensor determines that there is no traffic congestion.
- the outward imager determines that there is no traffic congestion.
- the inward imager provides data showing abnormal, which indicates that the occupant is distracted.
- the primary data or analysis is combined with the secondary data (one or more inputs) and determines that the occupant of the seat is not focused on the task of driving, i.e. is distracted.
- the analysis from the first sensor input is confirmed or does not produce a false distractedness reading or alarm.
- the countermeasures in the vehicle can be triggered based on the primary analysis and the sensor fusion with the secondary sensed data.
- the countermeasures can be any countermeasure as described herein.
- the secondary sensors provide additional data to be used with the primary abnormal determination from the primary sensor.
- the relative vehicle speed is normal.
- the navigational system sensor determines that there is traffic congestion.
- the outward imager determines that there is traffic congestion.
- the inward imager provides data showing normal, which indicates that the occupant is focused.
- the primary data or analysis is combined with the secondary data and determines that the occupant of the seat is focused.
- the analysis from the first sensor input is determined to be a false reading that the driver is distracted.
- the secondary sensor data corrects the incorrect or false alarm from the primary sensor analysis.
- the countermeasures in the vehicle are not triggered.
- the secondary sensors provide additional data to be used with the primary abnormal determination from the primary sensor.
- the relative vehicle speed is normal.
- the navigational system sensor determines that there is traffic congestion.
- the outward imager determines that there is traffic congestion.
- the inward imager provides data showing abnormal, which indicates that the occupant is distracted.
- the primary data or analysis is combined with the secondary data and determines that the occupant of the seat is not focused, i.e. is distracted.
- the analysis from the first sensor input is confirmed or does not produce a false distractedness reading or alarm.
- the countermeasures in the vehicle can be triggered based on the primary analysis and the sensor fusion with the secondary sensed data.
- the countermeasures can be any countermeasure as described herein.
- the scenarios 401 - 405 represent an example of circuitry applying an algorithm to the sensed data to output a resulting signal to trigger an alert, vehicle control, or vehicle countermeasure in an attempt to lessen the effects of the distracted driving.
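The correction logic running through scenarios 401 - 405 can be sketched as one fusion function. The specific voting rule below (in-cabin behavior dominates; otherwise an abnormal EDP reading must be corroborated by slowing that congestion cannot explain) is an illustrative assumption that happens to reproduce the five scenarios, not the claimed algorithm.

```python
# Sketch of the table-400 fusion: the primary (EDP) result is corrected
# by secondary inputs, and the corrected state decides countermeasures.

def corrected_state(edp_abnormal, relative_speed_abnormal,
                    nav_congestion, camera_congestion, inward_abnormal):
    """Return (corrected distracted?, primary result was false?)."""
    # Slowing relative to traffic that congestion cannot explain.
    unexplained_speed = relative_speed_abnormal and not (
        nav_congestion or camera_congestion)
    distracted = inward_abnormal or (edp_abnormal and unexplained_speed)
    false_result = edp_abnormal != distracted
    return distracted, false_result

# Like scenario 402: normal EDP but abnormal in-cabin behavior ->
# distracted, and the primary "focused" reading is flagged as false.
print(corrected_state(False, True, False, False, True))   # (True, True)
# Like scenario 404: abnormal EDP, but congestion explains the slowing
# and cabin behavior is normal -> focused, primary was a false alarm.
print(corrected_state(True, False, True, True, False))    # (False, True)
```

The second return value is what the text describes feeding back to the primary-sensor algorithm to teach it that its result was incorrect.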
- Circuitry may exploit artificial intelligence and neural networks to determine distractedness and/or false distractedness.
- the criteria from the secondary sensors can be used in some example embodiments to indicate distractedness.
- the secondary sensors can detect secondary criteria related to the occupant or driver within the vehicle cabin.
- the secondary criteria can also include sensed data related to the environment outside the vehicle. The sensed data can be used to derive the secondary criteria. Relative relationships between criteria sensed by the secondary sensors can be determined and when these relationships indicate distractedness, the vehicle can warn the occupant or alter operation of the vehicle.
- the vehicle speed is sensed and used as a first criterion for determining distractedness. This criterion is controlled by the driver.
- the first criterion can include sensed signals related to the driver, e.g., brain waves, HR, HRV, eye movement, body position and movement, and the like. These are directly controlled by the driver or produced by the driver's body.
- the second criterion can be traffic congestion, other vehicles speeds adjacent the vehicle, or a combination of both.
- the second criterion is not controlled by the driver.
- the second criterion may relate to vehicle or outside the vehicle data that are not under the control or produced by the driver.
- the relationship between the first and second criteria can indicate distractedness. For example, when the first criterion changes in a known manner and the second criterion also changes in a known manner, this indicates distractedness.
- a distracted driver may slow his/her vehicle (i.e., first criterion) when surrounding traffic does not slow (i.e., second criterion).
- An additional first criterion can also be used to confirm distractedness, e.g., a slower and deeper breathing pattern, a change in posture, a slower heart rate, etc.
- a distracted driver may slow the vehicle (i.e., first criterion) and there is no slowing due to traffic congestion, other obstacle, traffic light, and the like (i.e., second criterion). This indicates possible distractedness of a driver.
- the controller can also take into account the operational status of the vehicle. If the vehicle is experiencing some type of operational failure, then the driver may not be distracted.
- the controller compares the first criterion, e.g., vehicle speed, relative to the second criterion. When the vehicle speed slows down relative to the second criterion, the controller will indicate distractedness of the driver.
- the second criterion can also be relative vehicle behavior.
- the first criterion can be changing vehicles speeds at a first rate.
- the second criterion can be the rate at which other vehicles change speed relative to the rate at which the sensed driver changes the speed of his or her vehicle. If the present vehicle changes speed more often than the other vehicles, this indicates distractedness.
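This relative-behavior criterion can be sketched by counting speed changes in the ego vehicle versus the surrounding traffic over the same window. The 1 m/s change threshold and the 1.5x "more often" margin are assumed placeholders.

```python
# Sketch: is the ego vehicle changing speed more often than traffic?

def speed_change_count(speeds, min_delta=1.0):
    """Count sample-to-sample speed changes larger than min_delta (m/s)."""
    return sum(1 for a, b in zip(speeds, speeds[1:])
               if abs(b - a) > min_delta)

def erratic_relative_to_traffic(ego_speeds, other_speeds, margin=1.5):
    """True when the ego vehicle changes speed notably more often than
    the surrounding traffic over the same window."""
    ego = speed_change_count(ego_speeds)
    other = speed_change_count(other_speeds)
    return ego > margin * max(other, 1)

ego = [30, 25, 31, 24, 32, 25]      # repeated surges and slowdowns
traffic = [30, 30, 29, 30, 30, 29]  # steady surrounding flow
print(erratic_relative_to_traffic(ego, traffic))  # True
```

The surrounding-vehicle speeds could come from sonar, radar, lidar, or a vehicle communication network, per the earlier description.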
- the throttle position can be sensed using a throttle position sensor.
- the throttle position can be used as part of the first criterion, a supplement to vehicle speed or a replacement for vehicle speed.
- the driver controls the position of the throttle, with the cruise control or adaptive cruise control off, which in turn controls vehicle speed.
- the system may utilize adaptive tolerancing of sensing system thresholds for both the occupant and outward environmental sensing technology to improve situational distraction classification and countermeasure readiness.
- the external system thresholds may be influenced by the internal system indications and, equally, the internal system thresholds may be influenced by the external system indications.
- the cognitive attention measurement via the sensed electrical brain activity may lower its distraction indication threshold if metrics such as vehicle speed, weather, and/or lidar/radar systems indicate that the surrounding environment contains certain conditions (e.g., proximity to external objects, speed of the vehicle, wet or icy road conditions) that require higher alertness than normal operating conditions, where non-distracted but also non-heightened attention levels would not be concerning.
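The adaptive tolerancing can be sketched as environmental conditions lowering the distraction-indication threshold. The condition set, the arbitrary units, and the per-condition reductions are illustrative assumptions.

```python
# Hedged sketch of adaptive tolerancing: environmental indications
# lower the cognitive-attention distraction threshold so an indication
# fires earlier when the situation demands higher alertness.

BASE_ATTENTION_THRESHOLD = 100  # arbitrary units (assumed)

def adapted_threshold(speed_kph, wet_or_icy, close_proximity):
    """Lower the threshold as the environment demands higher alertness;
    the 20-unit reductions and the floor of 20 are placeholders."""
    threshold = BASE_ATTENTION_THRESHOLD
    if speed_kph > 100:
        threshold -= 20   # high speed demands quicker indication
    if wet_or_icy:
        threshold -= 20   # low-grip road conditions
    if close_proximity:
        threshold -= 20   # objects close to the vehicle
    return max(threshold, 20)

print(adapted_threshold(80, False, False))  # 100 (normal conditions)
print(adapted_threshold(120, True, True))   # 40 (heightened alertness)
```

The same pattern can run in the opposite direction, with internal indications tightening the external-system thresholds, as the surrounding text describes.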
- the internal eye and skeletal tracking sensing, autonomic nervous system monitoring, as well as the external systems and all other monitors, can have multiple threshold levels both in the magnitude of their metrics (e.g., eye open area, degree of turn, HR/BR, distance, speed) and in temporal resolution (e.g., seconds per degree of turn allowed of the head).
- eye or skeletal tracking may indicate various degrees of turn away from the optimal viewing window of attention for various amounts of time. In situations requiring higher levels of alertness, both the maximum degree of turn away from that plane and the time allowed at each degree of turn may be lowered before an indication is given and/or a countermeasure is readied and perhaps activated.
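A sketch of this degree-of-turn and hold-time tightening; the allowance values, and the linear scaling with an assumed `alertness_level` input, are illustrative:

```python
def gaze_turn_allowance(alertness_level):
    """Return (max_turn_deg, allowed_seconds_per_degree) for a required
    alertness level (0 = normal, higher = more demanding conditions).
    The numbers are placeholders, not values from the disclosure."""
    max_turn = max(60 - 15 * alertness_level, 15)
    sec_per_deg = max(0.2 - 0.05 * alertness_level, 0.05)
    return max_turn, sec_per_deg

def turn_indicates_distraction(turn_deg, held_seconds, alertness_level=0):
    """Indicate distraction when the turn away from the optimal viewing
    window exceeds the allowed angle, or is held longer than the
    per-degree time budget permits."""
    max_turn, sec_per_deg = gaze_turn_allowance(alertness_level)
    if turn_deg > max_turn:
        return True
    return held_seconds > sec_per_deg * turn_deg
```

Raising `alertness_level` shrinks both the maximum angle and the time budget, matching the tightened thresholds described above.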
- Long-term data related to detected distraction can be processed secondarily to the real-time algorithms to provide a variety of statistical information for both the occupant and machine learning systems.
- the long-term data may be stored in the vehicle or off-vehicle on a remote server.
- the vehicle may include electronic communication to an external server, e.g., over WIFI, mobile communication networks, such as cellular communications, and the like.
- the long-term distraction calculations may be used to alter the instructions for determining distraction or for mitigating false positives.
- the present disclosure quantifies the distraction/concentration status of the driver while correcting for false indications of distraction.
- the vehicle can use the distraction/concentration status of the driver to manipulate reaction times of various vehicle safety systems, e.g., the adaptive braking system, to optimize the response of the system itself. This may reduce the risk of forward collisions.
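One way to picture this reaction-time manipulation: scale the adaptive-braking trigger headway with a normalized distraction score. The function name and the 1.5-second maximum advance are assumptions for illustration:

```python
def braking_trigger_headway(base_headway_s, distraction_score):
    """Advance the adaptive-braking trigger headway in proportion to the
    quantified distraction/concentration status (0.0 = focused,
    1.0 = fully distracted); a more distracted driver gets an earlier
    intervention. The 1.5 s ceiling is illustrative."""
    extra_s = 1.5 * distraction_score
    return base_headway_s + extra_s
```

A focused driver keeps the baseline headway; a distracted one causes the system to brake earlier, which is how the forward-collision risk reduction would be realized.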
- the present system can be used in an autonomous vehicle, e.g., a Level 1 or Level 2 automobile, where the vehicle uses the level of distraction, a determination of distractedness, or the multiple-sensor determination of a distracted driver to judge the most appropriate time to switch from manual to autonomous drive and vice-versa, or to engage certain levels of countermeasures.
- This system is beneficial to all modes of transportation, extending even beyond automotive and personal vehicles.
- the present disclosure illustrates a controller 102. It is within the scope of the present disclosure for the controller 102 to represent multiple processors, memories, and electronic control units, which can work independently with various systems to effect the functions and tasks described herein. The vehicle may use a more distributed controller system than a single controller and remain within the scope of the present disclosure.
- the controller 102 includes circuitry to process sensed signals that represent real world conditions and data.
- the present disclosure describes the sensed EDP data to be the primary data and other data related to the person or the vehicle to be secondary data.
- some embodiments of the present disclosure use the sensed EDP as the secondary data to correct for false determinations of the distractedness based on the other non-EDP data.
- the internal camera and the operation of the vehicle, e.g., drifting or crossing lines in the street, can be used to determine distractedness.
- the EDP signal can be used to validate any determination of distractedness.
- a further secondary data can be time, e.g., time of day and time of vehicle usage.
- the vehicle may input the time of day that the vehicle is being driven as a secondary input to prevent false determinations of distractedness or alter the levels of thresholds in the cameras for determining distractedness. When the time of day is night, then the thresholds of distractedness may be lowered.
- the vehicle may track the usual time that the vehicle is driven in a given time period. When the vehicle is operated outside the usual time periods, then there may be greater likelihood of distracted driving.
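A sketch of time as a secondary input; the night window, the multipliers, and the min/max derivation of the usual driving window are illustrative assumptions:

```python
def usual_driving_window(history_hours):
    """Derive the usual driving window from logged start hours (0-23)."""
    return min(history_hours), max(history_hours)

def time_secondary_risk(hour, history_hours, night=(22, 5)):
    """Return a multiplier applied to distractedness thresholds: below
    1.0 the thresholds are lowered, i.e., distraction is flagged sooner
    at night or outside the driver's usual hours."""
    factor = 1.0
    start_night, end_night = night
    if hour >= start_night or hour < end_night:
        factor *= 0.85          # night driving: lower the thresholds
    lo, hi = usual_driving_window(history_hours)
    if not (lo <= hour <= hi):
        factor *= 0.90          # unusual operating time for this vehicle
    return factor
```

Applying the multiplier to, e.g., camera thresholds makes the system more sensitive exactly when the secondary time data suggests a greater likelihood of distracted driving.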
- EEG electroencephalography
- EEG electrophysiological monitoring method to record electrical activity of the brain. It is typically noninvasive, with the electrodes placed along the scalp, although invasive electrodes are sometimes used in specific applications. EEG measures voltage fluctuations resulting from ionic current within the neurons of the brain. In clinical contexts, EEG refers to the recording of the brain's spontaneous electrical activity over a period of time, as recorded from multiple electrodes placed on the scalp. Diagnostic applications generally focus on the spectral content of EEG, that is, the type of neural oscillations that can be observed in EEG signals.
Description
- The present disclosure relates to systems with integrated sensors to provide sensed information about a person's distracted state.
- It is advantageous to be able to detect a person's focus and attention. For instance, driving of a motor vehicle while distracted, which is a type of driver error, is a significant cause of preventable road accidents. Vehicle systems that assist in warning a driver of distracted driving or take action in such an occurrence may reduce the number of such accidents or attempt to mitigate damage caused by driver distractedness.
- Systems and methods for detecting distractedness or lack of focus are described. The system may include an electro-dermal potential (EDP) sensing system at least partially integrated into the vehicle cabin, which can include a singular or combined configuration involving the vehicle seat, headliner, structural pillars, instrument panels, and steering wheel, to sense a person and configured to output an electro-dermal potential signal; at least one additional sensor to sense additional data that can be used to determine distractedness; and a controller to receive the electro-dermal potential signal from the electro-dermal potential sensing system and the vehicle sensor and to determine a distraction state of the person using both the electro-dermal potential signal and the vehicle-related data, reducing the likelihood of a false distraction state relative to using only one of the vehicle-related data or the electro-dermal potential signal; the controller is to output a control signal when the distraction state is identified or exceeds a distraction threshold. In an example embodiment, the system can also determine when there is or is not a false distraction.
- In an example embodiment, the controller is configured to determine a false distraction state based on both the vehicle-related data and the electro-dermal potential signal.
- In an example embodiment, the control signal is to adjust operation of an adaptive braking system in the vehicle.
- In an example embodiment, the system includes a seat configured to support the person as an occupant and to be mounted in a vehicle; and wherein the electro-dermal potential sensing system includes a contactless sensor mounted in the seat adjacent a head of the occupant.
- A vehicle seating system is described with sensors to sense the distraction of a driver or occupant of the vehicle who may be seated in a vehicle seat. The seat may be configured to support an occupant and to be mounted in a vehicle. An electro-dermal potential sensing system is at least partially integrated into the seat to sense physiological properties of an occupant, e.g., a driver, and configured to output an electro-dermal potential signal. The sensed physiological properties of the occupant can include brain cortical activity. A controller is positioned in the vehicle to receive the electro-dermal potential signal from the electro-dermal potential sensing system to determine a distraction state of the driver. The controller also determines a false distraction state using the distraction state and other sensor signals in a vehicle; the controller is to output a control signal when the distraction state exceeds a distraction threshold and when a false distraction is not determined.
- In an example embodiment, the control signal is to adjust operation of a collision avoidance system or an adaptive braking system in the vehicle.
- In an example embodiment, the electro-dermal potential system includes a plurality of contactless sensors mounted in the vehicle cabin.
- In an example embodiment, the seat includes a head rest. The plurality of contactless sensors includes one or more headrest sensors mounted in the headrest to measure electro-dermal potential at a head of the driver.
- In an example embodiment, the seat includes a driver warning device to indicate to the driver that the control signal is output from the controller.
- In an example embodiment, the controller measures driver distraction based on individual frequency components, or ratios thereof, in the electro-dermal potential signal.
- In an example embodiment, the controller uses the electro-dermal potential signal as an input to determine driver distraction and when distraction is detected outputs the control signal to increase a time to impact variable in an object avoidance calculation.
- In an example embodiment, the sensor signals include a video output from a cabin camera to detect the driver. The controller can use the video output and the electro-dermal potential signal to determine the distraction state of the driver.
- In an example embodiment, the sensor signals include a navigational position signal from a navigational position sensor to detect position of the vehicle. The controller can use the navigational position signal and the electro-dermal potential signal to determine if there is a false distraction state of the driver.
- In an example embodiment, the sensor signals include an external camera signal from an outward facing imager to produce video external to the vehicle. The controller can use the external camera signal and the electro-dermal potential signal to determine the false distraction state of the driver.
- In an example embodiment, the sensor signals include an internal video signal, an external camera signal, a navigational position signal, and a vehicle speed signal. The controller can use the internal video signal, the external camera signal, the navigational position signal, the vehicle speed signal and the electro-dermal potential signal to determine a possible false distraction state of the driver or to correct the distraction state by any number of countermeasures.
- A vehicle system is described that includes a vehicle safety sensor system configured to sense external objects around the vehicle and output an external sensor signal. The vehicle system may also include a seat configured to support an occupant and to be mounted in a vehicle and an electro-dermal potential system at least partially integrated into the seat and configured to output an electro-dermal potential signal. The electro-dermal potential system includes a plurality of contactless sensors mounted in the vehicle. A controller is to receive the electro-dermal potential signal from the electro-dermal potential system and the external sensor signal and to output a control signal, using the electro-dermal potential signal and the external sensor signal, to adjust operation of the vehicle safety sensor system in the vehicle.
- In an example embodiment, the vehicle safety sensor system includes a detection and ranging system with a range setting to sense objects outside including a position and a range of an external object, e.g., a natural obstacle, another vehicle, an animal, or a person, and the external sensor signal includes the position and range of the external object.
- In an example embodiment, the controller outputs a range extension signal when the controller determines that the driver is distracted or lacking in focus on driving a vehicle using the electro-dermal potential signal, and wherein the vehicle safety system extends the range setting when the controller outputs the range extension signal.
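The range-extension behavior might be sketched as below; the function name and the 1.25x extension factor are assumed illustrative values:

```python
def detection_range(base_range_m, range_extension_signal, extension_factor=1.25):
    """Extend the detection-and-ranging range setting while the
    controller asserts the range extension signal (i.e., while the
    driver is determined to be distracted or lacking focus)."""
    if range_extension_signal:
        return base_range_m * extension_factor
    return base_range_m
```

Extending the range lets the safety system see objects earlier, compensating for the distracted driver's slower reactions.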
- In an example embodiment, the vehicle safety sensor system includes a light sensor, a LIDAR, a camera, or combinations thereof.
- In an example embodiment, the vehicle safety sensor system includes a radio frequency sensor, RADAR or both.
- In an example embodiment, the vehicle system includes a collision avoidance system having a trigger time based on the control signal from the controller. The collision avoidance system is configured to trigger an avoidance action based on the trigger time.
- In an example embodiment, the collision avoidance system has a first trigger time when distraction is not detected and a second trigger time when distraction is detected. The second trigger time being less than the first trigger time.
- A vehicle system is described that uses at least two sensors sensing two different criteria which, when processed by a controller, produce an indication of distractedness or focus of the occupant or driver. In an example, a first sensor senses a first criterion relating to distracted driving and controlled by the driver. In an example, a second sensor senses a second criterion relating to distracted driving and representing an environmental condition not controlled by the driver. A controller receives the first criterion and the second criterion and determines a relative relationship between the first criterion and the second criterion, with the relative relationship exceeding a distractedness threshold to indicate distracted driving.
- In an example, the first criterion is vehicle speed and the second criterion is traffic congestion, other vehicles speeds adjacent the vehicle, or a combination of both. In an example, the controller compares the vehicle speed relative to the second criterion, and when the vehicle speed slows relative to the second criterion, the controller will indicate distractedness of the driver.
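A sketch of this speed-versus-surrounding-traffic comparison; the function name and the 0.85 slowdown margin are assumptions for illustration:

```python
def relative_speed_indicates_distraction(own_speed, adjacent_speeds, margin=0.85):
    """First criterion: the present vehicle's speed (driver-controlled).
    Second criterion: speeds of adjacent vehicles (not driver-controlled).
    Indicate distractedness when the present vehicle has slowed beyond a
    margin relative to the average of surrounding traffic."""
    avg_adjacent = sum(adjacent_speeds) / len(adjacent_speeds)
    return own_speed < margin * avg_adjacent
```

The adjacent-vehicle speeds would come from the external RADAR/LIDAR tracks or from navigational traffic data.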
- Any of the above examples may be combined with each other to form additional embodiments of the present disclosure.
-
FIG. 1 is a schematic view of a vehicle according to an example embodiment. -
FIG. 2 is a schematic view of a vehicle seat with sensors therein according to an example embodiment. -
FIG. 3A is a functional block diagram of a vehicle system according to an example embodiment. -
FIG. 3B is a functional block diagram of a vehicle system according to an example embodiment. -
FIG. 4 is a chart of false distraction detection according to an example embodiment. - As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.
- The present disclosure is generally directed to vehicle mounted sensors that can be embedded at least partially in the vehicle cabin or in any part of the foam, trim, headrest, frame or a combination thereof of a vehicle seat. The sensors can also be positioned in the headliner, the instrument panel, structural pillars, the steering wheel, or combinations thereof. At least one of the sensors determines the electro-dermal potential originating primarily from brain cortical activity. Such EDP sensing can be contact or non-contact (e.g., field sensing) and can also sense muscle activity and skin characteristics. This will reveal high-level central nervous system (CNS) functions such as distraction or drowsiness. The systems described herein employ real-time processing of the electrical potential fluctuations, e.g., comparing various frequency bands of the sensed signal with respect to each other. These can act as the primary brain activity quantitative classifiers. The present systems may use the sensed signals along with other sensor information to determine false positives of distraction based on the sensed EDP signal. This system, through the acquisition of the appropriate physiological metrics and use of a software algorithm, can determine if the occupant is distracted and not attentive to the road task of the moment while correcting for false positive indications of distraction.
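The frequency-band comparison used as a brain activity classifier can be sketched with the sub-band divisions given later in the description (roughly 0-4, 4-7, 7-14, 14-30, and 30-100 Hz); the function names and the neutral band labels are hypothetical:

```python
# Approximate sub-band edges in Hz, per the sub-signal ranges described
# in the process discussion; boundaries may overlap in practice.
EDP_SUB_BANDS = {
    "band1": (0.0, 4.0),
    "band2": (4.0, 7.0),
    "band3": (7.0, 14.0),
    "band4": (14.0, 30.0),
    "band5": (30.0, 100.0),
}

def band_powers(freq_power_pairs):
    """Accumulate spectral power into sub-bands from (frequency_hz,
    power) pairs, e.g., taken from an FFT of the filtered EDP signal."""
    powers = {name: 0.0 for name in EDP_SUB_BANDS}
    for freq, power in freq_power_pairs:
        for name, (lo, hi) in EDP_SUB_BANDS.items():
            if lo <= freq < hi:
                powers[name] += power
    return powers

def band_ratio(powers, numerator, denominator):
    """Ratio of two sub-band powers: one candidate quantitative
    classifier for comparing frequency bands against each other."""
    if not powers[denominator]:
        return 0.0
    return powers[numerator] / powers[denominator]
```

A real implementation would compute the spectrum over sliding windows and compare the resulting ratios, or their patterns over time, against the distraction thresholds.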
- A distractedness sensing system can be integrated with the seat, including one or more sensors embedded in any part of the seat, e.g., the foam, the trim, the headrest, or a combination thereof. The contactless EDP sensing system can be supplemented by appropriate physiological metrics [heart rate, heart rate variability (HRV), Cardiorespiratory Coupling/Synchrogram (CRS), breathing rate, EDP pattern shift, and the like, for both standard and complex non-linear dynamics] of the seat occupant, e.g., the driver. A controller can receive the signals for the sensed physiological metrics and determine if the occupant is distracted and therefore whether attention and reaction time are affected. The controller can be adapted to individual occupants using an automated user-specific calibration.
- This system can also include cameras strategically positioned to look at the driver. Inward cameras can be used in conjunction with the seat sensors to achieve sensor fusion and increase the specificity and accuracy of the distraction level detection. The camera generates multiple images of the occupant, which can be analyzed to determine additional occupant metrics. The metrics can include head position, blink rate, pupil dilation, eye position, fixation, gaze patterns, eyelid closure, head movement, facial expression, overall skeletal position, and the like. The camera system takes an image, and image processing circuitry analyzes the image to determine the image metric.
- The use of various metrics from different sources provides an objective quantification of distraction of the occupant. The distraction quantification can be combined with other data in the vehicle to prevent false indications of distraction, e.g., vehicle performance, driving environment, and the like. If the distraction quantification level exceeds a distraction threshold, then the vehicle may automatically trigger countermeasures, e.g., alerts, alarms, collision avoidance, and the like. If the distraction status of the driver is quantified, the vehicle can change reaction times of the collision avoidance system, e.g., the adaptive braking system, to optimize the response of the system itself in view of the driver condition as at least partly determined by the distraction level.
- A vehicle system is described that uses at least two sensors sensing two different criteria which, when processed by a controller, produce an indication of distractedness or focus of the occupant or driver. In an example, a first sensor senses a first criterion relating to distracted driving and controlled by the driver. In an example, a second sensor senses a second criterion relating to distracted driving and representing an environmental condition not controlled by the driver. A controller receives the first criterion and the second criterion and determines a relative relationship between the first criterion and the second criterion, with the relative relationship exceeding a distractedness threshold to indicate distracted driving.
-
FIG. 1 shows a vehicle 100 including a cabin 115 and an engine bay 116, which can be forward of the cabin 115. The engine bay 116 houses a motor 101 that provides motive power to the vehicle. A controller 102 includes an electrical signal processor adapted to execute tasks, which can be stored in a memory. The tasks can process sensed signals according to rules loaded into the controller 102. The sensed data can be stored in memory associated with the controller 102. -
Visual systems 103 are provided to receive instructions from the controller 102 and produce visual displays in the vehicle, e.g., in the cabin on display screens, the dashboard, or a mobile electronic device associated with the vehicle. The displays produced by the visual systems can be images sensed by an internal camera 104, an external camera 105, collision warnings, distraction warnings, and the like. The visual system 103 can process the image data from the cameras 104, 105 for the controller 102. The visual system 103 can process images to identify objects and the position of the driver in an example embodiment. This data can be provided to the controller 102. - An
audio system 106 can be part of a head unit in the vehicle. The head unit can be an electronic processor to process audio signals or sensed signals in the vehicle. The audio system 106 can sense audio in the cabin 115 and output audio into the cabin, e.g., using multiple speakers. The audio output from the audio system 106 can be warnings as described herein based on instructions from the controller 102. The audio warnings can be spoken words or tones to indicate driver distraction, a change in settings, imminent danger, activation of the collision warning system, or combinations thereof. - A
vehicle speed sensor 107 is provided to detect the speed of the vehicle and provide a speed signal to the controller 102. The vehicle speed sensor can include the throttle position sensor. - A
navigational position system 108 detects the position of the vehicle by receipt of satellite signals or ground-based position signals. The navigational position system 108 can include a global navigation satellite system (GNSS) such as the Global Positioning System (GPS), Beidou, COMPASS, Galileo, GLONASS, the Indian Regional Navigational Satellite System (IRNSS), or QZSS. The navigational system can include a receiver that receives differential correction signals in North America from the FAA's WAAS system. The navigational position system 108 provides an accurate position of the vehicle to the controller 102. - A
distraction alarm 109 is positioned in the cabin 115. The distraction alarm 109 can include mechanical alarms like vibration devices that can be positioned in the steering wheel or the seat. The distraction alarm 109 can be a signal to vibrate a mobile electronic device associated with the vehicle and a passenger in the vehicle. - A
vehicle seat 110 is positioned in the cabin 115 and is configured to support a person, e.g., a driver or a passenger. The seat 110 can include a plurality of sensors 150, 155, 156. The sensors 150 can be contactless and can sense EDP adjacent the head of the seated person. The sensors 156 can contact the occupant. - A
brake system 111 is provided to brake the wheels of the vehicle. The brake system 111 can be activated by the driver and can also be activated automatically by the controller 102, e.g., when distracted driving is detected, a crash is detected as imminent, or an imminent danger is detected as described herein. - A
laser sensing system 112, e.g., a LIDAR, is provided. The laser sensing system 112 emits light in pulses and detects the light returned after the light reflects off objects external to the vehicle 100. The laser sensing system 112 can produce a digital three-dimensional representation of the external environment around the vehicle in the direction of the light pulses. The laser sensing system 112 can perform laser scanning to produce a representation around the vehicle. The external environment can include other vehicles, signs, animals, people, and other objects. The representation or individually identified objects can be provided to the controller 102 for use in the vehicle as described herein. - A
RADAR sensing system 113 is provided in the vehicle. The RADAR sensing system 113 emits radio frequency energy pulses and detects the returned pulses to identify objects around the vehicle or map the external environment. The representation or individually identified objects can be provided to the controller 102 for use in the vehicle as described herein. - Other typical vehicle systems may be included in the
vehicle 100 but are not illustrated for clarity of the drawings. The controller 102 may provide inputs to these other systems. -
FIG. 2 shows the vehicle seat 110 configured to be fixed in a cabin of a motor vehicle. The seat 110 is adapted to support a person on a base 201 in an upright position against a seat back 202. The base 201 is fixed to the floor in the vehicle cabin, e.g., by rails. A headrestraint 203 may be positioned at the top of the seat back and act as a headrest. Each of the base 201, seat back 202, and headrestraint 203 includes a rigid frame, comfort layers on the frame, and an external covering. A plurality of sensors 150, 155, 156 may be positioned in the seat. The first sensors 150 may be positioned in the headrestraint 203 and adapted to sense EDP signals from the occupant of the seat 110. A plurality of second sensors 155 may be positioned in the seat back 202. The plurality of second sensors 155 may also sense EDP signals from the seated occupant. One or more third sensors 156 are positioned in the seat base 201. The third sensors 156 may also sense EDP signals. The plurality of second sensors 155 may include at least one sensor that does not sense EDP signals and may, e.g., sense the presence of a person in the seat using sensors in the seat back and sense the weight of the occupant using sensors in the seat base. The sensors 150 develop raw EDP signals, which are filtered to produce analysis signals including frequency components relevant to the EDP of the person in the seat while attenuating unrelated frequency components. - In another aspect, a method is provided for monitoring a mental state of a person having a body including a head positioned at the headrestraint adjacent sensors in the headrestraint.
The method also includes positioning a sensor at least proximate to portions of the skin of the body below the head to develop raw signals, and processing the raw signals to produce at least one bandpass-filtered state-indicating signal representative of raw signal magnitude within a predetermined frequency range as an indication of the mental state (e.g., distracted state) of the person.
- At least one
sensor 150 is positioned at the posterior of the head, near or at the occipital-visual cortical region. This may assist in accurately measuring brain waves, e.g., through EDP. As driving is a visually dominant cognitive task, the ability to detect processing in that anatomical area of the brain (e.g., the visual cortex), as well as in other processing and cognitive networks of mental processing, offers the ability to monitor visual attention level specifically. For example, visual habituation is the brain's ability to decrease its response to repeated stimuli once the information has been processed and is no longer perceived as a relevant processing demand. In addition to generally low visual attention, the occupant should not experience significant habituation patterns, as the visual scenery, though mundane at times, is in continuous variation and the conditions demand attention in such areas. Lack of activity related to visual processing, or habituation of visual stimuli, can serve as a subset classification of potential distraction in addition to other brain wave responses and secondary monitoring systems. -
FIG. 3A shows a schematic view of a process 300 that can be implemented to determine distractedness using sensors, e.g., in a vehicle 100. At 301, the controller monitors sensed data from an array of sensors 303 associated with the vehicle. The sensors can be any of the sensors described herein. Examples of sensors in the sensor array 303 include the EDP sensor, internal and external imagers, laser-based sensors, seat sensors, and the like. The sensor array can include up to N sensors, where N is any positive integer. - The
sensor array 303 can monitor a driver or an occupant of the vehicle seat. The monitoring can include EDP sensing using the contactless sensors 150. The EDP signals are used to detect a distraction state of the driver. The EDP signals can be separated into various sub-signals, e.g., at different frequencies, by using filters to allow certain divisions into sub-bands. These sub-bands may overlap in frequency ranges. A general range of frequencies for each sub-band can be defined within a reasonable variance. A first sub-signal can be up to four hertz. A second sub-signal can be four hertz to seven hertz. A third sub-signal can be seven hertz to fourteen hertz. A fourth sub-signal can be fourteen hertz to about thirty hertz. A fifth sub-signal can be about thirty hertz to about one hundred hertz. Other sub-signals may overlap these ranges for the first through fifth sub-signals, e.g., from eight hertz to thirteen hertz. The relationships between these sub-signals can be used to determine whether the driver is distracted from the task of driving. The patterns of the sub-signals or the ratios of multiple sub-signals to each other can be used to determine if a distraction is occurring. - The
sensor array 303 can include a vehicle cabin imager, e.g., a camera, that is used to detect the driver in the vehicle seat. The camera data is used to detect a distraction pattern in the driver. The camera can detect movement or lack of movement of the driver, facial features of the driver or both. The camera data can be video signals sent to a data processor in the controller to determine if the driver matches a distraction pattern. The data processor can determine if the actions of the driver or the image of the driver matches a known distraction pattern. Examples of distraction patterns can include head position or eye position not directed forward in a manner of a driver looking out the windshield. - The
sensor array 303 can include external imagers, cameras, LIDAR (Light Detection and Ranging), RADAR (RAdio Detection and Ranging), and SONAR (SOund Navigation and Ranging). These systems can detect objects around the vehicle, e.g., other vehicles, stop signs, stop lights, and the like. The imagers can detect the color and shape of external objects. LIDAR and RADAR can detect the size and position of external objects relative to the vehicle, which may be in motion. - At 304, the sensed data from each of the individual sensors are compared to thresholds for sensed data of the respective sensors. The image data from an imager is compared to image data, e.g., the change in pixels can be used to indicate that a threshold in the image data has been exceeded. The EDP data can be compared at multiple frequencies to determine if an EDP signal, e.g., a brain wave, exceeds an EDP threshold to indicate distractedness. The EDP data can also be compared to EDP patterns over a time period with these patterns being indicative of a person who is focused or a person who is distracted. A LIDAR, RADAR, or sonar sensor can detect the relative position of the vehicle compared to objects or other vehicles. A navigational sensor can be used to determine the location of the vehicle and provide speed data for the present vehicle and other vehicles around the present vehicle being operated by the person for whom distractedness is being determined. Seat sensors can determine the position of the person, metabolic and physiological parameters, biometric parameters, EDP, and other data related to the person. Each of the sensed data can be compared to thresholds that are stored in memory in the vehicle. The memory is associated with the vehicle controller.
- At 305, it is determined whether two or more of the sensed data exceed a distractedness threshold. The
process 300 relies on distractedness being determined based on data from at least two sensors. The primary sensor can be the EDP sensor, which can be weighted more heavily in the process to control the vehicle. If only one sensor indicates distractedness, then the process moves to step 306 and returns to monitoring at step 301. If two or more sensors indicate distractedness via triggered thresholds, then it is determined whether the determination of distractedness is false at step 310. In an example, the EDP data from the EDP sensor by itself may falsely indicate distractedness. The other data can be combined with the EDP data or sensor results to determine if the EDP is falsely indicating distractedness or that the person is focused on a different task. An example of a process for determining falseness is described in greater detail with regard to FIG. 4. - At
step 311, the combined data from the sensor array determines whether the driver is distracted by using the results from two or more sensors or using all of the results from each sensor. If distractedness is determined at step 315, then the vehicle can initiate countermeasures at step 317. If distractedness is not found, i.e., the person is focused, then the process moves to step 313 and returns to the controller monitoring the sensors at step 301. The countermeasures at step 317 can include a distraction warning, e.g., an audible warning through the vehicle audio system, a light warning, or mechanical warnings. Mechanical warnings can include vibration warnings, e.g., in the steering wheel, in the seat, in pedals, in the driver's mobile phone, or combinations thereof. The mechanical warnings can vibrate vehicle components that contact the occupant or would notify the occupant. The driver's mobile phone may be electrically connected to the vehicle through a wired connection or wireless connection, e.g., Bluetooth, WIFI, or the like. The countermeasures can also include secondary countermeasures, e.g., activating and/or increasing the range of the vehicle anti-collision systems or adaptive cruise control. The secondary countermeasures are vehicle controls and processes. Primary countermeasures are those that encourage the distracted occupant to refocus and not be distracted. - In an example embodiment, the LIDAR/RADAR detection range can be increased as a countermeasure. The external camera range can be increased. Increasing the range of detection allows the systems to detect objects farther away from the vehicle and allows more time for automatic processing to compensate for the distracted state of the driver. 
Increasing the time buffers, which represent the distance from the vehicle to an object outside the vehicle, e.g., other vehicles on the road, road hazards, and the like, increases the distance from an object at which the vehicle can detect the object and automatically activate a countermeasure. Increasing the time buffers increases the time available before a potential impact with the object. The vehicle will then activate collision avoidance systems sooner when the driver is distracted. After
step 317, the process can return to step 301. -
FIG. 3B shows process 340 that can be implemented in the vehicle 100 to sense a distracted state of the occupant of the seat. At step 341, it is determined if the vehicle is on, e.g., by determining that the ignition is in an “ON” state. If the vehicle is off, then the process 340 ends at step 365. If the vehicle is ON, then the vehicle turns on its system for detecting distractedness by loading instructions in the controller circuitry. In an example embodiment, turning on the distracted detection in the vehicle can be optional, e.g., set in a setting procedure or turned OFF by a switch. In an example, when the vehicle turns ON, it also initiates the vehicle intelligence system. At step 345, the driver of the vehicle has the option to manually turn off the system for detecting distractedness. If the driver chooses to turn off the system for detecting distractedness at step 345, the system turns OFF at step 363. The process 340 can also be turned OFF when the vehicle is not moving or in park for a period of time. The decision to keep the system for detecting distractedness ON can be based on receiving two or more sensor signals that can be used to determine distractedness. In an example, at least one of the two sensor signals can be the EDP signal. - With the distraction process 340 remaining ON at
step 345, the process moves to activating the sensor array at step 303. The sensor array can be the same sensor array as described with reference to FIG. 3A and may include any sensor as described herein. The sensor array at step 303 outputs sensed data to the controller 102 to process the sensed data at step 351. The processing at step 351 can include filtering or normalizing the raw data from the sensors in the sensor array from step 303. The sensed data from step 351 is compared to thresholds that are individualized for each type of sensor data. If two or more thresholds are not met at 355, which indicates the person is focused, then the process returns to the sensor array sensing data at step 303. The present process is based on the assumption that the occupant is not distracted. It will be understood that a similar process, which assumes the driver is distracted and the system must prove the occupant is not distracted, is within the scope of the present disclosure. - It will be understood that the sensing of data at
step 303 can be continuous, with the return indicating that the present data does not indicate distractedness. If two or more different thresholds are met, then the process moves to a false alarm determination at step 357. If the distractedness determination is found to be false, the vehicle may not trigger countermeasures. In an example, the navigation positioning (e.g., GPS) data is used to confirm the distraction determination based on the EDP signal distractedness determination. In an example, the vehicle speed data is used to confirm the distraction determination. In an example, the images from the inward camera, the outward camera, or both are used to confirm the distraction determination. - At
step 359, a final determination of distractedness is made, which takes into account the sensor data relied on to determine distractedness and the false distracted determination to reduce the likelihood of false distractedness determinations. If it is determined at step 359 that the person is not distracted, then the process returns to step 303. If distractedness is determined, then the vehicle can initiate countermeasures at step 360. After countermeasures are initiated, the process returns to a determination to keep the distraction algorithm ON at step 345. The countermeasures can remain ON for a set period of time, until the system determines that the occupant is no longer distracted, or until the distractedness is determined to be false. In an example embodiment, the countermeasures remain ON until the present methods and systems determine that the driver is now not distracted. - The present distractedness determination processes 300, 340 use data from two or more different sensors to determine distractedness. The EDP sensor can provide the primary data, but data from other vehicle sensors can be combined to more accurately determine distractedness. The other sensed data can be data related to the occupant within the vehicle cabin, data from outside the vehicle cabin, or both. The combination of the distractedness determinations based on each individual sensor can be used to reduce the likelihood of false indications of distractedness.
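The two-or-more-sensor rule with an EDP-weighted primary can be sketched as follows; the weight values, sensor names, and score threshold are illustrative assumptions, not values from the disclosure.

```python
# Sketch of the two-sensor decision rule: distractedness is considered only
# when at least two sensors trigger, and the EDP sensor carries extra weight.
# Weights, names, and the score threshold are assumed example values.
WEIGHTS = {"edp": 2.0, "camera": 1.0, "lidar": 1.0, "speed": 1.0}

def distracted(triggered: set, min_sensors: int = 2, min_score: float = 2.5) -> bool:
    if len(triggered) < min_sensors:        # a single sensor alone never decides
        return False
    score = sum(WEIGHTS.get(s, 1.0) for s in triggered)
    return score >= min_score

print(distracted({"camera"}))          # one sensor -> False
print(distracted({"edp", "camera"}))   # EDP-weighted pair -> True
```

With these example weights, two non-EDP sensors alone do not reach the score threshold, reflecting the heavier weighting of the primary EDP data.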
- In an example embodiment, the
step 343 of turning ON the distraction intelligence in the vehicle proceeds to the sensors 303 for YES and to step 363 when the intelligence is turned OFF. The keep-intelligence-ON step 345 can be between the initiate countermeasures step 360 and the sensors 303. -
FIG. 4 shows a table 400 of various sample scenarios 401-405 that use multiple inputs from different sources to determine the state of distractedness of a person. Any one sensor can be used as the primary data. Any other sensor can be used as secondary data to correct, to validate, or to invalidate the determination of distractedness based on the primary sensed data. For example, as shown in FIG. 4, the sensed EDP data can be the primary input for determining distractedness of a person being sensed by the sensors as described herein. The sensors for primary data can include vehicle speed, relative vehicle speed, positional data, navigational data, outward camera data, inward facing camera data, and the like. The addition of the secondary data can be used to correct for false indications of driver distraction based solely on the primary data. The inputs for secondary data can include vehicle speed, navigational positioning information (GPS in North America), an external facing camera, an inward facing camera, possibly focused on the driver, and the like. The data from these devices is used in controller circuitry to determine if there is a false indication of driver distraction. - In
the first scenario 401 using the sensors as described herein, the first sensed signals from the EDP sensor are determined to be normal, i.e., within a defined tolerance or range. The output from the controller circuitry will indicate that the driver is focused. A focused status of an occupant means the occupant is not distracted from the task of driving. Thus, the primary sensor system first determines that the person is not distracted. Then the secondary sensor data can be added to verify the driver's state of distraction. The vehicle speed with respect to the surrounding traffic as a secondary input is judged to be abnormal. The vehicle speed of the current vehicle is known from on-board sensors. The surrounding vehicle speeds can be determined from SONAR, RADAR, or LIDAR sensors, or combinations thereof, on the vehicle. The surrounding vehicle speeds can also be transmitted between vehicles in a vehicle communication network. Here, the relative vehicle speed is abnormal, i.e., outside a defined tolerance or expected range. An additional secondary sensor signal is the navigational position data with regard to the vehicle and possible traffic congestion at the vehicle location. The navigational or vehicle positioning sensor (e.g., GPS in North America) is not sensing any traffic congestion in the sample embodiment. Traffic congestion can be a measure of vehicle position over time relative to the normal overall traffic flow for a particular time of day. This data can include the position of the present vehicle combined with amalgamated data from a server with regard to traffic at that location and that time of day. Traffic congestion can also be sensed by an outward facing sensor, e.g., an imaging sensor, a camera sensor, a RADAR sensor, a laser sensor, or a SONAR sensor. Here the outward sensor is a camera or imaging sensor, and it does not detect traffic congestion. The inward imager or camera senses that the occupant is normal. 
The corrected distraction analysis based on a fusion of the primary sensed data and the secondary sensed data results in the determination from the controller circuitry that the occupant is focused, i.e., not distracted. A countermeasure to counteract the distractedness of the occupant is not triggered. - In the
second scenario 402, the sensed EDP is normal, i.e., within a defined tolerance or range, and the analysis based on the primary sensor data is that the occupant is focused. The secondary sensors can be used to check or confirm the primary sensor analysis. The relative vehicle speed is abnormal. The navigational result is no traffic congestion. The outward imaging sensor shows no traffic congestion. The inward camera senses abnormal occupant behavior, i.e., outside of a defined tolerance or range. The corrected analysis of the primary sensor data combined with the secondary sensor data results in a determination that the occupant is distracted. The system further determines that the primary sensor data was analyzed and reached a false result. This can be used to teach the algorithm that its result was incorrect. The algorithm in the controller circuitry for analyzing the primary sensor data can change its parameters to more closely match the corrected results from the corrected analysis. The system can use the corrected analysis of not focused to trigger countermeasures as described herein. - In scenarios 403-405, the primary sensor determines that the EDP sensed data is abnormal in the sense that the occupant is not focused or is distracted. The EDP sensed data can be compared to focused waveforms and unfocused waveforms in the controller circuitry. When outside the focused thresholds of variability from the focused waveform, the controller circuitry can determine that the occupant is not focused and therefore distracted. The secondary sensed data from the secondary sensors can be used to correct the determination that the occupant is distracted.
- In the third scenario 403, the secondary sensors provide additional data to be used with the primary abnormal determination from the primary sensor. The relative vehicle speed is abnormal. The navigational system sensor determines that there is no traffic congestion. The outward imager determines that there is no traffic congestion. The inward imager provides data showing abnormal behavior, which indicates that the occupant is distracted. The primary data or analysis is combined with the secondary data (one input or more than one input) and determines that the occupant of the seat is not focused on the task of driving, i.e., is distracted. The analysis from the first sensor input is confirmed and does not produce a false distractedness reading or alarm. The countermeasures in the vehicle can be triggered based on the primary analysis and the sensor fusion with the secondary sensed data. The countermeasures can be any countermeasure as described herein.
- In the fourth scenario 404, the secondary sensors provide additional data to be used with the primary abnormal determination from the primary sensor. The relative vehicle speed is normal. The navigational system sensor determines that there is traffic congestion. The outward imager determines that there is traffic congestion. The inward imager provides data showing normal behavior, which indicates that the occupant is focused. The primary data or analysis is combined with the secondary data and determines that the occupant of the seat is focused. The analysis from the first sensor input is determined to be a false reading that the driver is distracted. The secondary sensor data corrects the incorrect or false alarm from the primary sensor analysis. The countermeasures in the vehicle are not triggered.
- In the fifth scenario 405, the secondary sensors provide additional data to be used with the primary abnormal determination from the primary sensor. The relative vehicle speed is normal. The navigational system sensor determines that there is traffic congestion. The outward imager determines that there is traffic congestion. The inward imager provides data showing abnormal behavior, which indicates that the occupant is distracted. The primary data or analysis is combined with the secondary data and determines that the occupant of the seat is not focused, i.e., is distracted. The analysis from the first sensor input is confirmed and does not produce a false distractedness reading or alarm. The countermeasures in the vehicle can be triggered based on the primary analysis and the sensor fusion with the secondary sensed data. The countermeasures can be any countermeasure as described herein.
- The scenarios 401-405 represent an example of circuitry applying an algorithm to the sensed data to output a resulting signal that triggers an alert, vehicle control, or vehicle countermeasure in an attempt to lessen the effects of distracted driving. The circuitry may exploit artificial intelligence and neural networks to determine distractedness and/or false distractedness.
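The table-driven fusion of scenarios 401-405 can be sketched as follows, assuming a boolean encoding of each input; the decision rules are reconstructed from the scenario descriptions above and are illustrative only, not the claimed algorithm.

```python
# Sketch of the table-400 fusion logic for scenarios 401-405. Each call fuses
# the primary EDP result with secondary inputs; the boolean encoding and the
# branch conditions are illustrative assumptions.
def fuse(edp_abnormal, rel_speed_abnormal, congestion, inward_abnormal):
    """Return (distracted, primary_was_false_result)."""
    if not edp_abnormal:
        # Scenarios 401/402: primary says focused; an abnormal inward-camera
        # reading overrides the primary and flags it as a false result.
        return (inward_abnormal, inward_abnormal)
    # Scenarios 403-405: primary says distracted. Traffic congestion combined
    # with normal speed and normal driver behavior marks a false alarm (404);
    # otherwise the distraction determination is confirmed.
    if congestion and not inward_abnormal and not rel_speed_abnormal:
        return (False, True)   # scenario 404: false alarm corrected
    return (True, False)       # scenarios 403 and 405: confirmed

print(fuse(False, True, False, False))  # 401 -> (False, False)
print(fuse(False, True, False, True))   # 402 -> (True, True)
print(fuse(True, True, False, True))    # 403 -> (True, False)
print(fuse(True, False, True, False))   # 404 -> (False, True)
print(fuse(True, False, True, True))    # 405 -> (True, False)
```

The second element of the returned pair corresponds to the "false result" finding that, per scenario 402, could be fed back to retune the primary-sensor analysis.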
- The criteria from the secondary sensors, e.g., as shown in
FIG. 4, can be used in some example embodiments to indicate distractedness. The secondary sensors can detect secondary criteria related to the occupant or driver within the vehicle cabin. The secondary criteria can also include sensed data related to the environment outside the vehicle. The sensed data can be used to derive the secondary criteria. Relative relationships between criteria sensed by the secondary sensors can be determined, and when these relationships indicate distractedness, the vehicle can warn the occupant or alter operation of the vehicle. In an example embodiment, the vehicle speed is sensed and used as a first criterion for determining distractedness. This criterion is controlled by the driver. Other first criteria can include sensed signals related to the driver, e.g., brain waves, HR, HRV, eye movement, body position and movement, and the like. These are directly controlled by the driver or produced by the driver's body. The second criterion can be traffic congestion, the speeds of other vehicles adjacent to the vehicle, or a combination of both. The second criterion is not controlled by the driver. The second criterion may relate to vehicle data or outside-the-vehicle data that are not under the control of, or produced by, the driver. The relationship between the first and second criteria can indicate distractedness. For example, when the first criterion changes in a known manner and the second criterion also changes in a known manner, this indicates distractedness. A distracted driver may slow his/her vehicle (i.e., first criterion) when surrounding traffic does not slow (i.e., second criterion). An additional first criterion can also be used to confirm distractedness, e.g., a slower and deeper breathing pattern, a change in posture, a slower heart rate, etc. In another example, a distracted driver may slow the vehicle (i.e., first criterion) when there is no slowing due to traffic congestion, another obstacle, a traffic light, or the like (i.e., second criterion). 
This indicates possible distractedness of the driver. The controller can also take into account the operational status of the vehicle. If the vehicle is experiencing some type of operational failure, then the driver may not be distracted. The controller compares the first criterion, e.g., vehicle speed, relative to the second criterion. When the vehicle speed slows down relative to the second criterion, the controller will indicate distractedness of the driver. - The second criterion can also be relative vehicle behavior. With vehicles communicating with each other, the first criterion can be the rate at which the driver's vehicle changes speed. The second criterion can be the rate at which other vehicles change speed relative to the rate at which the vehicle of the driver being sensed changes speed. If the present vehicle changes vehicle speed more often than the other vehicles, this indicates distractedness.
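The first-criterion/second-criterion comparison, in which the driver's own slowing is checked against surrounding traffic, might look like the following sketch; the 10% slowdown threshold is an assumed example value, not a figure from the disclosure.

```python
# Sketch of the first/second criterion relationship: the driver's own speed
# trend (first criterion) versus surrounding traffic (second criterion).
# The 10% slowdown factor is an assumed example value.
def criterion_indicates_distraction(own_speeds, surrounding_avg_speeds,
                                    slowdown: float = 0.9) -> bool:
    own_slowed = own_speeds[-1] < own_speeds[0] * slowdown
    traffic_slowed = surrounding_avg_speeds[-1] < surrounding_avg_speeds[0] * slowdown
    # Driver slows while surrounding traffic does not -> possible distraction.
    return own_slowed and not traffic_slowed

print(criterion_indicates_distraction([100, 95, 82], [100, 99, 98]))  # -> True
print(criterion_indicates_distraction([100, 95, 82], [100, 90, 80]))  # -> False
```

In the second call the surrounding traffic slowed as well (e.g., congestion), so the slowing is not attributed to distraction, matching the congestion-based corrections in table 400.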
- The throttle position can be sensed using a throttle position sensor. The throttle position can be used as part of the first criterion, as a supplement to vehicle speed, or as a replacement for vehicle speed. With the cruise control or adaptive cruise control off, the driver controls the position of the throttle, which in turn controls vehicle speed.
- The system may utilize adaptive tolerancing of sensing system thresholds for both the occupant sensing and the outward environmental sensing technology to improve situational distraction classification and countermeasure readiness. The external system thresholds may be influenced by the internal system indications, and equally the internal system thresholds may be influenced by the external system indications.
- For example, the cognitive attention measurement via the sensed electrical brain activity (e.g., EDP) may lower its distraction indication threshold if metrics such as the vehicle speed, weather, and/or LIDAR/RADAR systems indicate that the surrounding environment contains certain conditions (e.g., proximity to external objects, speed of the vehicle, wet or icy road conditions) that require higher alertness in that situation in comparison to normal operating conditions, where non-distracted but also non-heightened attention levels would not be concerning. As with the field sensing or contact-based sensing of the EDP as it relates to the underlying brain activity, the internal eye and skeletal tracking sensing, autonomic nervous system monitoring, as well as the external systems and all other monitors can have multiple threshold levels both in the magnitude of their metrics (e.g., cognitive attention, eye open area, turn, HR/BR, distance, speed) and their temporal resolution (e.g., seconds per degree of turn allowed of the head). For example, eye or skeletal tracking may indicate various degrees of turn away from the optimal viewing window of attention for various amounts of time. In situations requiring higher levels of alertness, both the maximum degree of turn away from that plane and the time length allowed for each degree of turn may be lowered before an indication is given and/or a countermeasure is readied and perhaps activated.
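The adaptive tolerancing described above, where environmental conditions lower the EDP distraction threshold, can be sketched as follows; the condition checks and scaling factors are illustrative assumptions, not disclosed values.

```python
# Sketch of adaptive tolerancing: the EDP distraction threshold is lowered
# when external sensing indicates conditions requiring higher alertness.
# The base threshold and scaling factors are assumed example values.
BASE_EDP_THRESHOLD = 0.6

def adaptive_threshold(speed_kph: float, wet_or_icy: bool,
                       object_proximity_m: float) -> float:
    threshold = BASE_EDP_THRESHOLD
    if speed_kph > 100:
        threshold *= 0.9          # high speed: indicate distraction earlier
    if wet_or_icy:
        threshold *= 0.85         # poor road surface: indicate earlier
    if object_proximity_m < 20:
        threshold *= 0.9          # nearby objects: indicate earlier
    return round(threshold, 3)

print(adaptive_threshold(120, wet_or_icy=True, object_proximity_m=15))
print(adaptive_threshold(80, wet_or_icy=False, object_proximity_m=100))
```

The same pattern could run in the other direction, with internal indications tightening the external-sensing thresholds, per the mutual-influence design described above.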
- Long-term data related to detected distraction can be processed secondary to the real-time algorithms to provide a variety of statistical information for both the occupant and machine learning systems. The long-term data may be stored in the vehicle or off-vehicle on a remote server. The vehicle may include electronic communication to an external server, e.g., over WIFI, mobile communication networks, such as cellular communications, and the like. The long-term distraction calculations may be used to alter the instructions for determining distraction or for mitigating false positives. The present disclosure quantifies the distraction/concentration status of the driver while correcting for false indications of distraction. The vehicle can use the distraction/concentration status of the driver to manipulate reaction times of various vehicle safety systems, e.g., the adaptive braking system, to optimize the response of the system itself. This may reduce the risk of forward collisions.
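One way a safety system's reaction parameters could be manipulated by the distraction/concentration status is sketched below; the linear scaling rule and the 50% cap are illustrative assumptions, not part of the disclosure.

```python
# Sketch of adjusting a safety system's reaction window, e.g., adaptive
# braking, from a distraction level in [0, 1]. The scaling rule is an
# assumed example.
def braking_reaction_time(base_s: float, distraction_level: float) -> float:
    """Shorten the allowed reaction window as distraction rises."""
    level = min(max(distraction_level, 0.0), 1.0)   # clamp to [0, 1]
    return round(base_s * (1.0 - 0.5 * level), 3)   # up to 50% faster response

print(braking_reaction_time(1.2, 0.0))  # -> 1.2 (focused driver)
print(braking_reaction_time(1.2, 1.0))  # -> 0.6 (fully distracted)
```

A more distracted driver thus triggers earlier automatic braking, which is the mechanism the paragraph above credits with reducing the risk of forward collisions.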
- The present system can be used in an autonomous vehicle, e.g., a level 1-2 automobile, where the vehicle uses the level of distraction, a determination of distractedness, or the multiple sensor determination of a distracted driver to judge the most appropriate time to switch from manual to autonomous drive and vice versa, or to engage certain levels of countermeasures.
- This system is beneficial to all modes of transportation, extending even beyond automotive and personal vehicles.
- The present disclosure illustrates a
controller 102. It is within the scope of the present disclosure for the controller 102 to represent multiple processors, memories, and electronic control units, which can work independently with various systems to effect the functions and tasks described herein. The vehicle may use a more distributed controller system than a single controller and remain within the scope of the present disclosure. The controller 102 includes circuitry to process sensed signals that represent real world conditions and data. - The present disclosure describes the sensed EDP data to be the primary data and other data related to the person or the vehicle to be secondary data. However, some embodiments of the present disclosure use the sensed EDP as the secondary data to correct for false determinations of distractedness based on the other non-EDP data. For example, the internal camera and the operation of the vehicle, e.g., drifting or crossing lines in the street, can be used to determine distractedness. The EDP signal can be used to validate any determination of distractedness.
- Further secondary data can be time, e.g., time of day and time of vehicle usage. The vehicle may input the time of day that the vehicle is being driven as a secondary input to prevent false determinations of distractedness or to alter the levels of thresholds in the cameras for determining distractedness. When the time of day is night, the thresholds of distractedness may be lowered. The vehicle may track the usual times that the vehicle is driven in a given time period. When the vehicle is operated outside the usual time periods, there may be a greater likelihood of distracted driving.
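The time-of-day adjustment described above might be sketched as follows; the hour ranges and scaling factors are assumed example values, not figures from the disclosure.

```python
# Sketch of time as a secondary input: camera distraction thresholds may be
# lowered at night and when the vehicle runs outside its usual hours.
# Hour ranges and scaling factors are assumed example values.
def camera_threshold(base: float, hour: int, usual_hours: range) -> float:
    threshold = base
    if hour < 6 or hour >= 21:          # night driving: more sensitive
        threshold *= 0.8
    if hour not in usual_hours:         # outside the driver's usual period
        threshold *= 0.9
    return round(threshold, 3)

print(camera_threshold(0.5, hour=23, usual_hours=range(7, 19)))  # -> 0.36
print(camera_threshold(0.5, hour=9, usual_hours=range(7, 19)))   # -> 0.5
```

Lowering the threshold makes the camera-based distraction indication trigger earlier during the periods when distracted driving is judged more likely.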
- One example of electro-dermal potential may be a type of electroencephalography (EEG), which is an electrophysiological monitoring method to record electrical activity of the brain. It is typically noninvasive, with the electrodes placed along the scalp, although invasive electrodes are sometimes used in specific applications. EEG measures voltage fluctuations resulting from ionic current within the neurons of the brain. In clinical contexts, EEG refers to the recording of the brain's spontaneous electrical activity over a period of time, as recorded from multiple electrodes placed on the scalp. Diagnostic applications generally focus on the spectral content of EEG, that is, the type of neural oscillations that can be observed in EEG signals.
- While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the invention.
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/033,383 US20210009149A1 (en) | 2017-12-04 | 2020-09-25 | Distractedness sensing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/830,892 US10836403B2 (en) | 2017-12-04 | 2017-12-04 | Distractedness sensing system |
US17/033,383 US20210009149A1 (en) | 2017-12-04 | 2020-09-25 | Distractedness sensing system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/830,892 Division US10836403B2 (en) | 2017-12-04 | 2017-12-04 | Distractedness sensing system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210009149A1 true US20210009149A1 (en) | 2021-01-14 |
Family
ID=66547942
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/830,892 Active 2038-09-02 US10836403B2 (en) | 2017-12-04 | 2017-12-04 | Distractedness sensing system |
US17/033,383 Abandoned US20210009149A1 (en) | 2017-12-04 | 2020-09-25 | Distractedness sensing system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/830,892 Active 2038-09-02 US10836403B2 (en) | 2017-12-04 | 2017-12-04 | Distractedness sensing system |
Country Status (3)
Country | Link |
---|---|
US (2) | US10836403B2 (en) |
CN (1) | CN109878527A (en) |
DE (1) | DE102018220286A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11524691B2 (en) | 2019-07-29 | 2022-12-13 | Lear Corporation | System and method for controlling an interior environmental condition in a vehicle |
US12059980B2 (en) | 2019-06-21 | 2024-08-13 | Lear Corporation | Seat system and method of control |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170210289A1 (en) * | 2016-01-22 | 2017-07-27 | Arjun Kundan Dhawan | Driver Focus Analyzer |
BR112019002648B1 (en) * | 2016-08-09 | 2023-01-17 | Nissan Motor Co., Ltd | AUTOMATIC DRIVING VEHICLE CONTROL METHOD AND CONTROL DEVICE |
US11052223B2 (en) | 2017-12-21 | 2021-07-06 | Lear Corporation | Seat assembly and method |
US10867218B2 (en) | 2018-04-26 | 2020-12-15 | Lear Corporation | Biometric sensor fusion to classify vehicle passenger state |
US11345362B2 (en) * | 2018-12-31 | 2022-05-31 | Robert Bosch Gmbh | Adaptive warnings and emergency braking for distracted drivers |
US10672249B1 (en) | 2019-05-06 | 2020-06-02 | Cambridge Mobile Telematics Inc. | Determining, scoring, and reporting mobile phone distraction of a driver |
US10759441B1 (en) * | 2019-05-06 | 2020-09-01 | Cambridge Mobile Telematics Inc. | Determining, scoring, and reporting mobile phone distraction of a driver |
KR102286924B1 (en) * | 2019-11-13 | 2021-08-06 | 현대자동차주식회사 | Automatic return apparatus for seat back of seat in vehicle and method for returning of the seat back of seat in vehicle |
WO2021124140A1 (en) * | 2019-12-17 | 2021-06-24 | Indian Institute Of Science | System and method for monitoring cognitive load of a driver of a vehicle |
EP3845431B1 (en) * | 2020-01-06 | 2024-08-07 | Aptiv Technologies AG | Driver-monitoring system |
EP4097706A4 (en) * | 2020-01-29 | 2023-07-19 | Netradyne, Inc. | Combination alerts |
US11059490B1 (en) | 2020-03-17 | 2021-07-13 | Lear Corporation | Seat system |
US11292371B2 (en) | 2020-05-13 | 2022-04-05 | Lear Corporation | Seat assembly |
US11634055B2 (en) | 2020-05-13 | 2023-04-25 | Lear Corporation | Seat assembly |
US11173818B1 (en) | 2020-05-13 | 2021-11-16 | Lear Corporation | Seat assembly |
US11590873B2 (en) | 2020-05-13 | 2023-02-28 | Lear Corporation | Seat assembly |
JP2021196941A (en) * | 2020-06-16 | 2021-12-27 | トヨタ自動車株式会社 | Driver monitoring system |
KR20220055214A (en) * | 2020-10-26 | 2022-05-03 | 현대자동차주식회사 | Advanced Driver Assistance System and Vehicle having the advanced Driver Assistance System |
US11679706B2 (en) | 2020-12-02 | 2023-06-20 | Lear Corporation | Seat assembly |
CN112806996A (en) * | 2021-01-12 | 2021-05-18 | 哈尔滨工业大学 | Driver distraction multi-channel assessment method and system under L3-level automatic driving condition |
FI129297B (en) * | 2021-02-19 | 2021-11-15 | Taipale Telematics Oy | Device, method and computer program for determining the driving manner of a vehicle driver |
US11845442B2 (en) * | 2021-03-29 | 2023-12-19 | Ford Global Technologies, Llc | Systems and methods for driver presence and position detection |
US11556175B2 (en) | 2021-04-19 | 2023-01-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Hands-free vehicle sensing and applications as well as supervised driving system using brainwave activity |
CN113335296B (en) * | 2021-06-24 | 2022-11-29 | 东风汽车集团股份有限公司 | Self-adaptive detection system and method for decentralized driving |
DE102022212357A1 (en) | 2022-11-18 | 2024-05-23 | Volkswagen Aktiengesellschaft | Method for operating an assistance system of a motor vehicle, computer program product and assistance system |
Family Cites Families (93)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5694939A (en) | 1995-10-03 | 1997-12-09 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Autogenic-feedback training exercise (AFTE) method and system |
US5807114A (en) | 1996-03-27 | 1998-09-15 | Emory University And Georgia Tech Research Corporation | System for treating patients with anxiety disorders |
US6438399B1 (en) | 1999-02-16 | 2002-08-20 | The Children's Hospital Of Philadelphia | Multi-wavelength frequency domain near-infrared cerebral oximeter |
US6366207B1 (en) | 2000-02-04 | 2002-04-02 | Michael Murphy | Device for modifying vehicle operator driving behavior |
DE10126224A1 (en) | 2001-05-30 | 2002-12-12 | Bosch Gmbh Robert | Method and device for characterizing the condition of the driver of a motor vehicle |
EP1395176B1 (en) | 2001-06-13 | 2008-10-15 | Compumedics Limited | Method for monitoring consciousness |
NL1019989C2 (en) | 2002-02-18 | 2003-08-19 | Tno | A method for preventing motion sickness, as well as a device for detecting and signaling potentially sickness-inducing movements. |
US7138922B2 (en) | 2003-03-18 | 2006-11-21 | Ford Global Technologies, Llc | Drowsy driver monitoring and prevention system |
US7722526B2 (en) | 2004-07-16 | 2010-05-25 | Samuel Kim | System, method and apparatus for preventing motion sickness |
FR2880166B1 (en) | 2004-12-24 | 2008-10-31 | Renault Sas | METHOD AND DEVICE FOR ASSISTING ALERT DRIVING IN THE EVENT OF AN EMERGENCY SITUATION FOR A MOTOR VEHICLE |
US8214183B2 (en) | 2005-07-25 | 2012-07-03 | Airbus | Process and system of modeling of an interface between a user and his environment aboard a vehicle |
ITBO20060089A1 (en) | 2006-02-10 | 2007-08-11 | Ferrari Spa | METHOD OF CONTROLLING A VEHICLE TO ADAPT ITS DYNAMIC BEHAVIOR TO THE PSYCHO-PHYSICAL STATE OF THE DRIVER. |
US8031062B2 (en) | 2008-01-04 | 2011-10-04 | Smith Alexander E | Method and apparatus to improve vehicle situational awareness at intersections |
JP4727688B2 (en) | 2008-04-23 | 2011-07-20 | トヨタ自動車株式会社 | Awakening level estimation device |
US8448230B2 (en) | 2008-08-22 | 2013-05-21 | International Business Machines Corporation | System and method for real world biometric analytics through the use of a multimodal biometric analytic wallet |
US20180197636A1 (en) | 2009-03-10 | 2018-07-12 | Gearbox Llc | Computational Systems and Methods for Health Services Planning and Matching |
US9095303B2 (en) | 2009-03-23 | 2015-08-04 | Flint Hills Scientific, Llc | System and apparatus for early detection, prevention, containment or abatement of spread abnormal brain activity |
JP2010241963A (en) | 2009-04-06 | 2010-10-28 | Midori Hokuyo Kk | Leather imparted with antifouling property |
EP2453792B1 (en) | 2009-07-13 | 2017-05-17 | Koninklijke Philips N.V. | Electro-physiological measurement with reduced motion artifacts |
US9460601B2 (en) | 2009-09-20 | 2016-10-04 | Tibet MIMAR | Driver distraction and drowsiness warning and sleepiness reduction for accident avoidance |
DE202009013768U1 (en) | 2009-10-09 | 2011-02-24 | Pichler, Christian | Sitting or lying device with integrated circuit for magnetic field therapy |
EP2489307B1 (en) | 2009-10-14 | 2018-10-03 | Delta Tooling Co., Ltd. | Biological state estimation device and computer program |
US9298985B2 (en) | 2011-05-16 | 2016-03-29 | Wesley W. O. Krueger | Physiological biosensor system and method for controlling a vehicle or powered equipment |
US10997526B2 (en) | 2010-07-02 | 2021-05-04 | United States Of America As Represented By The Administrator Of Nasa | System and method for human operator and machine integration |
US8364395B2 (en) | 2010-12-14 | 2013-01-29 | International Business Machines Corporation | Human emotion metrics for navigation plans and maps |
US9292471B2 (en) | 2011-02-18 | 2016-03-22 | Honda Motor Co., Ltd. | Coordinated vehicle response system and method for driver behavior |
US8698639B2 (en) | 2011-02-18 | 2014-04-15 | Honda Motor Co., Ltd. | System and method for responding to driver behavior |
CN102254403B (en) * | 2011-04-07 | 2012-10-24 | 江苏科技大学 | Early warning method for fatigue driving of automobile driver |
KR101372120B1 (en) | 2011-06-23 | 2014-03-07 | 현대자동차주식회사 | Apparatus and method for acquiring biometric information of a driver |
US9124955B2 (en) | 2011-09-19 | 2015-09-01 | Card Guard Scientific Survival Ltd. | Vehicle driver monitor and a method for monitoring a driver |
KR101405679B1 (en) * | 2011-10-11 | 2014-06-13 | 현대자동차주식회사 | An abnormal driving state detection and warning system based on location-based services |
DE102012002037B4 (en) | 2012-02-03 | 2015-03-05 | automation & software Günther Tausch GmbH | Device for carrying out driver status analyzes |
US20130204153A1 (en) | 2012-02-06 | 2013-08-08 | Emily Ruth Buzhardt | Generating an alarm based on brain wave patterns of a user |
US20130325202A1 (en) | 2012-06-01 | 2013-12-05 | GM Global Technology Operations LLC | Neuro-cognitive driver state processing |
WO2014063160A1 (en) | 2012-10-19 | 2014-04-24 | Basis Science, Inc. | Detection of emotional states |
US9235198B2 (en) | 2012-10-25 | 2016-01-12 | International Business Machines Corporation | System and method for using biometrics to predict and select music preferences |
KR102011495B1 (en) | 2012-11-09 | 2019-08-16 | 삼성전자 주식회사 | Apparatus and method for determining user's mental state |
US9002458B2 (en) | 2013-06-29 | 2015-04-07 | Thync, Inc. | Transdermal electrical stimulation devices for modifying or inducing cognitive state |
CN204147427U (en) | 2012-11-26 | 2015-02-11 | 塞恩克公司 | Wearable electrocutaneous stimulation equipment |
US20150313475A1 (en) | 2012-11-27 | 2015-11-05 | Faurecia Automotive Seating, Llc | Vehicle seat with integrated sensors |
US9149236B2 (en) | 2013-02-04 | 2015-10-06 | Intel Corporation | Assessment and management of emotional state of a vehicle operator |
DE112014000934T5 (en) | 2013-02-21 | 2016-01-07 | Iee International Electronics & Engineering S.A. | Imaging-based occupant monitoring system with broad functional support |
US9751534B2 (en) | 2013-03-15 | 2017-09-05 | Honda Motor Co., Ltd. | System and method for responding to driver state |
US9272689B2 (en) | 2013-04-06 | 2016-03-01 | Honda Motor Co., Ltd. | System and method for biometric identification in a vehicle |
CN103198617A (en) * | 2013-03-26 | 2013-07-10 | 无锡商业职业技术学院 | Fatigue driving warning system |
KR101526672B1 (en) | 2013-07-24 | 2015-06-05 | 현대자동차주식회사 | Apparatus and method for determining drowsy state |
AT513204B1 (en) | 2013-11-13 | 2015-06-15 | Avl List Gmbh | Method for modifying a driving simulator |
US9454887B1 (en) | 2013-11-19 | 2016-09-27 | Mustafa Q. Matalgah | Doze alert |
CN106068097B (en) | 2014-02-20 | 2020-09-29 | 佛吉亚汽车座椅有限责任公司 | Vehicle seat with integrated sensor |
EP3114574A4 (en) | 2014-03-03 | 2018-03-07 | Inrix, Inc. | Traffic obstruction detection |
US9135803B1 (en) | 2014-04-17 | 2015-09-15 | State Farm Mutual Automobile Insurance Company | Advanced vehicle operator intelligence system |
KR101659027B1 (en) | 2014-05-15 | 2016-09-23 | 엘지전자 주식회사 | Mobile terminal and apparatus for controlling a vehicle |
WO2015182077A1 (en) | 2014-05-27 | 2015-12-03 | 日本電気株式会社 | Emotion estimation device, emotion estimation method, and recording medium for storing emotion estimation program |
US9539944B2 (en) | 2014-06-11 | 2017-01-10 | GM Global Technology Operations LLC | Systems and methods of improving driver experience |
US9536411B2 (en) | 2014-08-07 | 2017-01-03 | Toyota Motor Engineering & Manufacturing North America, Inc. | Biometric monitoring and alerting for a vehicle |
US9302584B2 (en) | 2014-08-25 | 2016-04-05 | Verizon Patent And Licensing Inc. | Drowsy driver prevention systems and methods |
US9771081B2 (en) | 2014-09-29 | 2017-09-26 | The Boeing Company | System for fatigue detection using a suite of physiological measurement devices |
US9934697B2 (en) | 2014-11-06 | 2018-04-03 | Microsoft Technology Licensing, Llc | Modular wearable device for conveying affective state |
JP6720179B2 (en) | 2014-12-18 | 2020-07-08 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | System and method for slow wave sleep detection |
CN104700572A (en) * | 2015-03-24 | 2015-06-10 | 安徽师范大学 | Intelligent headrest capable of preventing fatigue driving and control method of headrest |
US10425459B2 (en) | 2015-03-27 | 2019-09-24 | Intel Corporation | Technologies for a seamless data streaming experience |
US9858794B2 (en) * | 2015-03-30 | 2018-01-02 | International Business Machines Corporation | Detecting and notifying of various potential hazards |
EP3303025B1 (en) | 2015-06-03 | 2021-04-21 | Clearmotion, Inc. | Methods and systems for controlling vehicle body motion and occupant experience |
US10136856B2 (en) | 2016-06-27 | 2018-11-27 | Facense Ltd. | Wearable respiration measurements system |
US10989704B2 (en) | 2015-06-24 | 2021-04-27 | Koninklijke Philips N.V. | Sweat monitoring apparatus and monitoring method |
JP6421714B2 (en) | 2015-07-13 | 2018-11-14 | 株式会社デンソー | Vehicle control device |
US9463794B1 (en) | 2015-09-04 | 2016-10-11 | Google Inc. | Stop sign detection and response |
TWI571398B (en) | 2015-09-16 | 2017-02-21 | 國立交通大學 | Device of drowsiness detection and alarm and method of the same |
JP6436030B2 (en) * | 2015-09-17 | 2018-12-12 | トヨタ自動車株式会社 | Life log recording system |
US10054443B1 (en) | 2015-11-05 | 2018-08-21 | National Technology & Engineering Solutions Of Sandia, Llc | Journey analysis system and method |
US9712736B2 (en) | 2015-12-15 | 2017-07-18 | Intel Corporation | Electroencephalography (EEG) camera control |
US10485471B2 (en) | 2016-01-07 | 2019-11-26 | The Trustees Of Dartmouth College | System and method for identifying ictal states in a patient |
CN105595996B (en) * | 2016-03-10 | 2019-05-31 | 西安科技大学 | A fatigue-driving monitoring method based on comprehensive judgment of ECG and EEG signals |
US9776563B1 (en) | 2016-03-21 | 2017-10-03 | Ford Global Technologies, Llc | Geofencing application for driver convenience |
AU2017236782B2 (en) | 2016-03-22 | 2021-01-28 | Magic Leap, Inc. | Head mounted display system configured to exchange biometric information |
JP7208012B2 (en) * | 2016-04-29 | 2023-01-18 | Freer Logic, Inc. | Devices and methods for monitoring electrical activity produced by the brain and methods for monitoring physiological conditions in humans |
US9921726B1 (en) | 2016-06-03 | 2018-03-20 | Steelcase Inc. | Smart workstation method and system |
US11462301B2 (en) | 2016-06-07 | 2022-10-04 | BlyncSync Technologies, LLC | System and method for fleet driver biometric tracking |
US9956963B2 (en) | 2016-06-08 | 2018-05-01 | GM Global Technology Operations LLC | Apparatus for assessing, predicting, and responding to driver fatigue and drowsiness levels |
JP6665696B2 (en) * | 2016-06-09 | 2020-03-13 | 株式会社デンソー | In-vehicle equipment |
KR20180001367A (en) | 2016-06-27 | 2018-01-04 | 현대자동차주식회사 | Apparatus and Method for detecting state of driver based on biometric signals of driver |
US9945679B2 (en) | 2016-06-27 | 2018-04-17 | International Business Machines Corporation | Personalized travel routes to reduce stress |
US10694946B2 (en) | 2016-07-05 | 2020-06-30 | Freer Logic, Inc. | Dual EEG non-contact monitor with personal EEG monitor for concurrent brain monitoring and communication |
US10119807B2 (en) | 2016-11-18 | 2018-11-06 | Child Mind Institute, Inc. | Thermal sensor position detecting device |
WO2018147850A1 (en) | 2017-02-09 | 2018-08-16 | Sony Mobile Communications Inc. | System and method for controlling notifications in an electronic device according to user status |
US10696249B2 (en) | 2017-02-10 | 2020-06-30 | Koninklijke Philips N.V. | Automatic car setting adjustments by identifying driver with health watch wearable or in-car sensors |
WO2019033025A1 (en) | 2017-08-10 | 2019-02-14 | Patroness, LLC | Systems and methods for enhanced autonomous operations of a motorized mobile system |
US11042784B2 (en) | 2017-09-15 | 2021-06-22 | M37 Inc. | Machine learning system and method for determining or inferring user action and intent based on screen image analysis |
US10379535B2 (en) | 2017-10-24 | 2019-08-13 | Lear Corporation | Drowsiness sensing system |
US20190133511A1 (en) | 2017-11-09 | 2019-05-09 | Lear Corporation | Occupant motion sickness sensing |
US11465631B2 (en) | 2017-12-08 | 2022-10-11 | Tesla, Inc. | Personalization system and method for a vehicle based on spatial locations of occupants' body portions |
US10210409B1 (en) | 2018-02-07 | 2019-02-19 | Lear Corporation | Seating system with occupant stimulation and sensing |
US10867218B2 (en) | 2018-04-26 | 2020-12-15 | Lear Corporation | Biometric sensor fusion to classify vehicle passenger state |
- 2017-12-04 US US15/830,892 patent/US10836403B2/en active Active
- 2018-11-26 DE DE102018220286.9A patent/DE102018220286A1/en active Pending
- 2018-12-03 CN CN201811467580.XA patent/CN109878527A/en active Pending
- 2020-09-25 US US17/033,383 patent/US20210009149A1/en not_active Abandoned
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12059980B2 (en) | 2019-06-21 | 2024-08-13 | Lear Corporation | Seat system and method of control |
US11524691B2 (en) | 2019-07-29 | 2022-12-13 | Lear Corporation | System and method for controlling an interior environmental condition in a vehicle |
Also Published As
Publication number | Publication date |
---|---|
DE102018220286A1 (en) | 2019-06-06 |
US10836403B2 (en) | 2020-11-17 |
US20190168771A1 (en) | 2019-06-06 |
CN109878527A (en) | 2019-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210009149A1 (en) | Distractedness sensing system | |
US10379535B2 (en) | Drowsiness sensing system | |
US10867218B2 (en) | Biometric sensor fusion to classify vehicle passenger state | |
US10210409B1 (en) | Seating system with occupant stimulation and sensing | |
CN112041910B (en) | Information processing apparatus, mobile device, method, and program | |
JP7155122B2 (en) | Vehicle control device and vehicle control method | |
US20190133511A1 (en) | Occupant motion sickness sensing | |
US9956963B2 (en) | Apparatus for assessing, predicting, and responding to driver fatigue and drowsiness levels | |
CN111048171B (en) | Method and device for solving motion sickness | |
US9539944B2 (en) | Systems and methods of improving driver experience | |
US10346697B2 (en) | Driver state monitoring using corneal reflection detection | |
JP7560486B2 (en) | Information processing device, information processing system, information processing method, and information processing program | |
EP2032034B1 (en) | Method for determining and analyzing a location of visual interest | |
KR20200113202A (en) | Information processing device, mobile device, and method, and program | |
WO2015175435A1 (en) | Driver health and fatigue monitoring system and method | |
US11447140B2 (en) | Cognitive tunneling mitigation device for driving | |
JP7357006B2 (en) | Information processing device, mobile device, method, and program | |
JP2008217274A (en) | Driver status determination device and operation support device | |
US11751784B2 (en) | Systems and methods for detecting drowsiness in a driver of a vehicle | |
CA3165782A1 (en) | System and method for controlling vehicle functions based on evaluated driving team composition | |
JP6648788B1 (en) | Operation control adjustment device and operation control adjustment method | |
JP2015095111A (en) | Driving support system | |
WO2022172724A1 (en) | Information processing device, information processing method, and information processing program | |
US11801858B1 (en) | System and method for monitoring cervical measurement of a driver and modifying vehicle functions based on changes in the cervical measurement of the driver | |
Dababneh et al. | Driver vigilance level detection systems: A literature survey |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LEAR CORPORATION, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIGNECO, FRANCESCO;YETUKURI, ARJUN;GALLAGHER, DAVID;AND OTHERS;SIGNING DATES FROM 20171115 TO 20171120;REEL/FRAME:053893/0076 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |