GB2595895A - Method for detecting safety relevant driving distraction - Google Patents



Publication number
GB2595895A
GB2595895A
Authority
GB
United Kingdom
Prior art keywords
driver
control action
driving
distraction
distracted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2008797.9A
Other versions
GB202008797D0 (en)
Inventor
Mörtl Peter
Höfler Margit
Güzel Kalayci Elem
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Virtual Vehicle Research GmbH
Original Assignee
Virtual Vehicle Research GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Virtual Vehicle Research GmbH filed Critical Virtual Vehicle Research GmbH
Priority to GB2008797.9A priority Critical patent/GB2595895A/en
Publication of GB202008797D0 publication Critical patent/GB202008797D0/en
Publication of GB2595895A publication Critical patent/GB2595895A/en
Legal status: Pending


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W40/09Driving style or behaviour
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/06Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/18Steering angle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/229Attention level, e.g. attentive to driving, reading or sleeping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/30Driving style

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Psychiatry (AREA)
  • Biomedical Technology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Emergency Management (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Business, Economics & Management (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method to detect driver distraction based on changes in the control actions with which the driver controls the vehicle, for example the frequency of steering manoeuvres. An environmental information complexity value EIC is estimated 2 by analysing images from an outward facing camera 1 and weighting driving critical information elements (e.g. road curvature, traffic, signs, pedestrians). Recorded inputs of the control action rate 4 under baseline non-distracted driver conditions are used to determine 3 a preferred control action rate CAND for the complexity value, to which the currently observed control action rate CAobs is then compared 5. Based on the difference between the observed rate CAobs and the rate expected if the driver were not distracted, CAND, a warning or driving intervention is outputted.

Description

Method for Detecting Safety Relevant Driving Distraction
Background of the invention
Driver distraction is one of the most common factors reported for crashes or safety critical events: previous research has shown that 5 to 25% of accidents can be attributed to the involvement of driver distraction [1]. Typically, visual distraction (when the gaze is off the road) and cognitive distraction (when the mind is off the road) are distinguished [1] [2]. It has been shown that, when drivers are engaged in complex visual and/or manual secondary tasks, their near-crash/crash risk is three times higher than that of attentive drivers [3]. Automated detection of driver distraction is therefore important for road safety.
Several methods to detect driver distraction have been developed. The most common algorithms detect driver distraction via driver gaze analysis, where a camera observes the human driver to determine whether the gaze is directed toward the road or away from it.
A new method is proposed that does not rely on driver gaze but on changes in control actions dependent on environmental information complexity. This invention describes a method that measures the information complexity of the environment and establishes a driver's baseline control action variability under non-distracted driving. Significant deviations from this baseline control action variability indicate evidence for distracted driving.
State of the Art
Several devices and methods to measure driver distraction have been presented that mainly measure distraction via the direction of the gaze, the position of the head, or other physiological data such as heart rate or EEG. For instance, some inventions observe the driver's gaze via an external camera in relation to a primary preview region to issue an alert (WO2019144880 and US20190236386), determine driver distraction based on the gaze direction (JP2010131401, US20160214618), the proportion/duration of the off-road gaze (EP1878604, US20160180677 and CN105711517), or detect eye closure and/or gaze angle as measures of fatigue and distraction (WO2007092512). Others derive driver distraction by sampling information via visual or optical sensors at an on-board system of the vehicle (WO2018085804) or measure the driver's distraction and drowsiness by processing the driver's face and pose as a function of speed and maximum allowed travel distance (US20140139655). CN106205052 presents a method in which distraction is determined by analysing head and face postures and eye features as well as the vehicle speed, and a warning signal is provided in case visual attention is not concentrated. The analysis of eye movements in order to detect distraction is also used in combination with the driver's gestures (CN106354259) or head movements (EP2314207), or to extract a feature parameter set based on eye motion behavior collected during a driving main task and a driving auxiliary task (CN108537161). CN110262084 describes sunglasses that assess driver distraction via a six-axis accelerometer embedded in the nose bridge based on the head motion of the driver and provide a distraction warning signal based on GPS data.
Other approaches describe methods that detect and predict behavioral and cognitive distraction by measuring eye movements and cardiac activity via a heart rate sensor (US20190038204), or assess driver distraction by the characteristics of the EEG frequency spectrum of a driver (CN107334481).
Other inventions do not use gaze or face direction but focus on information from the vehicle, such as the transverse driving speed at fixed time intervals (CN109649399) or a lateral jerk while the vehicle is traveling (US10479371). CN103661375 describes a method in which distraction is detected from the traveling track of an automobile and the operation information of the driver (i.e., the yaw angle of the automobile and the rotation speed signal of the steering wheel). WO2017093716 describes a human machine interface that reduces its complexity when driver distraction is detected based on one or several signals calculated from, e.g., speed, driver drowsiness, road condition, traffic conditions such as traffic jams, or navigation data.
[4] explains differences in intra-individual steering wheel control action rates across environmental complexity conditions when drivers drive under distracted versus non-distracted conditions. In contrast, the present invention proposes a method that measures environmental information complexity and establishes a driver's baseline control action variability under non-distracted driving. Significant deviations from this baseline control action variability indicate evidence for distracted driving. In contrast to previously proposed methods, this method does not utilize the driver's gaze behavior and explicitly takes into account environmental information complexity.
Description of the Invention
The present invention describes a method to detect driver distraction without optical observation of the driver's gaze. Rather, driver distraction detection is based on observing changes in driver control actions under specific conditions. The method consists of a three-stage process. First, the environmental information complexity is estimated, preferably by digital image processing methods that enumerate and weight the driving critical information elements in the driving scene, including the road curvature, traffic, traffic signs, pedestrians, etc., that need to be considered by the driver on a per-time basis. The weights for the driving critical information elements are commensurate with the amount of response required. An element with high information complexity requires multiple actions, such as deciding whether a pedestrian is a potential safety threat or identifying the need for a lateral avoidance maneuver. An element with low information complexity requires few driver actions, such as processing the information on a speed limitation sign. Under higher speed the traffic information complexity increases, such that drivers would reduce their speed to keep an acceptable rate of information complexity.
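As an illustration only, this first stage can be sketched as a per-image weighted sum; the element classes, weight values, and the speed scaling factor below are hypothetical assumptions, not values given in this document.

```python
# Hypothetical complexity weights per detected element class: weights are
# commensurate with the amount of driver response the element requires.
COMPLEXITY_WEIGHTS = {
    "pedestrian_ambiguous": 5.0,  # may require multiple glances + avoidance decision
    "vehicle_crossing": 3.0,
    "sharp_curve": 2.0,
    "traffic_sign": 1.0,          # e.g. a speed limit sign: low complexity
}

def estimate_eic(detected_elements, speed_kmh, speed_gain=0.02):
    """Weighted sum of driving critical elements in one camera frame,
    scaled up with vehicle speed (higher speed -> higher complexity)."""
    base = sum(COMPLEXITY_WEIGHTS.get(e, 0.5) for e in detected_elements)
    return base * (1.0 + speed_gain * speed_kmh)
```

For example, an ambiguous pedestrian plus a traffic sign at 50 km/h would here yield (5.0 + 1.0) * 2.0 = 12.0, while an empty scene yields 0.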
Weights may be defined and shared a priori or online for different types and classes of elements and their level of/contribution to criticality or complexity in traffic conditions (e.g. determined by field studies, literature, a connected vehicle network, onboard scene interpretation, etc.).
Second, a driver's "preferred" control action rate is measured for non-distracted driving under certain levels of environmental information complexity. This is done by recording driving data such as vehicle speed, acceleration and braking activity, or steering wheel angle data. Once this connection between preferred control actions and a given environmental information complexity is established, the value of the mismatch between the actually observed and the driver-preferred control action rate is taken as evidence for safety critical distracted driving and can be used to trigger warnings or other safety interventions.
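A minimal sketch of this second stage, under the assumption that the baseline is stored as a per-EIC-bin average; the bin width and the relative deviation threshold are illustrative choices, not parameters specified in this document.

```python
from collections import defaultdict

class PreferredControlActionModel:
    """Learns a driver's baseline ('preferred') control action rate per
    EIC bin during non-distracted driving, then flags deviations."""

    def __init__(self, bin_width=5.0, threshold=0.2):
        self.bin_width = bin_width
        self.threshold = threshold          # relative deviation counted as evidence
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def _bin(self, eic):
        return int(eic // self.bin_width)

    def record_baseline(self, eic, ca_rate):
        """Accumulate one non-distracted observation for this EIC level."""
        b = self._bin(eic)
        self._sums[b] += ca_rate
        self._counts[b] += 1

    def expected_rate(self, eic):
        """Preferred rate CAND for this EIC level, or None if no baseline yet."""
        b = self._bin(eic)
        if self._counts[b] == 0:
            return None
        return self._sums[b] / self._counts[b]

    def is_distracted(self, eic, ca_obs):
        """Flag a mismatch between observed and preferred control action rate."""
        ca_nd = self.expected_rate(eic)
        if ca_nd is None:
            return False                    # insufficient baseline evidence
        return abs(ca_obs - ca_nd) / ca_nd > self.threshold
```

With a baseline of 0.9 actions per second at some EIC level, an observed rate of 1.1 (the roughly 20% increase reported for distracted drivers in [4]) would exceed a 0.2 relative threshold and be flagged.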
Drivers exhibit control over the vehicle through various control actions such as speed control and steering. Thereby, drivers respond to the informational complexity of their environment, such as the curvature of the road, traffic, pedestrians, road signs and more. Generally, in more complex environments such as cities, drivers have to process more information than in simpler environments such as country roads without much traffic or many traffic signs. In addition, drivers have inter-individually different driving styles when perceiving and processing the environmental information, but also when controlling the vehicle. Accordingly, drivers exhibit steering patterns that are characteristic for a given driver across different drives but also differentiate them from other drivers. For example, in experiments it was found that the steering wheel reversal rate of distracted drivers increases by about 20%, from 0.9 to 1.1 reversals per second [4], see figure 2. In summary, the individual control action rate during non-distracted driving (CAND) that a person exhibits when driving is a function of the environmental information complexity (EIC) and the personal driving style for non-distracted driving (PDSND), plus some influences that are here referred to as error (e).
CAND = f(EIC, PDSND) + e
CAND is the "preferred" control action rate that a specific driver adopts under non-distracted driving. The observed control action rate for a given environmental complexity and the present driving style is:
CAobs = f(EIC, PDS) + e
By just observing the control actions, it is not yet clear whether the driver is distracted or not. However, by comparing the observed with the preferred control action rate, evidence for safety relevant distracted driving (ESRDD) is detected when CAobs exceeds CAND under high levels of EIC:
ESRDD: CAobs(EIC) > CAND(EIC, PDSND) and EIC = high
The environmental information complexity estimator (2) takes information from the outward facing camera (1), which covers the field of view of a human driver in the driving direction, and optionally available map information about the road environment, and identifies critical information elements for the current driving situation. Critical information elements include the road curvature, traffic, traffic signs, pedestrians, or any other traffic related object. The information elements are subsequently weighted by the amount of response that is required to respond to them appropriately. A pedestrian that shows ambivalent behavior concerning crossing the street ahead of the vehicle is associated with high information complexity, as it requires multiple glances and a subsequent decision process; therefore, this element receives a high complexity weight. In contrast, a simple speed limitation sign on the road without any other elements represents low information complexity. The driving speed represents an important factor, such that under higher speed the traffic information complexity increases and drivers usually reduce their speed to keep an acceptable rate of information complexity. Therefore EIC can be expressed as:
EIC = ∫ f(t) dt over [t0, t1]
where f(t) is a function of the complexity weights over time.
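The two relations above, the time integral of complexity weights and the ESRDD condition, can be sketched in a few lines; the sampling step and the numeric "high EIC" threshold are illustrative assumptions.

```python
def eic_over_window(frame_weights, dt):
    """Discrete form of EIC = integral of f(t) dt over [t0, t1]:
    sum of per-frame complexity weights sampled every dt seconds."""
    return sum(frame_weights) * dt

def evidence_for_distraction(ca_obs, ca_nd, eic, eic_high=10.0):
    """ESRDD condition: the observed control action rate exceeds the
    preferred (non-distracted) rate while EIC is high."""
    return ca_obs > ca_nd and eic >= eic_high
```

Note that the condition is one-sided by design: only an elevated control action rate under high complexity counts as evidence, matching the inequality CAobs(EIC) > CAND.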
The preferred control action estimator (3) receives input about the current information complexity from the estimator (2) and about the driver's current control action rate from the control action observer (4), and builds an association between them under non-distracted driving conditions. With sufficient evidence, this association is consolidated into a preferred control action rate, i.e. a control action rate that the driver would exhibit spontaneously under these EIC conditions.
In a specific implementation of this invention that considers the driver's steering control action rates, two different types of steering maneuvers (i.e., exploratory and compensatory) can be differentiated by their frequency and amplitude. Exploratory movements can be defined as higher frequency steering maneuvers at lower amplitude, compensatory movements as lower frequency steering maneuvers at higher amplitude. The characteristics of both types of steering maneuvers are measured for a given EIC under non-distracted driving conditions. During a period of distracted driving, a driver's steering maneuvers exhibit a change in these characteristics. This discrepancy between control actions under non-distracted driving and distracted driving is identified by the discrepancy detector (5), which receives input about the current driver control action rate (4) and compares it to the preferred control action rate for the current EIC. A discrepancy is indicative of distracted driving.
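One such steering-based control action observation, the steering wheel reversal rate, can be sketched from sampled steering angles as follows; the gap threshold and the counting scheme are illustrative assumptions, not necessarily the exact measure used in [4].

```python
def steering_reversal_rate(angles_deg, dt, min_gap_deg=2.0):
    """Approximate steering wheel reversal rate (reversals per second):
    count direction changes of the steering angle whose swing exceeds
    min_gap_deg, then divide by the window duration."""
    if len(angles_deg) < 2:
        return 0.0
    reversals = 0
    last_extreme = angles_deg[0]
    direction = 0  # +1 increasing angle, -1 decreasing, 0 not yet known
    for a in angles_deg[1:]:
        delta = a - last_extreme
        if direction >= 0 and delta <= -min_gap_deg:
            if direction == 1:       # completed a reversal to the other side
                reversals += 1
            direction = -1
            last_extreme = a
        elif direction <= 0 and delta >= min_gap_deg:
            if direction == -1:      # completed a reversal to the other side
                reversals += 1
            direction = 1
            last_extreme = a
        elif (direction == 1 and a > last_extreme) or \
             (direction == -1 and a < last_extreme):
            last_extreme = a         # extend the current swing, no reversal
    duration = (len(angles_deg) - 1) * dt
    return reversals / duration
```

Small high-frequency swings below the gap threshold (exploratory micro-corrections) are ignored, so the rate reflects the larger compensatory direction changes whose frequency shifts under distraction.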
While the present invention has been described with respect to the illustrated embodiment, it is recognized that numerous modifications and variations in addition to those mentioned herein will occur to those skilled in the art. Accordingly, it is intended that the invention not be limited to the disclosed embodiment, but that it have the full scope permitted by the language of the following claims.
For a better understanding of the invention, figures are given. These figures show:
Fig. 1: Graphical depiction of the method
Fig. 2: Steering wheel reversal rates for distracted and non-distracted driving (example)
References
[1] European Community, "Driver Distraction 2018," 2018. [Online]. Available: https://ec.europa.eu/transport/road_safety/sites/roadsafety/files/pdf/ersosynthesis2018-driverdistraction.pdf.
[2] Y. Liang and J. Lee, "Driver Cognitive Distraction Detection Using Eye Movements," 2007, pp. 285-300.
[3] S. Klauer, T. Dingus, T. Neale, J. Sudweeks, and D. Ramsey, "The Impact of Driver Inattention on Near-Crash/Crash Risk: An Analysis Using the 100-Car Naturalistic Driving Study Data," vol. 594, Jan. 2006.
[4] M. Heller and P. Moertl, "Toward Driver State Models that Explain Interindividual Variability of Distraction for Adaptive Automation," p. 15.

Claims (1)

  1. Method to detect safety relevant driving distraction of a human driver in a road vehicle, comprising: in a first step, estimating the environmental information complexity by detecting and analysing the type and number of driving critical information elements in digital images of an outward facing camera 1 and assigning the weighted sum of information complexity values to an environmental information complexity value for each image according to the type and number of detected elements; in a second step, extracting the preferred and most likely control action rates for a given driver 3 and for a given environmental information complexity value 2 by processing the recorded input of the driver's control action rates 4 under baseline, non-distracted conditions; in a third step, comparing the value of the actually observed control action rates with the value of the preferred and most likely control action rates under non-distracted conditions of the driver and calculating the difference between the two values for control actions 5; in a fourth step, initiating a warning or intervention procedure based on the difference of the two values for control actions, where a low critical warning also comprises advertisement of evidence of detected safety critical distraction to other vehicle systems or the driver, and high critical driving interventions.
GB2008797.9A 2020-06-10 2020-06-10 Method for detecting safety relevant driving distraction Pending GB2595895A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2008797.9A GB2595895A (en) 2020-06-10 2020-06-10 Method for detecting safety relevant driving distraction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2008797.9A GB2595895A (en) 2020-06-10 2020-06-10 Method for detecting safety relevant driving distraction

Publications (2)

Publication Number Publication Date
GB202008797D0 (en) 2020-07-22
GB2595895A (en) 2021-12-15

Family

ID=71615954

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2008797.9A Pending GB2595895A (en) 2020-06-10 2020-06-10 Method for detecting safety relevant driving distraction

Country Status (1)

Country Link
GB (1) GB2595895A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112806996A (en) * 2021-01-12 2021-05-18 哈尔滨工业大学 Driver distraction multi-channel assessment method and system under L3-level automatic driving condition
CN114132329B (en) * 2021-12-10 2024-04-12 智己汽车科技有限公司 Driver attention maintaining method and system

Citations (4)

Publication number Priority date Publication date Assignee Title
JPH08197977A (en) * 1995-01-27 1996-08-06 Fuji Heavy Ind Ltd Alarm device of vehicle
US20050126841A1 (en) * 2003-12-10 2005-06-16 Denso Corporation Awakening degree determining system
US20130144491A1 (en) * 2011-12-06 2013-06-06 Hyundai Motor Company Technique for controlling lane maintenance based on driver's concentration level
US20200148225A1 (en) * 2018-11-08 2020-05-14 Ford Global Technologies, Llc Apparatus and method for determining an attention requirement level of a driver of a vehicle


Also Published As

Publication number Publication date
GB202008797D0 (en) 2020-07-22

Similar Documents

Publication Publication Date Title
JP6307629B2 (en) Method and apparatus for detecting safe driving state of driver
Kutila et al. Driver distraction detection with a camera vision system
JP7139331B2 (en) Systems and methods for using attention buffers to improve resource allocation management
US8085140B2 (en) Travel information providing device
US10235768B2 (en) Image processing device, in-vehicle display system, display device, image processing method, and computer readable medium
KR102540436B1 (en) System and method for predicting vehicle accidents
EP2201496B1 (en) Inattentive state determination device and method of determining inattentive state
CN107380164A (en) Driver assistance system and support system based on computer vision
Costa et al. Detecting driver’s fatigue, distraction and activity using a non-intrusive ai-based monitoring system
JP5691237B2 (en) Driving assistance device
KR20200113202A (en) Information processing device, mobile device, and method, and program
CN110641468A (en) Controlling autonomous vehicles based on passenger behavior
Sun et al. An integrated solution for lane level irregular driving detection on highways
GB2595895A (en) Method for detecting safety relevant driving distraction
JP7210929B2 (en) Driving consciousness estimation device
EP4213090A1 (en) Information processing device, information processing method, program, and information processing terminal
US20240000354A1 (en) Driving characteristic determination device, driving characteristic determination method, and recording medium
KR20160133179A (en) Method and Apparatus For Dangerous Driving Conditions Detection Based On Integrated Human Vehicle Interface
CN104798084A (en) Method and information system for filtering object information
CN113631411B (en) Display control device, display control method, and recording medium
Sun et al. Online distraction detection for naturalistic driving dataset using kinematic motion models and a multiple model algorithm
JP5493451B2 (en) Degree of consciousness determination device and alarm device
Rezaei et al. Toward next generation of driver assistance systems: A multimodal sensor-based platform
Tawari et al. Predicting unexpected maneuver while approaching intersection
JP2018537787A (en) Method and apparatus for classifying at least one eye opening data of a vehicle occupant and method and apparatus for detecting drowsiness and / or microsleep of a vehicle occupant