CN112426124A - Vehicle driving efficiency monitoring method based on eye movement data


Info

Publication number: CN112426124A
Application number: CN202011323653.5A
Authority: CN (China)
Prior art keywords: eye movement, driver, fixation point, driving, optical flow
Legal status: Pending
Original language: Chinese (zh)
Inventors: 吕泽众, 徐庆, 魏继增
Assignee (current and original): Tianjin University
Filing date / priority date: 2020-11-23
Publication date: 2021-03-02

Classifications

    • A61B 3/113: Objective instruments for determining or recording eye movement
    • A61B 3/0025: Operational features characterised by electronic signal processing, e.g. eye models
    • A61B 3/112: Objective instruments for measuring the diameter of pupils
    • A61B 5/1103: Detecting eye twinkling (measuring movement of the body or parts thereof)
    • A61B 5/6893: Sensors mounted on external non-worn devices, in cars
    • A61B 5/7235: Details of waveform analysis (signal processing for physiological signals)
    • A61B 5/7239: Details of waveform analysis using differentiation, including higher order derivatives
    • G06V 20/597: Recognising the driver's state or behaviour, e.g. attention or drowsiness (context of the image inside a vehicle)
    • G06V 40/19: Sensors for eye characteristics, e.g. of the iris

Abstract

The invention discloses a vehicle driving efficiency monitoring method based on eye movement data, comprising the following steps: step 1, collecting real-time eye movement data of the driver during driving; step 2, extracting eye movement behavior features from the eye movement data; step 3, obtaining the distance between the driver and the driver's real-time fixation point based on the eye movement data and the real-time driving speed; step 4, calculating the optical flow magnitude induced by the fixation point on the driver's imaging plane; step 5, gridding the visual space; step 6, constructing probability distribution models; and step 7, obtaining the difference between the fixation-point probability distribution and the optical-flow probability distribution with a distance function between probability distributions, and using this difference as an index of driving efficiency. Compared with the prior art, this monitoring method is more accurate and more general than any single driving efficiency index.

Description

Vehicle driving efficiency monitoring method based on eye movement data
Technical Field
The invention relates to the technical field of human-computer interaction, in particular to a vehicle driving efficiency monitoring method.
Background
Vehicle driving is one of the most common perceptual-motor tasks in daily life, and the efficiency with which it is performed is closely tied to the safety of life and property. To accomplish the driving task, the driver coordinates the movement of the eyes, head and body as a whole, thereby achieving a subtle but effective collection and processing of environmental information. Among these movements, eye movement behavior is especially important as the main channel for collecting visual information. Since more than 70% of the external environment information required to perform a sensorimotor task is carried by vision, the efficiency of the eye movement behavior itself can serve as one index of performance efficiency for a typical sensorimotor task such as vehicle driving.
In an environment as complex as vehicle driving, it is difficult to monitor the driver's behavioral efficiency accurately and reasonably through a single measure such as speed or driving trajectory. Such predetermined indexes cannot serve as general driving efficiency indexes across different driving conditions and environments. For example, when driving on an open highway, a smooth, higher driving speed represents better driving efficiency, whereas on a crowded urban road, slower driving and a less regular trajectory may represent better driving efficiency.
In order to solve the problem that a single index can hardly indicate the driver's driving efficiency in a general way, the invention monitors the driver's efficiency in real time during vehicle driving through eye movement data, combining the eye movement data appropriately with the driving behavior to study driving efficiency; the richness of eye movement data offers ample possibilities for studying eye movement behavior. Eye movement behavior is one of the most important behavioral expressions of a person performing a perceptual-motor task and can effectively reflect task performance, cognitive load and the like; because the proposed driving efficiency index starts from the person's own eye movement behavior, its result accords with the definition of a person's behavioral efficiency when performing the driving task.
Reasonable mining and processing of the complex, multi-modal eye movement data generated during driving therefore constitutes a valuable approach to monitoring driving behavior efficiency.
Disclosure of Invention
The invention aims to provide a driving efficiency monitoring method based on the driver's eye movement data, which monitors the driver's efficiency in real time during vehicle driving through the eye movement data, thereby solving the prior-art problem that a single index can hardly indicate driving efficiency in a general way and realizing a more general, non-invasive, real-time driving efficiency monitoring method.
The invention discloses a vehicle driving efficiency monitoring method based on eye movement data, which specifically comprises the following steps:
step 1, collecting real-time eye movement data of the driver during driving; the eye movement data at least comprises the driver's gaze coordinate data, pupil diameter data, blink event data, and the corresponding time sequence data;
step 2, extracting eye movement behavior features from the eye movement data, the features comprising the spatial positions of valid fixation points and the optical flow induced by the fixation points; specifically: clustering the discrete real-time gaze coordinate data to obtain valid fixation points and their spatial position coordinates, where a cluster is considered a valid fixation point when its gaze duration exceeds a first time threshold and the dispersion of its spatial distribution lies within a first distance threshold;
step 3, obtaining the distance between the driver and the driver's real-time fixation point based on the eye movement data and the real-time driving speed; specifically: extracting the direction of the fixation point from the eye movement data as an eye movement behavior feature, obtaining the distance between the driver and the fixation point along the driving direction through a distance sensor, and recording the real-time driving speed at the moment the valid fixation point forms;
step 4, calculating the optical flow magnitude induced by the fixation point on the driver's imaging plane from the spatial direction of the valid fixation point and its distance to the driver; specifically: extracting the magnitude of the optical flow induced by the valid fixation point on the driver's imaging plane from the fixation point's spatial direction, its distance to the driver, and the real-time driving speed;
step 5, gridding the visual space;
step 6, constructing probability distribution models, based on the gridded visual space, for the valid fixation points and for the optical flow they induce; specifically: taking the time sequence formed by the valid fixation points as the basic unit, constructing the probability distribution of the valid fixation-point sequence over the regions of interest and the probability distribution of the optical-flow sequence it induces;
step 7, obtaining the difference between the fixation-point probability distribution and the optical-flow probability distribution using a distance function between probability distributions; specifically: measuring the difference between the two distributions with a tool from information theory or probability theory, and taking the measured difference as the driver's driving efficiency when driving the vehicle;
Taking a fixation-point sequence of length k as the basic unit, the probability distribution P of the fixation-point sequence and the probability distribution Q of the induced optical-flow sequence are established, and the visual scanning efficiency VSE is expressed as

VSE = G(P||Q),

where G is a distance function between probability distributions; when G(P||Q) is taken to be the variation distance (VD) function, the visual scanning efficiency becomes

VSE = VD(P||Q) = ∑_{x∈X} |p(x) - q(x)|;
wherein the metric functions between probability distributions include:

the directed, non-negative Kullback-Leibler divergence (KLD), calculated as

KLD(P||Q) = ∑_{x∈X} p(x) log( p(x) / q(x) ),

wherein P = {p(x) | x ∈ X} and Q = {q(x) | x ∈ X} are the two probability distributions;

the square-root distance function based on the information-theoretic Jensen-Shannon divergence,

JSD(P||Q) = sqrt( (1/2) KLD(P||M) + (1/2) KLD(Q||M) ), with M = (P + Q)/2;

and the variation distance function from probability theory,

VD(P||Q) = ∑_{x∈X} |p(x) - q(x)|.
compared with the prior art, the driving efficiency monitoring method is more accurate and more general than a single driving efficiency monitoring index.
Drawings
FIG. 1 is a schematic diagram of an algorithmic model of a method for monitoring vehicle driving efficiency based on eye movement data in accordance with the present invention;
FIG. 2 is a general flow chart of the present invention for a method of monitoring vehicle driving efficiency based on eye movement data;
FIG. 3 is a schematic diagram of the optical flow induced by the fixation point;
FIG. 4 is a schematic diagram of frequency histogram modeling.
Detailed Description
The technical solution of the present invention is further explained with reference to the drawings and the embodiments.
As shown in FIG. 1 and FIG. 2, the method for monitoring vehicle driving efficiency based on eye movement data according to the present invention comprises the following steps:
step 1, collecting real-time eye movement data of the driver during driving; specifically:
collecting the driver's gaze coordinate data, pupil diameter data, blink event data, and the corresponding time sequence data during driving;
step 2, extracting eye movement behavior features from the eye movement data, the features comprising the spatial positions of valid fixation points and the optical flow induced by the fixation points; specifically: clustering the discrete real-time gaze coordinate data to obtain valid fixation points and their spatial position coordinates, where a cluster is considered a valid fixation point when its gaze duration exceeds a first time threshold (e.g., 0.1 seconds) and the dispersion of its spatial distribution lies within a first distance threshold (e.g., 1°);
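A minimal Python sketch of such dispersion-threshold clustering, assuming gaze samples arrive as timestamped NumPy arrays in degrees; the function and parameter names are illustrative, with the defaults taken from the example thresholds above:

    import numpy as np

    def detect_fixations(t, x, y, min_duration=0.1, max_dispersion=1.0):
        """Dispersion-threshold fixation detection (I-DT style).

        t: timestamps in seconds; x, y: gaze coordinates in degrees.
        A sample window counts as a valid fixation when it lasts at
        least min_duration and its dispersion (x-range plus y-range)
        stays within max_dispersion.
        Returns (t_start, t_end, cx, cy) tuples, one per fixation.
        """
        def dispersion(a, b):
            # x-range plus y-range of samples a..b inclusive
            return (x[a:b + 1].max() - x[a:b + 1].min()
                    + y[a:b + 1].max() - y[a:b + 1].min())

        fixations = []
        i, n = 0, len(t)
        while i < n:
            j = i
            while j < n and t[j] - t[i] < min_duration:
                j += 1                     # grow window to span min_duration
            if j >= n:
                break
            if dispersion(i, j) <= max_dispersion:
                while j + 1 < n and dispersion(i, j + 1) <= max_dispersion:
                    j += 1                 # extend while the cluster stays compact
                fixations.append((t[i], t[j],
                                  x[i:j + 1].mean(), y[i:j + 1].mean()))
                i = j + 1                  # continue after this fixation
            else:
                i += 1                     # otherwise slide the window start
        return fixations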
step 3, obtaining the distance between the driver and the driver's real-time fixation point based on the eye movement data and the real-time driving speed; specifically: extracting the direction of the fixation point from the eye movement data as an eye movement behavior feature, obtaining the distance between the driver and the fixation point along the driving direction through a distance sensor, and recording the real-time driving speed at the moment the valid fixation point forms;
step 4, calculating the optical flow magnitude induced by the fixation point on the driver's imaging plane from the spatial direction of the valid fixation point and its distance to the driver; specifically: using the definition of optical flow, extracting the magnitude of the optical flow induced by the valid fixation point on the driver's imaging plane from the fixation point's spatial direction, its distance to the driver, and the real-time driving speed;
FIG. 3 is a schematic diagram of the optical flow induced by the fixation point. During the perceptual-motor task, the observer O generates a fixation point f_n with gaze duration τ_n, at distance ρ_n from the observer (the length of the line of sight). When there is a relative displacement μ_n between the fixation point and the observer over the gaze duration, the induced optical flow in the classical sense is υ_n (bold Latin characters denote vectors): υ_n is the projection of μ_n onto the direction perpendicular to the line of sight, i.e., the direction of the optical flow. On this basis, the dynamic variation m_n actually perceived by the observer is defined with respect to the fixation point: the relative motion is treated as circular motion along an arc of radius ρ_n around the fixation point, and m_n is the corresponding central angle,

m_n = |υ_n| / ρ_n.

In this way depth, one of the essential elements of the three-dimensional scene, is effectively exploited.
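A minimal numerical sketch of this geometry in Python, under the reconstruction above in which m_n is the central angle |υ_n|/ρ_n; the vector layout and the example values are illustrative:

    import numpy as np

    def perceived_variation(mu, gaze_dir, rho):
        """Perceived dynamic variation m_n induced by a fixation point.

        mu:       relative displacement between observer and fixation
                  point over the gaze duration (3-vector).
        gaze_dir: direction of the line of sight (3-vector).
        rho:      distance to the fixation point (line-of-sight length).

        The classical optical flow v_n is the component of mu
        perpendicular to the line of sight; m_n is the central angle
        of the corresponding arc of radius rho (arc length / radius).
        """
        g = gaze_dir / np.linalg.norm(gaze_dir)
        v = mu - np.dot(mu, g) * g         # component perpendicular to gaze
        return np.linalg.norm(v) / rho     # central angle in radians

    # Example: 2 m of forward motion while fixating a point 30 degrees
    # off-axis and 20 m away (values chosen purely for illustration).
    mu = np.array([0.0, 0.0, 2.0])
    gaze = np.array([np.sin(np.radians(30)), 0.0, np.cos(np.radians(30))])
    print(perceived_variation(mu, gaze, rho=20.0))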
Step 5, gridding the visual space;
step 6, constructing probability distribution models, based on the gridded visual space, for the valid fixation points and for the optical flow they induce; specifically: using a suitable modeling method (such as a frequency histogram or kernel density estimation) and taking the time sequence formed by the valid fixation points as the basic unit, constructing the probability distribution of the valid fixation-point sequence over the regions of interest and the probability distribution of the optical-flow sequence it induces; FIG. 4 shows the result of frequency histogram modeling.
The regions of interest are obtained by partitioning the visual space under the driving task, mainly through a gridding operation or through automatic semantic segmentation based on video content understanding;
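A minimal Python sketch of frequency-histogram modeling over a gridded visual space; the grid extent and bin counts are illustrative assumptions, as is the choice of obtaining the optical-flow distribution by weighting the same grid with the flow magnitudes:

    import numpy as np

    def grid_distribution(points, weights=None, x_range=(-30.0, 30.0),
                          y_range=(-20.0, 20.0), bins=(12, 8), eps=1e-12):
        """Frequency-histogram probability model on a gridded visual space.

        points:  (N, 2) array of valid fixation coordinates (degrees).
        weights: optional per-fixation weights; passing the induced
                 optical-flow magnitudes yields the flow distribution Q
                 on the same grid as the fixation distribution P.
        Returns a flattened probability vector; eps-smoothing keeps every
        cell strictly positive so that the KL divergence stays finite.
        """
        hist, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=bins,
                                    range=[x_range, y_range], weights=weights)
        p = hist.ravel() + eps
        return p / p.sum()

Under this sketch, P comes from the unweighted fixation counts and Q from the same call with the optical-flow magnitudes passed as weights.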
step 7, obtaining the difference between the fixation-point probability distribution and the optical-flow probability distribution using a distance function between probability distributions; specifically: measuring the difference between the fixation-point distribution and the optical-flow distribution with a tool from information theory or probability theory, and taking the measured difference as the driver's driving efficiency when driving the vehicle.

Taking a fixation-point sequence of length k as the basic unit, the probability distribution P of the fixation-point sequence and the probability distribution Q of the induced optical-flow sequence are established, and the visual scanning efficiency VSE is expressed as

VSE = G(P||Q),

where G is a distance function between probability distributions; when G(P||Q) is taken to be the variation distance (VD) function, the visual scanning efficiency becomes

VSE = VD(P||Q) = ∑_{x∈X} |p(x) - q(x)|;
wherein the metric functions between probability distributions include the following. For two probability distributions P = {p(x) | x ∈ X} and Q = {q(x) | x ∈ X}, the "gap" or "difference" between them is measured with some function, and information theory offers many classical, widely used choices. For example, the directed, non-negative Kullback-Leibler divergence (KLD) is calculated as

KLD(P||Q) = ∑_{x∈X} p(x) log( p(x) / q(x) );

the square-root distance function based on the information-theoretic Jensen-Shannon divergence is

JSD(P||Q) = sqrt( (1/2) KLD(P||M) + (1/2) KLD(Q||M) ), with M = (P + Q)/2;

and the variation distance function from probability theory is

VD(P||Q) = ∑_{x∈X} |p(x) - q(x)|;
wherein the distance function chosen between the probability distributions is a distance measure conforming to the mathematical definition, satisfying non-negativity, symmetry, and the triangle inequality.
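A minimal Python sketch of the three functions and of the resulting VSE index, assuming p and q are strictly positive probability vectors (for instance, the eps-smoothed histograms from the sketch above):

    import numpy as np

    def kld(p, q):
        """Directed, non-negative Kullback-Leibler divergence KLD(P||Q)."""
        return float(np.sum(p * np.log(p / q)))

    def jsd_distance(p, q):
        """Square root of the Jensen-Shannon divergence (a true metric)."""
        m = 0.5 * (p + q)
        return float(np.sqrt(0.5 * kld(p, m) + 0.5 * kld(q, m)))

    def vd(p, q):
        """Variation distance: non-negative, symmetric, triangle inequality."""
        return float(np.sum(np.abs(p - q)))

    def vse(p, q, g=vd):
        """Visual scanning efficiency VSE = G(P||Q) for a chosen distance G."""
        return g(p, q)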
The invention starts from the driver's eye movement behavior and combines it appropriately with the driving behavior to study driving efficiency, and is therefore a more general driving efficiency monitoring method; reasonable mining and processing of the complex, multi-modal eye movement data generated during driving constitutes a valuable approach to monitoring driving behavior efficiency.
The effectiveness of the invention was confirmed in a virtual driving experiment with fourteen subjects. The driving task was to maintain a constant speed on an open road, and each subject repeated the experiment 4 times under the same environment and task, with intervals of more than one week between sessions. The results show a significant correlation between the distance between the probability distributions and driving performance: all three correlation coefficients (the Pearson linear correlation coefficient, PLCC, for detecting linear correlation; the Spearman rank-order correlation coefficient, SROCC, for nonlinear correlation analysis; and the Kendall rank-order correlation coefficient, KROCC, for discrete data) indicate a significant correlation. The correlation coefficients and p-values between the distribution distance and driving efficiency are shown in Table 1.
TABLE 1 (rendered as an image in the original): correlation coefficients (PLCC, SROCC, KROCC) and p-values between the probability-distribution distance and driving efficiency.
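The three coefficients can be computed with standard routines from scipy; a sketch on synthetic stand-in data (the actual experimental values are those summarized in Table 1):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for the per-session distribution distances and
    # driving-performance scores (14 subjects x 4 sessions = 56 values).
    distance = rng.random(56)
    performance = 0.8 * distance + 0.2 * rng.random(56)

    r_plcc, p_plcc = stats.pearsonr(distance, performance)      # linear
    r_srocc, p_srocc = stats.spearmanr(distance, performance)   # rank / nonlinear
    r_krocc, p_krocc = stats.kendalltau(distance, performance)  # rank, discrete data
    print(f"PLCC={r_plcc:.3f} (p={p_plcc:.3g})  "
          f"SROCC={r_srocc:.3f} (p={p_srocc:.3g})  "
          f"KROCC={r_krocc:.3f} (p={p_krocc:.3g})")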
An embodiment of the invention is described below:
The system that realizes the vehicle driving efficiency monitoring method based on eye movement data comprises an eye tracker, a distance sensor, and a microcomputer with a display and a loudspeaker. It acquires the eye movement data and the distance between the driver (vehicle) and the driver's fixation position, and through further processing and calculation obtains a driving efficiency index that is real-time for the driver, applicable to various driving environments and tasks, and of general significance. Since the driving efficiency index uses a distance function that is a distance measure in the mathematical sense, satisfying non-negativity, symmetry, and the triangle inequality, it has strict upper and lower bounds (as demonstrated in the invention), thereby preventing overflow in numerical calculation.

Claims (1)

1. A vehicle driving efficiency monitoring method based on eye movement data is characterized by comprising the following steps:
step 1, collecting real-time eye movement data of the driver during driving; the eye movement data at least comprises the driver's gaze coordinate data, pupil diameter data, blink event data, and the corresponding time sequence data;
step 2, extracting eye movement behavior features from the eye movement data, the features comprising the spatial positions of valid fixation points and the optical flow induced by the fixation points; specifically: clustering the discrete real-time gaze coordinate data to obtain valid fixation points and their spatial position coordinates, where a cluster is considered a valid fixation point when its gaze duration exceeds a first time threshold and the dispersion of its spatial distribution lies within a first distance threshold;
step 3, obtaining the distance between the driver and the driver's real-time fixation point based on the eye movement data and the real-time driving speed; specifically: extracting the direction of the fixation point from the eye movement data as an eye movement behavior feature, obtaining the distance between the driver and the fixation point along the driving direction through a distance sensor, and recording the real-time driving speed at the moment the valid fixation point forms;
step 4, calculating the optical flow magnitude induced by the fixation point on the driver's imaging plane from the spatial direction of the valid fixation point and its distance to the driver; specifically: extracting the magnitude of the optical flow induced by the valid fixation point on the driver's imaging plane from the fixation point's spatial direction, its distance to the driver, and the real-time driving speed;
step 5, gridding the visual space;
step 6, constructing probability distribution models, based on the gridded visual space, for the valid fixation points and for the optical flow they induce; specifically: taking the time sequence formed by the valid fixation points as the basic unit, constructing the probability distribution of the valid fixation-point sequence over the regions of interest and the probability distribution of the optical-flow sequence it induces;
step 7, obtaining the difference between the fixation-point probability distribution and the optical-flow probability distribution using a distance function between probability distributions; specifically: measuring the difference between the two distributions with a tool from information theory or probability theory, and taking the measured difference as the driver's driving efficiency when driving the vehicle;
taking a fixation-point sequence of length k as the basic unit, the probability distribution P of the fixation-point sequence and the probability distribution Q of the induced optical-flow sequence are established, and the visual scanning efficiency VSE is expressed as

VSE = G(P||Q),

where G is a distance function between probability distributions; when G(P||Q) is taken to be the variation distance (VD) function, the visual scanning efficiency becomes

VSE = VD(P||Q) = ∑_{x∈X} |p(x) - q(x)|;
wherein the metric functions between probability distributions include:

the directed, non-negative Kullback-Leibler divergence (KLD), calculated as

KLD(P||Q) = ∑_{x∈X} p(x) log( p(x) / q(x) ),

wherein P = {p(x) | x ∈ X} and Q = {q(x) | x ∈ X} are the two probability distributions;

the square-root distance function based on the information-theoretic Jensen-Shannon divergence,

JSD(P||Q) = sqrt( (1/2) KLD(P||M) + (1/2) KLD(Q||M) ), with M = (P + Q)/2;

and the variation distance function from probability theory,

VD(P||Q) = ∑_{x∈X} |p(x) - q(x)|.
CN202011323653.5A, filed 2020-11-23 (priority date 2020-11-23): Vehicle driving efficiency monitoring method based on eye movement data; status: Pending; publication: CN112426124A (en).

Priority Applications (1)

Application Number: CN202011323653.5A; Priority Date: 2020-11-23; Filing Date: 2020-11-23; Title: Vehicle driving efficiency monitoring method based on eye movement data

Publications (1)

Publication Number: CN112426124A; Publication Date: 2021-03-02

Family ID: 74693911

Family Applications (1)

Application Number: CN202011323653.5A; Priority Date: 2020-11-23; Filing Date: 2020-11-23; Title: Vehicle driving efficiency monitoring method based on eye movement data; Status: Pending

Country Status (1)

Country: CN; Link: CN112426124A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5583795A (en) * 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US20200151474A1 (en) * 2017-07-31 2020-05-14 Alcohol Countermeasure Systems (International) Inc. Non-intrusive assessment of fatigue in drivers using eye tracking
CN108537161A (en) * 2018-03-30 2018-09-14 南京理工大学 A driver distraction detection method based on visual characteristics
CN109572550A (en) * 2018-12-28 2019-04-05 西安航空学院 A driving trajectory prediction method, system, computer device and storage medium
CN111738337A (en) * 2020-06-23 2020-10-02 吉林大学 Driver distraction state detection and identification method in mixed traffic environment
CN111797809A (en) * 2020-07-20 2020-10-20 吉林大学 Driver vision fusion method for automatic driving trajectory tracking

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘志强 et al., "Research on driver gaze behavior pattern recognition technology", China Safety Science Journal *
孟云伟 et al., "Research on methods for calculating the visual information quantity in mountain highway driving", Journal of Transportation Systems Engineering and Information Technology *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113536909A (en) * 2021-06-08 2021-10-22 吉林大学 Pre-aiming distance calculation method, system and equipment based on eye movement data
CN114677665A (en) * 2022-03-08 2022-06-28 燕山大学 Driving scene attention strengthening method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
Martin et al. Dynamics of driver's gaze: Explorations in behavior modeling and maneuver prediction
Jabon et al. Facial expression analysis for predicting unsafe driving behavior
Li et al. Predicting perceived visual and cognitive distractions of drivers with multimodal features
Wang et al. Online prediction of driver distraction based on brain activity patterns
Braunagel et al. Online recognition of driver-activity based on visual scanpath classification
CN104269028B (en) Fatigue driving detection method and system
CN112426124A (en) Vehicle driving efficiency monitoring method based on eye movement data
Hachisuka et al. Facial expression measurement for detecting driver drowsiness
US10394321B2 (en) Information acquiring method, information acquiring apparatus, and user equipment
Hachisuka Human and vehicle-driver drowsiness detection by facial expression
CN105286802A (en) Driver fatigue detection method based on video information
Pandey et al. Real-time drowsiness identification based on eye state analysis
CN101872419A (en) Method for detecting fatigue of automobile driver
Kasneci et al. Aggregating physiological and eye tracking signals to predict perception in the absence of ground truth
Martin et al. Gaze fixations and dynamics for behavior modeling and prediction of on-road driving maneuvers
Hirayama et al. Classification of driver's neutral and cognitive distraction states based on peripheral vehicle behavior in driver's gaze transition
Ma et al. Real time drowsiness detection based on lateral distance using wavelet transform and neural network
Arceda et al. A survey on drowsiness detection techniques
Tavakoli et al. Multimodal driver state modeling through unsupervised learning
Martin et al. Optical flow based head movement and gesture analysis in automotive environment
Dehzangi et al. Unobtrusive driver drowsiness prediction using driving behavior from vehicular sensors
CN111062300A (en) Driving state detection method, device, equipment and computer readable storage medium
Das et al. Multimodal detection of drivers drowsiness and distraction
Selvathi FPGA based human fatigue and drowsiness detection system using deep neural network for vehicle drivers in road accident avoidance system
Ansari et al. Application of fully adaptive symbolic representation to driver mental fatigue detection based on body posture

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
AD01: Patent right deemed abandoned (effective date of abandonment: 2022-12-09)