CN117017297A - Method for establishing prediction and identification model of driver fatigue and application thereof - Google Patents

Method for establishing prediction and identification model of driver fatigue and application thereof

Info

Publication number
CN117017297A
CN117017297A
Authority
CN
China
Prior art keywords
fatigue
driver
signals
signal
characteristic parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310973449.5A
Other languages
Chinese (zh)
Inventor
杜登斌 (Du Dengbin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuzheng Intelligent Technology Beijing Co ltd
Original Assignee
Wuzheng Intelligent Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuzheng Intelligent Technology Beijing Co ltd filed Critical Wuzheng Intelligent Technology Beijing Co ltd
Priority to CN202310973449.5A priority Critical patent/CN117017297A/en
Publication of CN117017297A publication Critical patent/CN117017297A/en
Pending legal-status Critical Current

Classifications

    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/02125 Measuring pressure in heart or blood vessels from analysis of pulse wave characteristics of pulse wave propagation time
    • A61B5/02405 Determining heart rate variability
    • A61B5/02416 Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A61B5/14542 Measuring characteristics of blood in vivo for measuring blood gases
    • A61B5/14551 Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters, for measuring blood gases
    • A61B5/168 Evaluating attention deficit, hyperactivity
    • A61B5/18 Devices for psychotechnics for vehicle drivers or machine operators
    • A61B5/389 Electromyography [EMG]
    • A61B5/7235 Details of waveform analysis
    • A61B5/7257 Details of waveform analysis characterised by using Fourier transforms
    • A61B5/726 Details of waveform analysis characterised by using Wavelet transforms
    • G06F18/2131 Feature extraction based on a transform domain processing, e.g. wavelet transform
    • G06N3/045 Combinations of networks
    • G06N3/0499 Feedforward networks
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G06V10/762 Pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V10/764 Pattern recognition or machine learning using classification, e.g. of video objects
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • A61B2503/22 Motor vehicles operators, e.g. drivers, pilots, captains
    • G06F2123/02 Data types in the time domain, e.g. time-series data
    • G06F2218/10 Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Cardiology (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Psychiatry (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Databases & Information Systems (AREA)
  • Pulmonology (AREA)
  • Signal Processing (AREA)
  • Computational Linguistics (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Psychology (AREA)
  • Optics & Photonics (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)

Abstract

The embodiments of the present application disclose a method for establishing a prediction and recognition model of driver fatigue, and applications thereof. The method comprises the following steps: acquiring face-video signals for each period of the driver's driving session, and predicting the driver's physiological signals for each period based on remote photoplethysmography; preprocessing the obtained physiological signals to obtain characteristic-parameter data of abnormal physiological-signal changes, together with corresponding sample characteristic-parameter data labelled with whether the driver is fatigued and to what degree; and, based on these data, using the ID3 algorithm to establish and train a cognitive parameter model that intelligently predicts from the face-video signal whether the driver is fatigued and how fatigued, thereby obtaining the prediction and recognition model.

Description

Method for establishing prediction and identification model of driver fatigue and application thereof
Technical Field
The application relates to the technical field of driving safety, in particular to a method for establishing a prediction and identification model of driver fatigue and application thereof.
Background
In recent years the number of traffic accidents has kept rising, adversely affecting the stable development of society, and fatigue driving is one of the major causes. Although safety problems receive ever more attention, the industry still lacks a complete method for quickly, simply, scientifically, qualitatively and quantitatively determining whether a driver is in a fatigued driving state. The most recently developed face recognition technology judges fatigue mainly by analysing the driver's facial expression in surveillance video. However, a person's facial expression is influenced by many factors (physiological, psychological, social and family factors, etc.), which degrades expression recognition for any individual. Consequently, existing methods that judge driver fatigue from facial appearance perform poorly in practice.
Disclosure of Invention
The embodiments of the present application aim to provide a method for establishing a prediction and recognition model of driver fatigue, and applications thereof, in order to address two problems in the prior art: the lack of a complete method for quickly, simply, scientifically, qualitatively and quantitatively recognising whether a driver is in a fatigued driving state, and the poor practical performance of existing methods that judge fatigued driving from the driver's facial appearance.
To achieve the above object, an embodiment of the present application provides a method for establishing a prediction and recognition model of driver fatigue, comprising: acquiring face-video signals for each period of the driver's driving session, and predicting the driver's physiological signals for each period based on remote photoplethysmography;
preprocessing the obtained physiological signals to obtain characteristic-parameter data of abnormal physiological-signal changes, together with corresponding sample characteristic-parameter data labelled with whether the driver is fatigued and to what degree;
and, based on those data, using the ID3 algorithm to establish and train a cognitive parameter model that intelligently predicts from the face-video signal whether the driver is fatigued and how fatigued, thereby obtaining the prediction and recognition model.
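The ID3 algorithm mentioned above chooses split features by information gain. A minimal sketch of that criterion (the feature and label arrays are hypothetical illustrations, not data from the patent):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """ID3 criterion: entropy reduction obtained by splitting on one discrete feature."""
    n = len(labels)
    split_entropy = 0.0
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        split_entropy += len(subset) / n * entropy(subset)
    return entropy(labels) - split_entropy

# Hypothetical discretized samples: feature = "did HRV decrease?", label = fatigue state
hrv_drop = ["yes", "yes", "no", "no", "yes", "no"]
fatigue  = ["tired", "tired", "alert", "alert", "tired", "alert"]
gain = information_gain(hrv_drop, fatigue)   # feature is perfectly predictive here
```

Because the hypothetical feature separates the labels perfectly, the gain equals the full label entropy (1 bit); ID3 would pick such a feature as the next decision node.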
Optionally, the physiological signals include heart rate, and predicting the driver's physiological signals for each period based on remote photoplethysmography comprises:
applying an improved Eulerian magnification method to the skin-tone image extracted from the face video to extract the pulse wave it contains, and analysing the extracted pulse-wave signal to obtain pulse-wave characteristic values and the heart-rate signal.
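Eulerian magnification isolates and amplifies temporal variation in a chosen frequency band. A minimal sketch of the temporal-filtering part only, on a synthetic per-frame skin-tone trace (the pulse band limits and the test signal are illustrative assumptions, not the patent's improved method):

```python
import numpy as np

def bandpass_fft(trace, fs, lo=0.7, hi=4.0):
    """Keep only frequency components in the plausible human pulse band (~42-240 bpm)."""
    spectrum = np.fft.rfft(trace - trace.mean())
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(trace))

# Synthetic skin-tone trace: a 1.2 Hz "pulse" plus slow illumination drift
fs = 30.0                                   # camera frame rate (fps)
t = np.arange(0, 10, 1 / fs)
trace = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.05 * t)
pulse = bandpass_fft(trace, fs)
peak_hz = np.fft.rfftfreq(len(pulse), 1 / fs)[np.argmax(np.abs(np.fft.rfft(pulse)))]
```

After filtering, the dominant component is the 1.2 Hz pulse, even though the raw trace is dominated by the slow drift; a full Eulerian pipeline would apply this filtering per spatial band and amplify the result before reconstruction.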
Optionally, the physiological signals include blood pressure, and predicting the driver's physiological signals for each period based on remote photoplethysmography comprises:
establishing a linear relation between the pulse wave and blood pressure via the pulse transit time, and producing a rough (fuzzy) estimate of blood pressure from information about the measured subject;
and then predicting blood pressure with a trained, parameter-library-equipped improved BP (back-propagation) neural network.
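The first stage above assumes a linear pulse-transit-time (PTT) to blood-pressure relation. A minimal sketch of fitting and applying such a relation (the calibration pairs are entirely hypothetical, and this replaces the patent's neural-network refinement with nothing; it shows the fuzzy-estimate stage only):

```python
import numpy as np

def fit_ptt_bp(ptt_ms, sbp_mmhg):
    """Least-squares fit of the linear relation SBP = a * PTT + b."""
    a, b = np.polyfit(ptt_ms, sbp_mmhg, deg=1)
    return a, b

def estimate_sbp(ptt_ms, a, b):
    """Rough systolic blood-pressure estimate from one PTT measurement."""
    return a * ptt_ms + b

# Hypothetical calibration pairs (PTT in ms, systolic BP in mmHg);
# blood pressure falls as transit time rises
ptt = np.array([180.0, 200.0, 220.0, 240.0])
sbp = np.array([135.0, 128.0, 121.0, 114.0])
a, b = fit_ptt_bp(ptt, sbp)
print(round(estimate_sbp(210.0, a, b), 1))   # -> 124.5
```

In the described method this coarse estimate would then be refined by the trained improved BP neural network with its parameter library.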
Optionally, extracting the pulse wave contained in the skin-tone image from the face video with the improved Eulerian magnification method comprises:
extracting the original remote-photoplethysmography (rPPG) signal from the face video;
constructing and training a Transformer-based neural network, formed mainly by stacking several Transformer modules longitudinally;
and feeding the original rPPG signal into the trained Transformer-based network for enhancement and denoising, obtaining a processed one-dimensional time signal, i.e. the pulse waveform.
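The core of each Transformer module is patch tokenisation followed by self-attention. A minimal numpy sketch of that data flow with a single head and random, untrained projections (dimensions and weights are illustrative assumptions; the patent's trained network is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def patchify(signal, patch_len):
    """One-dimensional patch mapping: split the rPPG trace into fixed-length tokens."""
    n_patch = len(signal) // patch_len
    return signal[: n_patch * patch_len].reshape(n_patch, patch_len)

def self_attention(tokens, d_model=16):
    """Scaled dot-product self-attention with random (untrained) projections."""
    d_in = tokens.shape[1]
    Wq, Wk, Wv = (rng.standard_normal((d_in, d_model)) for _ in range(3))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = Q @ K.T / np.sqrt(d_model)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)     # softmax over tokens
    return weights @ V        # every token attends to every other token globally

signal = rng.standard_normal(64)              # stand-in for a raw rPPG trace
tokens = patchify(signal, patch_len=8)        # 8 tokens of length 8
attended = self_attention(tokens)
```

This global token-to-token attention is what lets the network relate distant parts of the time signal; a signal-recovery projection would then map the attended tokens back to a one-dimensional pulse waveform.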
Optionally, obtaining the characteristic-parameter data of abnormal physiological-signal changes and the corresponding sample characteristic-parameter data of fatigue presence and degree comprises:
blood-oxygen-saturation characteristic-parameter extraction: performing time-domain analysis on the processed physiological-signal data to obtain the characteristic parameter of blood-oxygen saturation, namely the blood-oxygen standard deviation, computed as
$RSD = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(R_i - RAVG\right)^2}$
where RSD is the blood-oxygen standard deviation, RAVG the mean blood-oxygen saturation, N the total number of blood-oxygen samples collected, and $R_i$ the blood-oxygen saturation of the i-th sample;
and reflecting the driver's fatigue state according to the degree of fluctuation indicated by the blood-oxygen standard deviation.
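The blood-oxygen standard deviation described above is a plain standard deviation over the sample window (the population form, dividing by N, is assumed here, and the SpO2 readings are hypothetical):

```python
import numpy as np

def blood_oxygen_std(spo2_samples):
    """RSD = sqrt( (1/N) * sum_i (R_i - RAVG)^2 ) over one acquisition window."""
    r = np.asarray(spo2_samples, dtype=float)
    ravg = r.mean()                        # RAVG: mean blood-oxygen saturation
    return np.sqrt(np.mean((r - ravg) ** 2))

spo2 = [97.0, 96.5, 97.5, 96.0, 98.0]      # hypothetical SpO2 readings (%)
rsd = blood_oxygen_std(spo2)
```

A larger RSD means stronger fluctuation of blood-oxygen saturation within the window, which the method above takes as one indicator of the driver's fatigue state.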
Optionally, obtaining the characteristic-parameter data of abnormal physiological-signal changes and the corresponding sample characteristic-parameter data of fatigue presence and degree comprises:
pulse characteristic-parameter extraction: performing time-domain and frequency-domain analysis on the pulse-signal data in the processed physiological signals to obtain the characteristic parameters of the pulse signal: RR-interval mean, RR-interval standard deviation, low-frequency power and high-frequency power. Specifically:
in the pulse signal, each main-wave peak is detected by wavelet transform, and the time difference between two adjacent main-wave peaks is recorded as $RR_i$ (i = 1, 2, 3, …). The mean of the $RR_i$ is the RR-interval mean and their standard deviation is the RR-interval standard deviation:
$MEAN = \frac{1}{N}\sum_{i=1}^{N} RR_i$ and $SDNN = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(RR_i - MEAN\right)^2}$,
where MEAN is the RR-interval mean and SDNN the RR-interval standard deviation. A fast Fourier transform then converts the time-domain indices into frequency-domain indices, giving the low-frequency power LF and high-frequency power HF as the integrals of the power spectrum over the low-frequency band (conventionally 0.04-0.15 Hz) and the high-frequency band (conventionally 0.15-0.40 Hz), respectively.
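The four HRV parameters above can be sketched on a synthetic RR series as follows (the band limits follow the conventional HRV definitions, the 4 Hz resampling rate is an assumption, and the RR data are synthetic):

```python
import numpy as np

def hrv_features(rr_ms, fs_interp=4.0):
    """RR-interval mean/std plus LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) band power."""
    rr = np.asarray(rr_ms, dtype=float)
    mean, sdnn = rr.mean(), rr.std()
    # Resample the unevenly spaced RR series onto a uniform grid before the FFT
    t = np.cumsum(rr) / 1000.0
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs_interp)
    rr_uniform = np.interp(t_uniform, t, rr)
    spectrum = np.abs(np.fft.rfft(rr_uniform - rr_uniform.mean())) ** 2
    freqs = np.fft.rfftfreq(len(rr_uniform), d=1.0 / fs_interp)
    lf = spectrum[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = spectrum[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return mean, sdnn, lf, hf

# Synthetic RR intervals (ms) with a slow, LF-band oscillation (~0.125 Hz in time)
rr = 800 + 50 * np.sin(2 * np.pi * 0.1 * np.arange(120))
mean, sdnn, lf, hf = hrv_features(rr)
```

The slow modulation lands in the LF band, so LF dominates HF here; in the fatigue-recognition setting, shifts in these four values over successive periods are what feed the downstream model.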
Optionally, obtaining the characteristic-parameter data of abnormal physiological-signal changes and the corresponding sample characteristic-parameter data of fatigue presence and degree comprises:
EMG characteristic-parameter extraction: performing time-domain and frequency-domain analysis on the processed physiological-signal data to obtain the characteristic parameters of the electromyographic (EMG) signal: integrated EMG value, root-mean-square value, mean power frequency and median frequency. The time-domain parameters are computed as
$IEMG = \int_{t_1}^{t_2} \left|X(t)\right| dt$ and $RMS = \sqrt{\frac{1}{t_2 - t_1}\int_{t_1}^{t_2} X^2(t)\, dt}$,
where IEMG is the integrated EMG value, RMS the root-mean-square value, $t_1$ and $t_2$ the start and end times of EMG acquisition, and X(t) the EMG signal amplitude.
A fast Fourier transform of the surface EMG signal yields its power spectrum, which reflects the signal strength in different frequency ranges. The median frequency MF and mean power frequency MPF are commonly used muscle-fatigue indices:
$MPF = \frac{\int_0^{f_0} f \cdot PSD(f)\, df}{\int_0^{f_0} PSD(f)\, df}$ and $\int_0^{MF} PSD(f)\, df = \frac{1}{2}\int_0^{f_0} PSD(f)\, df$,
where PSD(f) is the power spectral density, f the EMG signal frequency, and $f_0$ the upper frequency limit (half the sampling frequency, equal to 500 Hz). A decrease in MPF and MF indicates that the subject's muscles are fatiguing.
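The four EMG indices above can be sketched on a sampled window as follows (the 1 kHz sampling rate matches the 500 Hz upper limit stated above; the test signal is a synthetic pure tone, not real EMG):

```python
import numpy as np

def emg_features(x, fs=1000.0):
    """IEMG, RMS, and the spectral fatigue indices MPF and MF of one EMG window."""
    x = np.asarray(x, dtype=float)
    dt = 1.0 / fs
    iemg = np.sum(np.abs(x)) * dt                    # discrete integral of |X(t)|
    rms = np.sqrt(np.mean(x ** 2))
    psd = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)            # 0 .. fs/2 (500 Hz here)
    mpf = np.sum(freqs * psd) / np.sum(psd)          # power-weighted mean frequency
    cum = np.cumsum(psd)
    mf = freqs[np.searchsorted(cum, cum[-1] / 2.0)]  # frequency splitting power in half
    return iemg, rms, mpf, mf

# Synthetic "EMG": one second of a single 120 Hz component at 1 kHz sampling
t = np.arange(0, 1.0, 1 / 1000.0)
x = np.sin(2 * np.pi * 120.0 * t)
iemg, rms, mpf, mf = emg_features(x)
```

For this pure tone, MPF and MF both sit at 120 Hz; on real surface EMG, a downward drift of these two values across windows is the muscle-fatigue cue the text describes.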
To achieve the above object, the present application further provides a method for predicting and recognising driver fatigue, comprising:
acquiring a time series of the driver's face videos for each period, and inputting the face-video signal of the driver to be predicted into the prediction and recognition model established by the method above, to obtain prediction and recognition results of whether, and to what degree, the driver is fatigued in different periods.
Optionally, obtaining the prediction and recognition results of whether, and to what degree, the driver is fatigued in different periods specifically comprises:
determining the cluster to which the feature vector of the driver's face-video signal belongs, computing the similarity between that feature vector and each vector in the cluster, and taking the fatigue-driving degree associated with the most similar vector as the fatigue and fatigue-degree recognition result for that driver.
To achieve the above object, the present application also provides a computer storage medium having stored thereon a computer program which, when executed by a machine, implements the steps of the method as described above.
The embodiment of the application has the following advantages:
the embodiment of the application provides a method for establishing a prediction and identification model of driver fatigue, which comprises the following steps: acquiring signals of face videos of each period in the driving process of a driver, and predicting physiological signals of each period of the driver based on a remote photoplethysmography; preprocessing the obtained physiological signals to obtain physiological signal abnormal change characteristic parameter data and corresponding sample characteristic parameter data of whether fatigue and fatigue degree exist; based on the physiological signal abnormal change characteristic parameter data and the corresponding sample characteristic parameter data of whether to fatigue and the fatigue degree, an ID3 algorithm is utilized to establish a cognitive parameter model for intelligently predicting whether to fatigue and the fatigue degree of a driver based on a face video signal and train so as to obtain the prediction and recognition model.
In this method, remote photoplethysmography is used to analyse and predict physiological signals of the driver such as heart rate, heart-rate variability and blood pressure; a cognitive parameter model that predicts fatigue and its degree from the face-video signal is established on the basis of the abnormal-change characteristic-parameter data and the corresponding labelled sample data; and the collected parameter results are computed and analysed to produce an intelligent prediction of whether, and how strongly, the driver is fatigued. The method thus provides a complete way to recognise the fatigued driving state quickly, simply, scientifically, qualitatively and quantitatively, and overcomes the poor practical performance of existing methods that judge fatigued driving from the driver's facial appearance.
Drawings
To illustrate the embodiments of the present application or the technical solutions of the prior art more clearly, the drawings needed in their description are briefly introduced below. The drawings described below are merely exemplary, and those skilled in the art can derive other embodiments from them without inventive effort.
Fig. 1 is a flowchart of a method for establishing a model for predicting and identifying fatigue of a driver according to an embodiment of the present application.
Detailed Description
Further advantages and effects of the present application will become apparent to those skilled in the art from the disclosure below. The application is described with reference to certain specific embodiments, but not all embodiments; all other embodiments obtained by those skilled in the art from these embodiments without inventive effort fall within the scope of the application.
In addition, the technical features of the different embodiments of the present application described below may be combined with each other as long as they do not conflict.
Based on remote photoplethysmography (rPPG), the application uses abnormal signal changes in the face video to intelligently predict whether the driver is fatigued and to what degree, thereby detecting driver fatigue from the face-video signal.
An embodiment of the present application provides a method for establishing a prediction and recognition model of driver fatigue. Referring to fig. 1, a flowchart of the method, it should be understood that the method may include additional blocks not shown and/or that shown blocks may be omitted; the scope of the application is not limited in this respect.
At step 101, face-video signals are acquired for each period of the driver's driving session, and the driver's physiological signals for each period are predicted based on remote photoplethysmography.
Specifically, remote photoplethysmography (rPPG) extracts physiological signals from face video: a camera or similar sensor captures the periodic change in skin colour caused by the cardiac cycle. rPPG can extract the blood volume pulse (BVP) signal and thereby measure cardiac-cycle-related physiological indices such as heart rate, respiratory rate and heart-rate variability. In recent years, rPPG-based measurement has developed rapidly, with large gains in accuracy and robustness. Human skin comprises three layers (epidermis, dermis and subcutaneous tissue), the deeper of which contain abundant capillaries. Light absorption by the skin comes mainly from melanin and from haemoglobin in the capillaries. Each heartbeat drives a surge of blood flow that periodically changes the capillary blood volume and its haemoglobin content, which in turn periodically changes how much light the skin absorbs. Although weak, this change can still be captured by a camera. The rPPG technique exploits this principle: it records the periodic colour change of a fixed skin region caused by varying light absorption, extracts the BVP signal from the captured colour change, and measures cardiac-cycle-related physiological indices from it.
The basic principle of the method is as follows: a camera captures video of a skin area (usually the face or an arm), the periodic color change of that area caused by the blood flow pulsation of the heartbeat is analyzed, the corresponding BVP signal is recovered, and the physiological indexes are measured. The existing rPPG-based physiological index measurement pipeline mainly comprises 3 steps: video acquisition and region-of-interest division, BVP signal extraction and enhancement, and physiological index measurement.
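As an illustrative sketch of this three-step pipeline (not the patent's implementation), the following assumes the region-of-interest division has already produced a mean green-channel trace of the face region and estimates heart rate from its dominant spectral peak; the 0.7–4 Hz pulse band and the synthetic 72 bpm trace are assumptions made for the example.

```python
import numpy as np

def estimate_heart_rate(green_trace, fs):
    """Estimate heart rate (bpm) from the mean green-channel trace of a
    skin region of interest: remove the DC skin-tone component, then take
    the dominant FFT peak inside a plausible pulse band (0.7-4 Hz)."""
    x = np.asarray(green_trace, dtype=float)
    x = x - x.mean()                              # drop average skin tone
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 4.0)        # 42-240 bpm
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_hz                         # Hz -> beats per minute

# Synthetic 10 s trace at 30 fps: a weak 1.2 Hz (72 bpm) pulse riding on
# the skin tone, plus camera noise.
rng = np.random.default_rng(0)
fs = 30.0
t = np.arange(0, 10, 1.0 / fs)
trace = 0.5 + 0.01 * np.sin(2 * np.pi * 1.2 * t) + 0.002 * rng.standard_normal(t.size)
hr = estimate_heart_rate(trace, fs)
```

A real pipeline would add face tracking, band-pass filtering and motion rejection before this spectral step.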
In some embodiments, the physiological signal comprises a heart rate, the predicting physiological signals for each period of the driver based on remote photoplethysmography comprises:
Extracting the pulse waves contained in the skin tone images extracted from the face video by an improved Eulerian magnification method, and analyzing and processing the extracted pulse wave signals to obtain pulse wave characteristic values and heart rate signals.
In some embodiments, the method further comprises a blood pressure measurement stage, in which blood pressure is measured by establishing a dual blood pressure prediction model. First, a linear relation between pulse transit time (Pulse Transit Time, PTT) and blood pressure is established, and a fuzzy (coarse) estimate of blood pressure is obtained from the related information of the measured target. Then, blood pressure is accurately predicted using a trained improved BP neural network with a parameter library.
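A minimal sketch of the first, coarse stage of this dual prediction model, assuming a per-subject linear calibration between pulse transit time and systolic blood pressure; the calibration pairs below are illustrative only, and the second stage (the trained improved BP neural network) is omitted:

```python
import numpy as np

# Hypothetical calibration pairs (PTT in ms, systolic BP in mmHg) for one
# subject; the inverse PTT-BP trend follows the literature, but these
# numbers are illustrative, not values from the patent.
ptt_ms = np.array([180.0, 200.0, 220.0, 240.0, 260.0])
sbp    = np.array([135.0, 128.0, 121.0, 114.0, 107.0])

a, b = np.polyfit(ptt_ms, sbp, 1)        # fit SBP ~ a * PTT + b

def coarse_sbp(ptt):
    """First-stage 'fuzzy' blood-pressure estimate from pulse transit
    time; the patent refines this with a trained BP neural network."""
    return a * ptt + b

estimate = coarse_sbp(230.0)
```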
In some embodiments, extracting the pulse waves contained in the skin tone images by applying an improved Eulerian magnification method to the skin tone images extracted from the face video specifically comprises the following:
1) An extraction module: for extracting an original rPPG signal (i.e., a blood volume pulse signal) from the face video; 2) a build-and-train module: for building and training a Transformer-based neural network consisting essentially of a plurality of Transformer modules stacked longitudinally, each Transformer module comprising: a one-dimensional patch mapping, which converts the dimensions of the original rPPG signal to generate tokens acceptable to the subsequent Transformer encoder; a Transformer encoder containing a multi-head self-attention module, which lets the network establish global links across the signal's time domain; and a signal recovery transformation, which converts the dimensions of the Transformer encoder's output and recovers it into a one-dimensional time signal, i.e., the pulse waveform; 3) a processing module: for inputting the original rPPG signal into the trained Transformer-based neural network for enhancement and noise reduction, obtaining a processed one-dimensional time signal, i.e., the pulse waveform.
In this embodiment, the Transformer network is applied to the direct processing of the rPPG signal; by establishing global relations over the rPPG signal in the time dimension through the Transformer, the prediction accuracy of the pulse waveform signal is improved. Meanwhile, the pulse Transformer network provided by this embodiment can be added as a module to rPPG-based tasks such as heart rate and other physiological signal prediction, providing a clean rPPG signal for downstream rPPG tasks and thereby improving the accuracy of downstream tasks such as heart rate prediction.
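The following shape-level sketch illustrates the patch-mapping → self-attention → signal-recovery structure described above in plain NumPy, with random (untrained) weights and a single attention head; the patch size and model width are arbitrary choices for the example, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def pulse_transformer_sketch(signal, patch=10, d_model=16):
    """Shape-level sketch of the pulse Transformer: 1-D patch mapping ->
    (single-head) self-attention -> linear recovery back to a
    one-dimensional time signal. Weights are random, i.e. untrained; a
    real model would learn them from noisy/clean rPPG pairs."""
    n = len(signal) // patch * patch
    tokens = signal[:n].reshape(-1, patch)            # 1-D patch mapping
    W_embed = rng.standard_normal((patch, d_model)) * 0.1
    x = tokens @ W_embed                              # token embeddings
    W_q, W_k, W_v = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
    att = softmax((x @ W_q) @ (x @ W_k).T / np.sqrt(d_model))
    x = att @ (x @ W_v)                               # global time-domain links
    W_out = rng.standard_normal((d_model, patch)) * 0.1
    return (x @ W_out).reshape(-1)                    # signal recovery

noisy = np.sin(2 * np.pi * 1.2 * np.arange(300) / 30) + 0.1 * rng.standard_normal(300)
denoised = pulse_transformer_sketch(noisy)
```

The attention matrix links every patch to every other patch, which is the "global relation in the time dimension" the text refers to.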
At step 102, the obtained physiological signals are preprocessed to obtain physiological signal abnormal change characteristic parameter data and corresponding sample characteristic parameter data indicating whether fatigue occurs and the fatigue degree.
In some embodiments, for the calibration of fatigue level, a fatigue scale based on an existing authoritative fatigue determination table is introduced to determine the fatigue level of the sample. The driving time sequence is marked as continuous driving for 1 hour, 2 hours, 3 hours, and so on; the driving fatigue degree is divided into four fatigue grades: wakefulness (grade 0), mild fatigue (grade 1), moderate fatigue (grade 2) and severe fatigue (grade 3).
In some embodiments, the feature parameter extraction method includes:
1) Blood oxygen saturation (SpO2) characteristic parameter extraction. When the human body is in a fatigue state, the SpO2 content decreases to some extent. The characteristic parameter of SpO2 is obtained by time-domain analysis of the processed physiological signal data: the blood oxygen standard deviation (RSD). The fluctuation degree of the obtained result accurately reflects the fatigue state of the driver. The calculation formula is as follows:

RSD = sqrt( (1/N) Σ_{i=1}^{N} (R_i − RAVG)² ),

wherein RAVG represents the average value of blood oxygen saturation, N represents the total number of blood oxygen data in the collected sample, and R_i represents the blood oxygen saturation value of the i-th sample;
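The RSD formula above can be evaluated directly; the SpO2 readings below are illustrative values, not data from the patent:

```python
import numpy as np

def spo2_rsd(spo2_samples):
    """Blood-oxygen standard deviation per the formula above:
    RSD = sqrt((1/N) * sum_i (R_i - RAVG)^2)."""
    r = np.asarray(spo2_samples, dtype=float)
    return float(np.sqrt(np.mean((r - r.mean()) ** 2)))

rsd = spo2_rsd([97.0, 96.5, 97.2, 95.8, 96.1])  # illustrative SpO2 (%) readings
```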
2) Pulse characteristic parameter extraction. Time-domain and frequency-domain analysis of the pulse signal data in the processed physiological signal data yields the characteristic parameters of the pulse signal: the RR-interval mean (MEAN), RR-interval standard deviation (SDNN), low-frequency power (LF) and high-frequency power (HF). In the pulse signal, the peak of each main wave is detected by wavelet transform, and the time difference between two adjacent main-wave peaks is obtained and recorded as RR_i (i = 1, 2, 3, …, N). The average value of the RR_i is the RR-interval mean and the standard deviation of the RR_i is the RR-interval standard deviation; a fast Fourier transform (FFT) converts the time-domain indexes into frequency-domain indexes, from which LF and HF are obtained. The calculation formulas are as follows:

MEAN = (1/N) Σ_{i=1}^{N} RR_i and SDNN = sqrt( (1/N) Σ_{i=1}^{N} (RR_i − MEAN)² )
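A sketch of these pulse-feature computations, assuming peak detection has already produced the RR-interval series; the 4 Hz resampling rate and the LF/HF band edges (0.04–0.15 Hz and 0.15–0.4 Hz) are conventional HRV choices, not values stated in the patent:

```python
import numpy as np

def pulse_features(rr_s, fs_resample=4.0):
    """Time- and frequency-domain pulse features named above: RR-interval
    mean (MEAN), standard deviation (SDNN), and low/high-frequency power
    (LF: 0.04-0.15 Hz, HF: 0.15-0.4 Hz) from an FFT of the evenly
    resampled RR tachogram."""
    rr = np.asarray(rr_s, dtype=float)
    mean, sdnn = rr.mean(), rr.std()
    t = np.cumsum(rr)                                  # beat times
    t_even = np.arange(t[0], t[-1], 1.0 / fs_resample)
    tach = np.interp(t_even, t, rr) - mean             # even tachogram
    power = np.abs(np.fft.rfft(tach)) ** 2
    freqs = np.fft.rfftfreq(len(tach), d=1.0 / fs_resample)
    lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return mean, sdnn, lf, hf

# Synthetic RR series: 0.8 s base interval with 0.25 Hz (respiratory, HF-band) modulation
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * np.cumsum(np.full(240, 0.8)))
mean, sdnn, lf, hf = pulse_features(rr)
```

Clinical HRV code would normally use a Welch or autoregressive PSD estimate instead of the raw FFT.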
3) Myoelectricity (EMG) characteristic parameter extraction. It has been found that as human fatigue increases, the magnitude of the EMG signal increases and its frequency decreases, so the fatigue state of the driver can be detected by analyzing the EMG signal. Characteristic parameters of the electromyographic signals are obtained by time-domain and frequency-domain analysis of the processed physiological signal data: the integrated myoelectric value (IEMG), root mean square value (RMS), average power frequency (MPF) and median frequency (MF). IEMG refers to the total amount of motor unit discharge involved in muscle activity over a period of time, and typically increases in magnitude as the degree of fatigue increases. RMS reflects the degree of recruitment of motor units and the synchrony of their discharge frequency. The calculation formulas are respectively as follows:

IEMG = ∫_{t1}^{t2} |X(t)| dt and RMS = sqrt( (1/(t2 − t1)) ∫_{t1}^{t2} X(t)² dt )
wherein t1 is the starting time of the electromyographic signal acquisition, t2 is the finishing time of the electromyographic signal acquisition, and X(t) is the magnitude of the electromyographic signal. Performing a fast Fourier transform (FFT) on the surface EMG signal yields its power spectrum, reflecting the intensity of the EMG signal in different frequency ranges. MF and MPF are commonly used indexes for evaluating muscle fatigue. The calculation formulas are as follows:

MPF = ∫_0^{f0} f · PSD(f) df / ∫_0^{f0} PSD(f) df and ∫_0^{MF} PSD(f) df = (1/2) ∫_0^{f0} PSD(f) df
wherein PSD(f) is the power spectral density, MF is the median frequency, f is the electromyographic signal frequency, and f0 is the upper frequency limit, half the sampling frequency, equal to 500 Hz. A decrease in MPF and MF indicates that muscle fatigue is present in the subject.
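The four EMG indexes can be computed from a sampled signal as follows; the discretised integrals and the 1 kHz sampling rate (giving the 500 Hz upper limit f0 mentioned above) are assumptions of this sketch, and the pure 80 Hz test tone is illustrative:

```python
import numpy as np

def emg_features(x, fs=1000.0):
    """EMG features defined above, on a sampled signal X(t):
    IEMG = integral of |X(t)| dt (rectangle rule),
    RMS  = sqrt(mean of X(t)^2),
    MPF  = sum(f * PSD) / sum(PSD) (spectral centroid),
    MF   = frequency splitting the PSD into equal-power halves,
    with upper limit f0 = fs/2 (500 Hz here, as in the text)."""
    x = np.asarray(x, dtype=float)
    dt = 1.0 / fs
    iemg = np.sum(np.abs(x)) * dt
    rms = np.sqrt(np.mean(x ** 2))
    psd = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    mpf = float((freqs * psd).sum() / psd.sum())
    cum = np.cumsum(psd)
    mf = float(freqs[np.searchsorted(cum, cum[-1] / 2.0)])
    return iemg, rms, mpf, mf

t = np.arange(0, 1.0, 1.0 / 1000.0)
emg = np.sin(2 * np.pi * 80.0 * t)          # pure 80 Hz tone for checking
iemg, rms, mpf, mf = emg_features(emg)
```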
At step 103, based on the physiological signal abnormal change characteristic parameter data and the corresponding sample characteristic parameter data of whether fatigue occurs and the fatigue degree, an ID3 algorithm is used to build a cognitive parameter model that intelligently predicts whether the driver is fatigued and the fatigue degree from the face video signal, and the model is trained to obtain the prediction and recognition model.
Specifically, the ID3 algorithm is a greedy algorithm for constructing a decision tree. Its core idea is to measure attribute selection by information gain, splitting on the attribute whose split yields the largest information gain. The algorithm traverses the possible decision space with a top-down greedy search.
Specifically, let the training data set be D, with |D| denoting its sample size, i.e., the number of samples. Suppose there are K classes C_k, k = 1, 2, …, K, where |C_k| is the number of samples belonging to class C_k, so that Σ_{k=1}^{K} |C_k| = |D|. Suppose feature A has V different values {a_1, a_2, …, a_V}; according to feature A, D is divided into V subsets D_1, D_2, …, D_V, where |D_i| is the number of samples of D_i and Σ_{i=1}^{V} |D_i| = |D|. Let D_ik denote the set of samples of D_i belonging to class C_k, i.e., D_ik = D_i ∩ C_k, with |D_ik| the number of samples of D_ik. The information gain is then calculated as follows:
(1) Calculate the empirical entropy H(D) of the data set D:

H(D) = − Σ_{k=1}^{K} (|C_k| / |D|) log₂ (|C_k| / |D|)
(2) Calculate the empirical conditional entropy of feature A with respect to the data set D:

H(D|A) = Σ_{i=1}^{V} (|D_i| / |D|) H(D_i) = − Σ_{i=1}^{V} (|D_i| / |D|) Σ_{k=1}^{K} (|D_ik| / |D_i|) log₂ (|D_ik| / |D_i|)

The information gain of feature A is g(D, A) = H(D) − H(D|A), and the attribute with the largest gain is chosen for splitting.
Assume that a training data set is given: D = {(x_1, y_1), (x_2, y_2), …, (x_N, y_N)}, where x_i ∈ Rⁿ is the input example, i.e., the feature vector of the i-th sample, n is the number of features, N is the number of samples, i = 1, 2, …, N, and y_i ∈ {1, 2, …, K} is the class label.
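The entropy and information-gain computations above can be sketched as follows; the toy feature/label data are illustrative only:

```python
import math
from collections import Counter

def entropy(labels):
    """Empirical entropy H(D) = -sum_k (|C_k|/|D|) * log2(|C_k|/|D|)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """g(D, A) = H(D) - H(D|A): the ID3 splitting criterion described above."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        cond += len(subset) / n * entropy(subset)   # weighted H(D_i)
    return entropy(labels) - cond

# Toy data: feature = discretised SpO2 trend, label = fatigue grade
feature = ["low", "low", "high", "high"]
label = [1, 1, 0, 0]
gain = information_gain(feature, label)
```

ID3 would evaluate this gain for every candidate feature and split on the largest.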
The embodiment of the application provides a method for predicting and identifying fatigue of a driver, which comprises the following steps:
Acquiring a time sequence of face videos of the driver in each period, inputting the face video signals of the driver to be predicted into the prediction and recognition model established by the above method for establishing a prediction and recognition model of driver fatigue, and obtaining prediction and recognition results of whether the driver is fatigued and of the fatigue degree in different periods.
In some embodiments, the specific method for determining the fatigue driving recognition result is as follows: the clustering category of the feature vector of the face video signal of the driver to be processed is obtained, the similarity between that feature vector and each vector in the clustering category is calculated, and the fatigue driving degree corresponding to the vector with the highest similarity is taken as the fatigue and fatigue-degree recognition result of the driver to be processed. For example, the higher the score, the more fatigued the driver: with 60 as the fatigue threshold, a score of 60-80 indicates mild fatigue and a score of 80-100 indicates severe fatigue.
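A sketch of this similarity-based lookup, using cosine similarity and the score thresholds above; the stored cluster vectors and scores are illustrative placeholders, not data from the patent:

```python
import numpy as np

def recognize_fatigue(feature_vec, cluster_vectors, cluster_scores):
    """Take the fatigue score of the most similar stored vector (cosine
    similarity), then map the score to a grade with the thresholds above:
    < 60 awake, 60-80 mild fatigue, 80-100 severe fatigue."""
    v = np.asarray(feature_vec, dtype=float)
    sims = [np.dot(v, np.asarray(c)) / (np.linalg.norm(v) * np.linalg.norm(c))
            for c in cluster_vectors]
    score = cluster_scores[int(np.argmax(sims))]
    if score < 60:
        return score, "awake"
    return score, "mild fatigue" if score < 80 else "severe fatigue"

# Placeholder cluster: two stored feature vectors with fatigue scores 85 and 40
score, grade = recognize_fatigue([1.0, 0.1], [[1.0, 0.0], [0.0, 1.0]], [85, 40])
```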
Reference is made to the foregoing method embodiments for specific implementation methods, and details are not repeated here.
According to the method, physiological signals such as the heart rate, heart rate variability and blood pressure of the driver are analyzed and predicted using remote photoplethysmography. A cognitive parameter model that predicts from the face video signal whether the driver is fatigued and the fatigue degree is established based on the abnormal change characteristic parameter data of the physiological signals and the corresponding sample characteristic parameter data, and the acquired parameters are calculated and analyzed, yielding an intelligent prediction and recognition result of whether the driver is fatigued and the fatigue degree. The method achieves rapid, simple, scientific, qualitative and quantitative recognition of whether the driver is in a fatigue driving state, and solves the problem of the poor practical effect of existing methods that judge driver fatigue from facial appearance alone.
The present application may be a method, apparatus, system, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for performing various aspects of the present application.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disk read-only memory (CD-ROM), digital versatile disks (DVD), memory sticks, floppy disks, mechanical encoding devices such as punch cards or in-groove raised structures having instructions stored thereon, and any suitable combination of the foregoing. Computer readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., optical pulses through fiber optic cables), or electrical signals transmitted through wires.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present application may be assembly instructions, instruction set architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present application are implemented by personalizing electronic circuitry, such as programmable logic circuitry, field programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), with state information of computer readable program instructions, and the electronic circuitry can execute the computer readable program instructions.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Note that all features disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is only one example of a generic set of equivalent or similar features. Where the words "further", "preferably", "still further" or "more preferably" are used, the description that follows builds on the foregoing embodiment, and the content following the word, combined with the foregoing embodiment, constitutes a complete further embodiment. Several such "further", "preferably", "still further" or "more preferably" passages following the same embodiment may be combined arbitrarily to form additional embodiments.
While the application has been described in detail in the foregoing general description and specific examples, it will be apparent to those skilled in the art that modifications and improvements can be made thereto. Accordingly, such modifications or improvements may be made without departing from the spirit of the application and are intended to be within the scope of the application as claimed.

Claims (10)

1. A method for establishing a prediction and recognition model of driver fatigue, comprising:
acquiring signals of face videos of each period in the driving process of a driver, and predicting physiological signals of each period of the driver based on a remote photoplethysmography;
preprocessing the obtained physiological signals to obtain physiological signal abnormal change characteristic parameter data and corresponding sample characteristic parameter data of whether fatigue occurs and the fatigue degree;
based on the physiological signal abnormal change characteristic parameter data and the corresponding sample characteristic parameter data of whether to fatigue and the fatigue degree, an ID3 algorithm is utilized to establish a cognitive parameter model for intelligently predicting whether to fatigue and the fatigue degree of a driver based on a face video signal and train so as to obtain the prediction and recognition model.
2. The method for modeling prediction and identification of driver fatigue according to claim 1, wherein the physiological signal includes heart rate, and the predicting physiological signals of the driver for each period based on remote photoplethysmography includes:
extracting the pulse waves contained in the skin tone images extracted from the face video by an improved Eulerian magnification method, and analyzing and processing the extracted pulse wave signals to obtain pulse wave characteristic values and heart rate signals.
3. The method for constructing a model for predicting and identifying fatigue of a driver according to claim 2, wherein the physiological signal includes blood pressure, and the predicting physiological signals of the driver for each period based on remote photoplethysmography includes:
establishing a linear relation between pulse waves and blood pressure through pulse wave conduction time, and realizing fuzzy estimation of the blood pressure according to the related information of the measured target;
and then, the blood pressure is predicted by using the trained improved BP neural network with the parameter library.
4. The driver fatigue prediction and recognition model establishing method according to claim 3, wherein the extracting of the pulse waves contained in the skin tone images by the improved Eulerian magnification method from the skin tone images extracted from the face video comprises:
extracting an original remote photoplethysmography signal from the face video;
constructing and training a Transformer-based neural network, wherein the neural network is mainly formed by longitudinally stacking a plurality of Transformer modules;
and inputting the original remote photoplethysmography signal into the trained Transformer-based neural network for enhancement and noise reduction processing to obtain a processed one-dimensional time signal, i.e., a pulse waveform.
5. The method for constructing a model for predicting and identifying fatigue of a driver according to claim 1, wherein the obtaining of the physiological signal abnormality variation characteristic parameter data and the corresponding sample characteristic parameter data of whether to fatigue and the degree of fatigue includes:
and (3) extracting blood oxygen saturation characteristic parameters: performing time domain analysis on the processed physiological signal data to obtain characteristic parameters of blood oxygen saturation: standard deviation of blood oxygen; wherein the method comprises the steps of
The formula for obtaining the blood oxygen standard deviation is: RSD = sqrt( (1/N) Σ_{i=1}^{N} (R_i − RAVG)² ), wherein RSD represents the blood oxygen standard deviation, RAVG represents the blood oxygen saturation average value, N represents the total number of blood oxygen data in the collected sample, and R_i represents the blood oxygen saturation value of the i-th sample;
and reflecting the fatigue state of the driver according to the fluctuation degree of the obtained blood oxygen standard deviation result.
6. The method for constructing a model for predicting and identifying driver fatigue according to claim 5, wherein the obtaining of physiological signal abnormality variation characteristic parameter data and corresponding sample characteristic parameter data of whether to fatigue and the degree of fatigue includes:
pulse characteristic parameter extraction: performing time domain analysis and frequency domain analysis on the pulse signal data in the processed physiological signals to obtain characteristic parameters of the pulse signals: RR interval mean, RR interval standard deviation, low frequency power and high frequency power; the method specifically comprises the following steps:
in the pulse signal, the peak of each main wave is detected by wavelet transform, and the time difference between two adjacent main-wave peaks is obtained and recorded as RR_i (i = 1, 2, 3, …, N); the average value of the RR_i is the RR-interval mean and the standard deviation of the RR_i is the RR-interval standard deviation, and the fast Fourier transform is used to convert the time-domain indexes into frequency-domain indexes, obtaining the low-frequency power LF and the high-frequency power HF; the calculation formulas are as follows:

MEAN = (1/N) Σ_{i=1}^{N} RR_i and SDNN = sqrt( (1/N) Σ_{i=1}^{N} (RR_i − MEAN)² ),

wherein MEAN represents the RR-interval mean and SDNN represents the RR-interval standard deviation.
7. The method for constructing a model for predicting and identifying driver fatigue according to claim 6, wherein the obtaining of physiological signal abnormality variation characteristic parameter data and corresponding sample characteristic parameter data of whether to fatigue and the degree of fatigue includes:
myoelectricity characteristic parameter extraction: obtaining characteristic parameters of the electromyographic signals by performing time-domain analysis and frequency-domain analysis on the processed physiological signal data: the integrated myoelectric value, root mean square value, average power frequency and median frequency; wherein the calculation formulas are as follows:

IEMG = ∫_{t1}^{t2} |X(t)| dt and RMS = sqrt( (1/(t2 − t1)) ∫_{t1}^{t2} X(t)² dt ),

wherein IEMG represents the integrated myoelectric value, RMS represents the root mean square value, t1 is the starting time of the electromyographic signal acquisition, t2 is the finishing time of the electromyographic signal acquisition, and X(t) is the magnitude of the electromyographic signal;
performing a fast Fourier transform on the surface electromyographic signal to obtain its power spectrum, reflecting the intensity of the surface electromyographic signal in different frequency ranges, wherein the median frequency MF and the average power frequency MPF are commonly used evaluation indexes of muscle fatigue; the calculation formulas are as follows:

MPF = ∫_0^{f0} f · PSD(f) df / ∫_0^{f0} PSD(f) df and ∫_0^{MF} PSD(f) df = (1/2) ∫_0^{f0} PSD(f) df,

wherein PSD(f) is the power spectral density, MF is the median frequency, f is the electromyographic signal frequency, and f0 is the upper frequency limit, half the sampling frequency, equal to 500 Hz; a decrease in MPF and MF indicates that muscle fatigue is present in the subject.
8. A method for predicting and identifying driver fatigue, comprising:
obtaining a time sequence of face videos of a driver in each period, inputting face video signals of the driver to be predicted into a prediction and recognition model established by the method for establishing the prediction and recognition model of the driver fatigue according to any one of claims 1 to 7, and obtaining prediction and recognition results of whether the driver is tired and the fatigue degree in different periods.
9. The method for predicting and identifying fatigue of a driver according to claim 8, wherein the obtaining the prediction and identification result of whether the driver is tired and the fatigue degree in different periods of time specifically includes:
the method comprises the steps of obtaining a clustering category of a feature vector of a face video signal of a driver to be processed, calculating similarity between the feature vector of the face video signal of the driver to be processed and each vector in the clustering category, and taking fatigue driving degree corresponding to the vector with the highest similarity as a fatigue and fatigue degree recognition result of the driver to be processed.
10. A computer storage medium having stored thereon a computer program, which when executed by a machine performs the steps of the method according to any of claims 1 to 9.
CN202310973449.5A 2023-08-03 2023-08-03 Method for establishing prediction and identification model of driver fatigue and application thereof Pending CN117017297A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310973449.5A CN117017297A (en) 2023-08-03 2023-08-03 Method for establishing prediction and identification model of driver fatigue and application thereof


Publications (1)

Publication Number Publication Date
CN117017297A true CN117017297A (en) 2023-11-10

Family

ID=88640625

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310973449.5A Pending CN117017297A (en) 2023-08-03 2023-08-03 Method for establishing prediction and identification model of driver fatigue and application thereof

Country Status (1)

Country Link
CN (1) CN117017297A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117316458A (en) * 2023-11-27 2023-12-29 吾征智能技术(北京)有限公司 Disease risk assessment method, device, storage medium and electronic equipment


Similar Documents

Publication Publication Date Title
CN109077715B (en) Electrocardiosignal automatic classification method based on single lead
Hughes et al. Markov models for automated ECG interval analysis
Wang et al. Arrhythmia classification algorithm based on multi-head self-attention mechanism
CN109961017A (en) A kind of cardiechema signals classification method based on convolution loop neural network
Altan et al. A new approach to early diagnosis of congestive heart failure disease by using Hilbert–Huang transform
CN110123367B (en) Computer device, heart sound recognition method, model training device, and storage medium
CN111310570B (en) Electroencephalogram signal emotion recognition method and system based on VMD and WPD
CN112603332A (en) Emotion cognition method based on electroencephalogram signal characteristic analysis
Gupta et al. Higher order derivative-based integrated model for cuff-less blood pressure estimation and stratification using PPG signals
CN117017297A (en) Method for establishing prediction and identification model of driver fatigue and application thereof
Jiménez-González et al. Blind extraction of fetal and maternal components from the abdominal electrocardiogram: An ICA implementation for low-dimensional recordings
Zhang et al. Hybrid feature fusion for classification optimization of short ECG segment in IoT based intelligent healthcare system
CN115363586A (en) Psychological stress grade assessment system and method based on pulse wave signals
CN114343635A (en) Variable phase-splitting amplitude coupling-based emotion recognition method and device
Aziz et al. Pulse plethysmograph signal analysis method for classification of heart diseases using novel local spectral ternary patterns
CN115089179A (en) Psychological emotion insights analysis method and system
Carvalho et al. Impact of the acquisition time on ECG compression-based biometric identification systems
CN103263272A (en) Single-edge multiple-spectrum dynamic spectrum data extraction method
Athaya et al. An efficient fingertip photoplethysmographic signal artifact detection method: A machine learning approach
CN114145725B (en) PPG sampling rate estimation method based on noninvasive continuous blood pressure measurement
CN113940638B (en) Pulse wave signal identification and classification method based on frequency domain dual-feature fusion
Biran et al. Automatic qrs detection and segmentation using short time fourier transform and feature fusion
Vimala et al. Classification of cardiac vascular disease from ECG signals for enhancing modern health care scenario
Rakshit et al. Wavelet Sub-bands features-based ECG signal quality assessment scheme for computer-aided monitoring system
CN115702778A (en) Sleep stage staging method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination