CN111310673B - Sleepiness prediction method, device and storage medium - Google Patents
- Publication number
- CN111310673B (application CN202010104790.3A)
- Authority
- CN
- China
- Prior art keywords
- sleepiness
- calculating
- clustering
- value
- mean
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Artificial Intelligence (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Multimedia (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Computational Linguistics (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Human Computer Interaction (AREA)
- Psychology (AREA)
- Surgery (AREA)
- Hospice & Palliative Care (AREA)
- Social Psychology (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Psychiatry (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Educational Technology (AREA)
- Developmental Disabilities (AREA)
- Child & Adolescent Psychology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The invention provides a sleepiness prediction method comprising the following steps. Step S100: acquire an image of a detection target and perform face detection and skin identification, including detecting the target face region, extracting and clustering the related skin regions, and simultaneously measuring the eye aspect ratio. Step S200: process the clustering result of each skin region separately; extract an rPPG signal by comparing each clustering result with the RGB mean of the corresponding frame, and finally estimate the target heart rate by a fast Fourier transform method. Step S300: select the best signals detected across the different clustering results to estimate the target sleepiness level. The invention objectively and truly reflects the sleepiness of the driver.
Description
Technical Field
The invention relates to the technical field of biological signal processing, and in particular to a sleepiness prediction method and device for drivers.
Background
In the prior art, sleepiness is judged and an early warning is given by segmenting monitoring images and comparing facial features, including the change over time of the positions of key points such as the mouth and eyes; alternatively, Doppler radar and complex signal processing are used to obtain fatigue data of the tested person, such as emotional agitation and blinking frequency and duration, to judge whether the person is drowsy. Judging sleepiness only from the change over time of key-point positions such as the mouth and eyes relies on a single reference element, cannot objectively and truly reflect the sleepiness of the tested person, and is technically crude, inaccurate, and slow to respond.
The present invention involves the term rPPG (remote photoplethysmography).
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides a sleepiness prediction method and device, applied to early warning of driver fatigue, which objectively and truly reflect the sleepiness of the driver by monitoring the driver's physiological response characteristics.
The embodiment of the invention adopts the technical scheme that:
a sleepiness prediction method comprises the following steps:
step S100, acquiring an image of a detection target, and performing face detection and skin identification, wherein the method comprises the following steps: detecting a target face region and extracting and clustering related skin regions; simultaneously measuring and calculating the eye aspect ratio;
step S200, processing the clustering result of each skin area respectively; extracting and calculating an rPPG signal by comparing the clustering result with the RGB average value corresponding to each frame, and finally estimating the target heart rhythm by a fast Fourier transform method;
and step S300, selecting the optimal signals detected by different clustering results to estimate the target sleepiness level.
Further, the step S100 includes:
step S101, embedding an image into a long vector, using the long vector as the input of a neural network, and training the neural network to identify a face through a feed-forward algorithm;
step S102, inputting the image into the trained neural network, outputting face coordinates and eye coordinates, and then calculating the aspect ratio of the eyes;
in step S103, the skin image pixels of the captured face area are classified.
Further, the step S200 includes:
step S201, calculating an RGB average value of each classification of the clustered image pixels;
step S202, calculating an rPPG signal and estimating a heart rate, wherein the specific process is as follows:
In the first step, for any cluster k at any time t an RGB mean μ_k can be obtained; μ_k is then compared with the RGB mean of the corresponding frame and converted into a signal function PV(t, k) of t, i.e., the rPPG signal.
In the second step, PV(t, k) is de-meaned over time: PV_mean(t, k) = PV(t, k) − DC, where DC is the average amplitude of PV.
In the third step, a fast Fourier transform is applied to PV_mean(t, k), yielding the frequency-domain signal FPV(t).
In the fourth step, the number of peaks of FPV(t) within 1 minute is taken as the heart rate value HR(t), together with the standard deviation SDNN(t) of the heart beat intervals within several minutes.
Further, in step S300 the estimation indexes include the heart rate HR(t), the standard deviation SDNN(t) of the heart beat interval within several minutes, and the eye aspect ratio; step S300 specifically comprises:
Step S301, arranging the amplitudes of FPV(t) of each cluster in descending order as Peak_1, Peak_2, …, Peak_n, calculating the signal-to-noise ratio SNR_i, and selecting the first N FPV(t) with the highest signal-to-noise ratio within 1 minute; the corresponding optimal HR(t)_best and SDNN(t)_best are calculated as sleepiness estimation indexes;
Step S302, judging the sleepiness of the target based on HR(t)_best, SDNN(t)_best and the Eye aspect ratio Eye Index(t) over the past several minutes;
mean(Ind) and slope(Ind) are compared against the corresponding recognition benchmarks as follows:
when mean(Ind) < threshold_mean and abs(slope(Ind)) > threshold_slope, the target is judged to be sleepy.
Further, step S101 specifically comprises:
(a1) embedding the image as a long vector:
Let X = {x_1, x_2, …, x_i, …, x_n} be a random vector whose observations represent the face image data;
calculating the mean value μ and the covariance matrix S;
calculating the eigenvalues λ_i and eigenvectors v_i of the covariance matrix S: S v_i = λ_i v_i, i = 1, 2, …, n;
sorting the eigenvectors and taking those corresponding to the largest eigenvalues:
y = W^T (x − μ), where W = (v_1, v_2, …, v_k)   (3)
x = W y + μ, where W = (v_1, v_2, …, v_k)   (4)
and orthogonally processing the vectors:
X^T X v_i = λ_i v_i   (5)
X X^T (X v_i) = λ_i (X v_i)   (6)
(a2) the long vector is used as the input of the neural network, which is trained to recognize faces through a feed-forward algorithm, the training objective function being as follows:
wherein m is the number of samples, W is the weight matrix of the neural network, b is the offset of each layer, h is the activation function, x is the vector value obtained in step (a1), y is the label value (0, 1), n_l is the number of layers of the neural network, s_l is the number of neurons, and λ is the regularization coefficient;
further, step S103 specifically includes:
set the RGB value of each captured pixel to xab,xab∈R3Determining a value of N, i.e. it is desired to put the data set { x }ab,xab∈R3Obtaining N sets through clustering; in the N clusters, the initial RGB mean value of any one cluster is set as miIn the category oft represents time, and m is updated by the following methodiAnd finally obtain the category of each pixel
When it is satisfied withWhen it is time, stopping clustering and comparing the timeAs final class of this type of pixelWhere threshold is the criterion for stopping clustering.
The embodiment of the present invention further provides a sleepiness prediction apparatus, including:
a memory storing a computer program;
a processor for executing the computer program to implement the steps of the sleepiness prediction method as described above.
The embodiment of the present invention further provides a computer storage medium, in which a computer program is stored, and the computer program is used to implement the steps of the sleepiness prediction method when being executed by a processor.
The invention has the advantages that:
1) the heart rate change of the driver is analyzed by monitoring the physiological response characteristics of the driver and utilizing data of facial characteristics, eye signals, head motility and the like of the driver, the sleepiness of the driver is objectively and truly reflected by human physiological indexes, and the prediction method is objective and accurate and has extremely small error rate.
2) When the method is applied, the whole monitoring process is non-contact, and interference to a monitored person is avoided.
Drawings
FIG. 1 is a flow chart of a sleepiness prediction method of the present invention.
Detailed Description
The invention is further illustrated by the following specific figures and examples.
The embodiment of the invention provides a sleepiness prediction method, mainly applied to early warning of driver fatigue, which objectively and truly reflects the sleepiness of the driver by monitoring the driver's physiological response characteristics, and which comprises the following steps:
step S100, acquiring an image of a detection target, and performing face detection and skin identification, wherein the method comprises the following steps: detecting a target face region and extracting and clustering related skin regions; simultaneously measuring and calculating the eye aspect ratio; the method comprises the following specific steps:
step S101, embedding an image into a long vector using an OpenCV facial-image embedding technique, taking the long vector as the input of a neural network, and training the neural network to recognize faces through a feed-forward algorithm;
(a1) embedding the image as a long vector:
Let X = {x_1, x_2, …, x_i, …, x_n} be a random vector whose observations represent the face image data;
calculating the mean value μ and the covariance matrix S;
calculating the eigenvalues λ_i and eigenvectors v_i of the covariance matrix S: S v_i = λ_i v_i, i = 1, 2, …, n;
sorting the eigenvectors and taking those corresponding to the largest eigenvalues:
y = W^T (x − μ), where W = (v_1, v_2, …, v_k)   (3)
x = W y + μ, where W = (v_1, v_2, …, v_k)   (4)
and orthogonally processing the vectors:
X^T X v_i = λ_i v_i   (5)
X X^T (X v_i) = λ_i (X v_i)   (6)
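Step (a1) above (mean, covariance, eigendecomposition, projection y = W^T(x − μ)) amounts to an eigenface-style PCA embedding. A minimal NumPy sketch follows; the function names are illustrative and this is not the patented implementation:

```python
import numpy as np

def pca_embed(images, k):
    """Embed flattened face images as k-dimensional vectors (eigenface-style PCA).

    images: (n_samples, n_pixels) array of flattened faces.
    Returns (embeddings, mean, W) where W holds the top-k eigenvectors.
    """
    mu = images.mean(axis=0)                 # mean value mu
    centered = images - mu
    S = np.cov(centered, rowvar=False)       # covariance matrix S
    eigvals, eigvecs = np.linalg.eigh(S)     # S v_i = lambda_i v_i
    order = np.argsort(eigvals)[::-1]        # sort by descending eigenvalue
    W = eigvecs[:, order[:k]]                # top-k eigenvectors
    y = centered @ W                         # y = W^T (x - mu), eq. (3)
    return y, mu, W

def pca_reconstruct(y, mu, W):
    """Approximate reconstruction x = W y + mu, eq. (4)."""
    return y @ W.T + mu
```

With k equal to the full dimensionality the reconstruction is exact up to floating-point error, which makes the pair of equations (3)/(4) easy to sanity-check.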
(a2) the long vector is used as the input of the neural network, which is trained to recognize faces through a feed-forward algorithm, the training objective function being as follows:
wherein m is the number of samples, W is the weight matrix of the neural network, b is the offset of each layer, h is the activation function, x is the vector value obtained in step (a1), y is the label value (0, 1), n_l is the number of layers of the neural network, s_l is the number of neurons, and λ is the regularization coefficient;
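The objective-function formula itself is not reproduced in the text; given the symbols listed (m samples, weights W, offsets b, activation h, regularization coefficient λ), a standard regularized squared-error cost of the assumed form is sketched below. The exact form in the patent may differ:

```python
import numpy as np

def nn_cost(predict, X, Y, weights, lam):
    """Assumed regularized training cost for a feed-forward network:
    mean over m samples of 0.5 * ||h(x) - y||^2, plus an L2 penalty
    lam/2 * sum of squared weights over all layers.

    predict: callable h(x) returning the network output for one sample.
    weights: list of per-layer weight matrices W.
    """
    m = len(X)
    data_term = sum(0.5 * np.sum((predict(x) - y) ** 2)
                    for x, y in zip(X, Y)) / m
    reg_term = 0.5 * lam * sum(np.sum(W ** 2) for W in weights)
    return data_term + reg_term
```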
step S102, inputting the image into the trained neural network and outputting the face coordinates and eye coordinates (taking the left-eye coordinates as an example), then calculating the Eye aspect ratio Eye Index;
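The patent does not reproduce the Eye Index formula. As a stand-in, the sketch below uses the widely used six-landmark eye-aspect-ratio definition EAR = (‖p2 − p6‖ + ‖p3 − p5‖) / (2‖p1 − p4‖), which drops toward zero as the eye closes; this is an assumption, not the patented formula:

```python
import numpy as np

def eye_aspect_ratio(eye):
    """Eye aspect ratio from six (x, y) landmarks ordered around the eye.

    eye: array-like of shape (6, 2); p1 and p4 span the eye horizontally,
    (p2, p6) and (p3, p5) are the two vertical landmark pairs.
    A small value indicates a closed eye.
    """
    p1, p2, p3, p4, p5, p6 = np.asarray(eye, dtype=float)
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)
```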
step S103, classifying the skin image pixels of the captured face area;
set the RGB value of each captured pixel to xab,xab∈R3Determining a value of N, i.e. it is desired to put the data set { x }ab,xab∈R3Obtaining N sets through clustering; in the N clusters, the initial RGB mean value of any one cluster is set as miIn the category oft represents time, and m is updated by the following methodiAnd finally obtain the category of each pixel
When it is satisfied withWhen it is time, stopping clustering and comparing the timeAs final class of this type of pixelWherein threshold is a criterion for stopping clustering;
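The iterative mean-update clustering of step S103 resembles Lloyd's k-means. A minimal sketch follows; the deterministic initialization from the first N pixels and the mean-movement stopping test are assumptions, since the patent's initialization and exact stopping formula are not reproduced in the text:

```python
import numpy as np

def cluster_skin_pixels(pixels, n_clusters, threshold=1e-3, max_iter=100):
    """Cluster RGB pixel vectors x_ab into N sets, Lloyd-style.

    Repeatedly assigns each pixel to the nearest mean m_i and recomputes
    the means; stops when the means move by less than `threshold`.
    Returns (labels, means).
    """
    pixels = np.asarray(pixels, dtype=float)
    means = pixels[:n_clusters].copy()       # deterministic init (assumption)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(max_iter):
        # distance from every pixel to every cluster mean
        dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_means = np.array([
            pixels[labels == i].mean(axis=0) if np.any(labels == i) else means[i]
            for i in range(n_clusters)
        ])
        if np.linalg.norm(new_means - means) < threshold:  # stopping criterion
            means = new_means
            break
        means = new_means
    return labels, means
```

On skin imagery this separates, e.g., lit and shadowed skin regions so that each cluster yields a cleaner per-frame RGB mean for the rPPG step.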
step S200, processing the clustering result of each skin area respectively; extracting and calculating an rPPG signal by comparing the clustering result with the RGB average value corresponding to each frame, and finally estimating the target heart rhythm by a fast Fourier transform method; the method specifically comprises the following steps:
step S201, calculating an average RGB value of each of the classifications of the clustered image pixels, where the calculation formula is as follows:
step S202, calculating an rPPG signal and estimating a heart rate, wherein the specific process is as follows:
In the first step, for any cluster k at any time t an RGB mean μ_k can be obtained; μ_k is then compared with the RGB mean of the corresponding frame and converted into a signal function PV(t, k) of t, i.e., the rPPG signal.
In the second step, PV(t, k) is de-meaned over time: PV_mean(t, k) = PV(t, k) − DC, where DC is the average amplitude of PV.
In the third step, a Fast Fourier Transform (FFT) is applied to PV_mean(t, k), yielding the frequency-domain signal FPV(t).
In the fourth step, the number of peaks of FPV(t) within 1 minute is taken as the heart rate value HR(t), together with the standard deviation SDNN(t) of the heart beat intervals within several minutes (such as five minutes).
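The de-mean-and-FFT steps above can be sketched as follows. This is a minimal illustration, not the patented implementation: reading the heart rate from the dominant FFT frequency inside a plausible band (0.7–3.0 Hz, i.e. 42–180 bpm) is an assumption standing in for the patent's 1-minute peak count:

```python
import numpy as np

def estimate_heart_rate(pv, fps):
    """Estimate beats per minute from a raw rPPG trace PV(t, k).

    Subtracts the DC component (average amplitude), computes the
    frequency-domain signal via a real FFT, and reads off the dominant
    frequency within the assumed heart-rate band.
    """
    pv = np.asarray(pv, dtype=float)
    pv_mean = pv - pv.mean()                 # PV_mean(t,k) = PV(t,k) - DC
    spectrum = np.abs(np.fft.rfft(pv_mean))  # frequency-domain signal FPV
    freqs = np.fft.rfftfreq(len(pv_mean), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)   # assumed heart-rate band
    peak_freq = freqs[band][spectrum[band].argmax()]
    return 60.0 * peak_freq                  # Hz -> beats per minute
```

For example, a 30 s trace sampled at 30 fps containing a 1.2 Hz pulse component yields an estimate of about 72 bpm.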
Step S300, selecting the optimal signals detected by the different clustering results to estimate the target sleepiness level; the indexes used for the estimation include the heart rate HR(t), the standard deviation SDNN(t) of the heart beat interval within several minutes, and the eye aspect ratio; specifically:
Step S301, arranging the amplitudes of FPV(t) of each cluster in descending order as Peak_1, Peak_2, …, Peak_n, calculating the signal-to-noise ratio SNR_i, and selecting the first N FPV(t) with the highest signal-to-noise ratio within 1 minute; the corresponding optimal HR(t)_best and SDNN(t)_best are calculated as sleepiness estimation indexes;
Step S302, judging the sleepiness of the target based on HR(t)_best, SDNN(t)_best and the Eye aspect ratio Eye Index(t) over the past several minutes (e.g., three minutes);
mean(Ind) and slope(Ind) are compared against the corresponding recognition standards, which are obtained from experimental data, as follows:
when mean(Ind) < threshold_mean and abs(slope(Ind)) > threshold_slope, the target, i.e., the driver, is judged to be sleepy.
The embodiment of the present invention further provides a sleepiness prediction apparatus, including:
a memory storing a computer program;
a processor for loading and executing the computer program to implement the steps of the sleepiness prediction method as described hereinbefore.
An embodiment of the present invention further provides a computer storage medium, in which a computer program is stored, and program instructions of the computer program are executed by a processor to implement the steps of the sleepiness prediction method as described above.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to examples, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope, and such modifications are intended to be covered by the claims of the present invention.
Claims (5)
1. A sleepiness prediction method is characterized by comprising the following steps:
step S100, acquiring an image of a detection target, and performing face detection and skin identification, wherein the method comprises the following steps: detecting a target face region and extracting and clustering related skin regions; simultaneously measuring and calculating the eye aspect ratio;
step S200, processing the clustering result of each skin area respectively; extracting and calculating an rPPG signal by comparing the clustering result with the RGB average value corresponding to each frame, and finally estimating the target heart rhythm by a fast Fourier transform method;
s300, selecting the optimal signals detected by different clustering results to estimate the target sleepiness level;
the step S100 includes:
step S101, embedding an image into a long vector, using the long vector as the input of a neural network, and training the neural network to identify a face through a feed-forward algorithm;
step S102, inputting the image into the trained neural network, outputting face coordinates and eye coordinates, and then calculating the aspect ratio of the eyes;
step S103, classifying the skin image pixels of the captured face area;
the step S200 includes:
step S201, calculating an RGB average value of each classification of the clustered image pixels;
step S202, calculating an rPPG signal and estimating a heart rate, wherein the specific process is as follows:
in the first step, for any cluster k at any time t an RGB mean μ_k can be obtained; μ_k is then compared with the RGB mean of the corresponding frame and converted into a signal function PV(t, k) of t, i.e., the rPPG signal;
in the second step, PV(t, k) is de-meaned over time: PV_mean(t, k) = PV(t, k) − DC, where DC is the average amplitude of PV;
in the third step, a fast Fourier transform is applied to PV_mean(t, k), yielding the frequency-domain signal FPV(t);
in the fourth step, the number of peaks of FPV(t) within 1 minute is taken as the heart rate value HR(t), together with the standard deviation SDNN(t) of the heart beat intervals within several minutes;
in step S300, the estimation indexes include the heart rate HR(t), the standard deviation SDNN(t) of the heart beat interval within several minutes, and the eye aspect ratio; step S300 specifically comprises:
step S301, arranging the amplitudes of FPV(t) of each cluster in descending order as Peak_1, Peak_2, …, Peak_n, calculating the signal-to-noise ratio SNR_i, and selecting the first N FPV(t) with the highest signal-to-noise ratio within 1 minute; the corresponding optimal HR(t)_best and SDNN(t)_best are calculated as sleepiness estimation indexes;
step S302, judging the sleepiness of the target based on HR(t)_best, SDNN(t)_best and the Eye aspect ratio Eye Index(t) over the past several minutes;
mean(Ind) and slope(Ind) are compared against the corresponding recognition benchmarks as follows:
when mean(Ind) < threshold_mean and abs(slope(Ind)) > threshold_slope, the target is judged to be sleepy.
2. The sleepiness prediction method of claim 1,
step S101 specifically includes:
(a1) embedding the image as a long vector:
Let X = {x_1, x_2, …, x_i, …, x_n} be a random vector whose observations represent the face image data;
calculating the mean value μ and the covariance matrix S;
calculating the eigenvalues λ_i and eigenvectors v_i of the covariance matrix S: S v_i = λ_i v_i, i = 1, 2, …, n;
sorting the eigenvectors and taking those corresponding to the largest eigenvalues:
y = W^T (x − μ), where W = (v_1, v_2, …, v_k)   (3)
x = W y + μ, where W = (v_1, v_2, …, v_k)   (4)
and orthogonally processing the vectors:
(a2) the long vector is used as the input of the neural network, which is trained to recognize faces through a feed-forward algorithm, the training objective function being as follows:
wherein m is the number of samples, W is the weight matrix of the neural network, b is the offset of each layer, h is the activation function, x is the vector value obtained in step (a1), y is the label value (0, 1), n_l is the number of layers of the neural network, s_l is the number of neurons, and λ is the regularization coefficient.
3. The sleepiness prediction method of claim 1,
step S103 specifically includes:
set the RGB value of each captured pixel to xab,xab∈R3Determining a value of N, i.e. it is desired to put the data set { x }ab,xab∈R3Obtaining N sets through clustering; in the N numberIn clustering, the initial RGB mean value of any one cluster is set as miIn the category oft represents time, and m is updated by the following methodiAnd finally obtain the category of each pixel
4. A sleepiness prediction apparatus, comprising:
a memory storing a computer program;
a processor for executing the computer program to implement the steps of the sleepiness prediction method as claimed in any one of claims 1 to 3.
5. A computer storage medium, wherein the computer storage medium stores a computer program which, when executed by a processor, implements the steps of the sleepiness prediction method according to any one of claims 1-3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010104790.3A CN111310673B (en) | 2020-02-20 | 2020-02-20 | Sleepiness prediction method, device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010104790.3A CN111310673B (en) | 2020-02-20 | 2020-02-20 | Sleepiness prediction method, device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111310673A CN111310673A (en) | 2020-06-19 |
CN111310673B true CN111310673B (en) | 2022-02-08 |
Family
ID=71148981
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010104790.3A Active CN111310673B (en) | 2020-02-20 | 2020-02-20 | Sleepiness prediction method, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111310673B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2960862A1 (en) * | 2014-06-24 | 2015-12-30 | Vicarious Perception Technologies B.V. | A method for stabilizing vital sign measurements using parametric facial appearance models via remote sensors |
CN110084085A (en) * | 2018-11-06 | 2019-08-02 | 天津工业大学 | RPPG high-precision heart rate detection method based on shaped signal |
CN110384491A (en) * | 2019-08-21 | 2019-10-29 | 河南科技大学 | A kind of heart rate detection method based on common camera |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11234658B2 (en) * | 2018-03-28 | 2022-02-01 | Livmor, Inc. | Photoplethysmogram data analysis and presentation |
-
2020
- 2020-02-20 CN CN202010104790.3A patent/CN111310673B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2960862A1 (en) * | 2014-06-24 | 2015-12-30 | Vicarious Perception Technologies B.V. | A method for stabilizing vital sign measurements using parametric facial appearance models via remote sensors |
CN110084085A (en) * | 2018-11-06 | 2019-08-02 | 天津工业大学 | RPPG high-precision heart rate detection method based on shaped signal |
CN110384491A (en) * | 2019-08-21 | 2019-10-29 | 河南科技大学 | A kind of heart rate detection method based on common camera |
Non-Patent Citations (1)
Title |
---|
Xun Chen et al., "Video-Based Heart Rate Measurement: Recent Advances and Future Prospects," IEEE Transactions on Instrumentation and Measurement, vol. 68, no. 10, pp. 3600-3615, Oct. 2019. * |
Also Published As
Publication number | Publication date |
---|---|
CN111310673A (en) | 2020-06-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11793406B2 (en) | Image processing method and corresponding system | |
Moghaddasi et al. | Automatic assessment of mitral regurgitation severity based on extensive textural features on 2D echocardiography videos | |
Subasi | Epileptic seizure detection using dynamic wavelet network | |
CN111863244B (en) | Functional connection mental disease classification method and system based on sparse pooling graph convolution | |
JP6359123B2 (en) | Inspection data processing apparatus and inspection data processing method | |
Rohmantri et al. | Arrhythmia classification using 2D convolutional neural network | |
Mohebbian et al. | Semi-supervised active transfer learning for fetal ECG arrhythmia detection | |
Sharathappriyaa et al. | Auto-encoder based automated epilepsy diagnosis | |
Greene et al. | Classifier models and architectures for EEG-based neonatal seizure detection | |
Fikri et al. | ECG signal classification review | |
Gopalakrishnan et al. | Itl-cnn: Integrated transfer learning-based convolution neural network for ultrasound pcos image classification | |
Vrbancic et al. | Automatic detection of heartbeats in heart sound signals using deep convolutional neural networks | |
Boonyakitanont et al. | ScoreNet: A Neural network-based post-processing model for identifying epileptic seizure onset and offset in EEGs | |
El Boujnouni et al. | Automatic diagnosis of cardiovascular diseases using wavelet feature extraction and convolutional capsule network | |
Puri et al. | Detection of Alcoholism from EEG signals using Spectral and Tsallis Entropy with SVM | |
CN114595725A (en) | Electroencephalogram signal classification method based on addition network and supervised contrast learning | |
CN111310673B (en) | Sleepiness prediction method, device and storage medium | |
Mihandoost et al. | Automatic feature extraction using generalised autoregressive conditional heteroscedasticity model: an application to electroencephalogram classification | |
Manocha et al. | An overview of ischemia detection techniques | |
EBRAHIMNEZHAD et al. | Classification of arrhythmias using linear predictive coefficients and probabilistic neural network | |
Koçyiğit | Heart sound signal classification using fast independent component analysis | |
Nehary et al. | A deep convolutional neural network classification of heart sounds using fractional fourier transform | |
JP2016187555A (en) | Biological parameter estimation apparatus or method | |
Shcherbakova et al. | Determination of characteristic points of electrocardiograms using multi-start optimization with a wavelet transform | |
Übeyli | Implementing eigenvector methods/probabilistic neural networks for analysis of EEG signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||