CN114220152A - Non-contact real-time heart rate detection method based on RGB-NIR camera - Google Patents

Non-contact real-time heart rate detection method based on RGB-NIR camera

Info

Publication number
CN114220152A
CN114220152A
Authority
CN
China
Prior art keywords
heart rate
signal
face
background
rgb
Prior art date
Legal status
Pending
Application number
CN202111549774.6A
Other languages
Chinese (zh)
Inventor
魏远旺
张先超
严子涵
朱耀东
王超超
Current Assignee
Jiaxing University
Original Assignee
Jiaxing University
Priority date
Filing date
Publication date
Application filed by Jiaxing University filed Critical Jiaxing University
Priority to CN202111549774.6A priority Critical patent/CN114220152A/en
Publication of CN114220152A publication Critical patent/CN114220152A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G06F2218/00: Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02: Preprocessing
    • G06F2218/04: Denoising
    • G06F2218/08: Feature extraction
    • G06F2218/12: Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a non-contact real-time heart rate detection method based on an RGB-NIR camera, comprising the following steps: capturing video with an RGB-NIR camera; detecting and identifying the human face; selecting a facial-skin region of interest and a background reference region; extracting the original heart rate signal; extracting the background signal; extracting the BVP signal; and estimating heart rate information. The key technical point is to estimate heart rate information from both the facial-skin region and a background reference region, so as to improve the accuracy of heart rate detection under changing ambient light. Accurate heart rate measurement is achieved with remote photoplethysmography. Compared with the prior art, the method has the following advantages: the extraction result is computed from RGB and NIR images, and comparing the face image with the environmental background eliminates the influence of illumination to a certain extent, improving accuracy; and a camera suited to medical, health, and household environments is selected, expanding the use scenario to health monitoring.

Description

Non-contact real-time heart rate detection method based on RGB-NIR camera
Technical Field
The invention belongs to the field of image video processing, and particularly relates to a non-contact real-time heart rate detection method based on an RGB-NIR camera.
Background
The "Report on Cardiovascular Health and Diseases in China 2020", released in 2020, shows that cardiovascular disease is the leading cause of death among both urban and rural residents in China, accounting for 46.66% of deaths in rural areas and 43.81% in urban areas, making it the foremost threat to people's lives and health; the prevention, monitoring, and treatment of cardiovascular diseases are therefore of great importance.
In recent years, video-based non-contact heart rate detection has attracted wide attention in the field of computer vision. The approach mainly relies on remote photoplethysmography (rPPG), whose basic principle is as follows: when light passes through skin tissue and is reflected back to a photosensitive sensor, the intensity of the received optical signal changes slightly. While the absorption by muscle, bone, veins, and other tissues remains essentially constant, the absorption and attenuation of light by the arterial blood flowing beneath the skin changes from moment to moment; this periodic change causes subtle variations in skin color, from which the Blood Volume Pulse (BVP) signal can be separated to estimate the corresponding vital signs. Compared with traditional heart rate measurement, the greatest advantages are that the method is non-contact, low-cost, and convenient, making it well suited to everyday scenarios.
At present, rPPG-based heart rate detection faces two main challenges: changes in ambient light and movement of the subject's head. In real scenes the ambient light often varies, and this variation can introduce a periodically changing noise signal that strongly affects the final heart rate estimate; in particular, under dim illumination the skin reflects and absorbs little light and the camera introduces considerable imaging noise, which complicates the measurement of physiological indicators. During detection, movement of the subject's head caused by blinking, speaking, breathing, or other factors changes the light reflected from the facial skin and interferes with the camera's capture of the skin's periodic color change; eliminating this influence is another important challenge for rPPG-based heart rate detection.
Under the same illumination, the color variation of the face region contains more signal components caused by the blood pulse of the facial skin than the surrounding background region does, while both regions share the same signal components caused by illumination changes.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a non-contact real-time heart rate detection method based on an RGB-NIR camera, which solves the problems in the prior art.
(II) technical scheme
In order to achieve the purpose, the invention is realized by the following technical scheme:
a non-contact real-time heart rate detection method based on an RGB-NIR camera comprises the following steps:
s1, collecting a face image by using an RGB-NIR camera;
s2, positioning, identifying and acquiring feature points of each face in each frame of image by using OpenCV, and dividing the feature points into a plurality of face areas; then, establishing a background model based on Gaussian kernel density estimation, and obtaining a plurality of background areas according to the face area;
s3, obtaining an original signal based on the face area, and taking the original signal after filtering processing as an original heart rate signal; obtaining a background signal based on the background reference region;
s4, obtaining intrinsic mode components through empirical mode decomposition;
s5, carrying out similarity analysis on the intrinsic mode components by using an orthogonal projection transformation algorithm, and extracting to obtain a heart rate signal curve;
s6, carrying out peak value statistics on the heart rate signal curve, and estimating a heart rate value;
and S7, collecting data in real time, and repeating the steps to obtain the real-time heart rate.
Further, in the S1,
Open the RGB-NIR camera to collect a video segment containing a human face, obtaining an RGB-version and an NIR-version video segment; the original video is divided into 10 s short videos using a sliding-window slicing operation on the time axis.
Further, in the S2,
A face detector based on a deep neural network (DNN) model in OpenCV (the Open Source Computer Vision library) is built; face detection is performed on each frame of the RGB-version video to obtain the face area; facial feature points are obtained with the DLib library, and the face is divided into four facial regions according to these feature points: the forehead, left cheek, right cheek, and lower jaw, each available in an RGB version and an NIR version. A background model based on Gaussian kernel density estimation is then established, and, taking the centroid of the face region as the center, four background regions around the face are selected as reference regions using a search algorithm;
Using the detected facial feature points, compute the Euclidean distance between the feature points of consecutive frames; if it is smaller than a preset threshold Dist, the two frames are considered to show the same face, which realizes face tracking;
otherwise, end the current heart rate estimation process and repeat step S2 with the current frame as the new first frame. The regions of interest of the face are then further extracted according to the facial feature points: forehead (a), left cheek (b), right cheek (c), and lower lip center (d);
the background extraction method comprises the following specific steps of enabling x1,x2,…,xNIs the gray value, x, of the nearest previous N frames of a certain pixeltThe probability density function of the pixel is:
Figure BDA0003417100660000031
wherein, the formula (1) is essentially a Gaussian kernel function at the sample point x1,x2,…,xNIf a continuous function is interpolated, the background points can be obtained according to the following formula:
Figure BDA0003417100660000032
where T is a threshold value, MtIn (x, y), 0 represents a background point and 1 represents a foreground target point.
And then extracting a background area from the current short video as a reference area.
Further, in the S3,
The original signal extraction is based on the face regions: the G and N channel values are extracted and averaged, and the resulting G-channel and N-channel sequences correspond one-to-one with the input time axis, yielding a group of original signals;
The background signal extraction is based on the background regions: the R, G, B, and N channel values are extracted and averaged, and the R, G, B, and N channel sequences correspond one-to-one with the input time axis, yielding a group of background signals;
the specific process of obtaining the original heart rate signal and the background signal is as follows:
Firstly, spatio-temporal fusion is performed on the RGB image and the NIR image captured at the same moment, obtaining G- and N-channel mean-value sequences for the face region and the reference region of the images before and after fusion;
then, acquiring a time axis according to a set value, and performing nonlinear least square method fitting on the sequence to obtain a one-dimensional signal curve on the time axis;
and finally, filtering curves of the original signal and the background signal by adopting a band-pass filter to eliminate obvious noise, and then taking the filtered original signal as an original heart rate signal and taking the filtered reference area signal as a background signal.
Further, in the S4 and S5,
The BVP signal curve extraction decomposes the original heart rate signal and the background signal into superpositions of independent components by empirical mode decomposition, obtaining several IMFs, and then derives the BVP signal curve through an orthogonal projection operation;
the specific steps for obtaining the heart rate signal curve are as follows:
Firstly, blind-source signal separation based on empirical mode decomposition is performed on the original heart rate signal and the background signal, obtaining several separated IMF components and one residual component for each signal;
Then, similarity analysis is carried out on the separated IMF components by using an orthogonal projection transformation algorithm, one IMF component with the lowest similarity with the background signal group in the original heart rate signal group is extracted, and the IMF is used as a BVP signal curve.
Further, in the S6,
The heart rate information estimation performs a fast Fourier transform on the BVP signal curve in the time domain, finds the maximum of the frequency-domain response, and computes the estimated heart rate from that peak value and its corresponding frequency.
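As a rough illustration of this frequency-domain step, heart rate can be read off as the dominant FFT frequency scaled to beats per minute. The sketch below is my own minimal interpretation, not code from the patent:

```python
import numpy as np

def estimate_bpm_fft(bvp, fs):
    """Estimate heart rate as the frequency of the largest FFT response,
    converted to beats per minute."""
    spec = np.abs(np.fft.rfft(bvp))
    freqs = np.fft.rfftfreq(len(bvp), d=1.0 / fs)
    spec[0] = 0.0  # ignore the DC component
    return 60.0 * freqs[int(np.argmax(spec))]
```

For a clean 1.2 Hz pulse sampled at 30 fps this returns roughly 72 bpm.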
(III) advantageous effects
The invention provides a method for estimating heart rate information by using a human face skin area and a background reference area based on an RGB-NIR camera, which is used for improving the accuracy of a heart rate detection result under the condition of environmental illumination change;
accurate heart rate measurement is realized by using a remote photoplethysmography; compared with the prior art, the method has the following advantages:
firstly, an extraction result is calculated according to RGB and NIR images, and the influence of illumination is eliminated to a certain extent by comparing the human face image with the environment background, so that the accuracy is improved compared with the prior art;
Secondly, a camera suited to medical, health, and household environments is selected, expanding the use scenario to health monitoring, in line with the principle that prevention of cardiovascular disease outweighs treatment.
Drawings
FIG. 1 is an overall flow diagram of the present invention;
FIG. 2 is a schematic diagram of the present invention for extracting a facial region of interest;
FIG. 3 is an example of an eigenmode function of the present invention derived from empirical mode decomposition of a raw heart rate signal curve;
fig. 4 is an example of similarity of IMF functions and heart rate estimation obtained by decomposing the facial skin region and the background region.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. In addition, for the convenience of description, the terms "upper", "lower", "left" and "right" are used to refer to the same direction as the upper, lower, left, right, etc. of the drawings, and the terms "first", "second", etc. are used for descriptive distinction and have no special meaning.
A non-contact real-time heart rate detection method based on an RGB-NIR camera comprises the following steps:
s1, collecting a face image by using an RGB-NIR camera;
opening an RGB-NIR camera to collect a video section containing a human face to obtain a video section of an RGB version and a video section of an NIR version; dividing an original video into 10s short videos by adopting a slicing operation based on a sliding window on a time axis;
Specifically, the camera frame rate is initialized to f, the current frame count fcount is set to 0, the server queue is emptied, and a Boolean flag isBlocked is set to indicate that signal transmission may proceed;
when the camera is turned on to start collecting, counting each frame of image by fcount, and acquiring the currently collected image in real time based on an rtsp protocol;
The following steps only process the images numbered 0 to 10f in the current queue.
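The sliding-window slicing described above can be sketched as cutting the frame indices of the captured stream into consecutive 10 s windows; the function below is a hypothetical illustration, not part of the patent:

```python
def slice_windows(n_frames: int, fps: int, window_s: int = 10):
    """Cut a frame stream into consecutive windows of fps * window_s frames.

    Returns (start, end) frame-index pairs; a trailing partial window is dropped.
    """
    win = fps * window_s
    return [(s, s + win) for s in range(0, n_frames - win + 1, win)]
```

At f = 30 fps each window holds 300 frames, matching the "0 to 10f" range processed per queue.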
S2, positioning, identifying and acquiring feature points of each face in each frame of image by using OpenCV, and dividing the feature points into a plurality of face areas; then, establishing a background model based on Gaussian kernel density estimation, and obtaining a plurality of background areas according to the face area;
A face detector based on a deep neural network (DNN) model in OpenCV (the Open Source Computer Vision library) is built; face detection is performed on each frame of the RGB-version video to obtain the face area; facial feature points are obtained with the DLib library, and the face is divided into four facial regions according to these feature points: the forehead, left cheek, right cheek, and lower jaw, each available in an RGB version and an NIR version. A background model based on Gaussian kernel density estimation is then established, and, taking the centroid of the face region as the center, four background regions around the face are selected as reference regions using a search algorithm;
The background extraction method proceeds as follows. Let x_1, x_2, …, x_N be the gray values of a given pixel in the most recent N frames, and let x_t be its current gray value. The probability density function of the pixel is then estimated as

p(x_t) = (1/N) Σ_{i=1}^{N} (1/√(2πσ²)) exp(−(x_t − x_i)² / (2σ²))    (1)

Formula (1) is essentially a continuous function interpolated by placing a Gaussian kernel at each sample point x_1, x_2, …, x_N (the kernel bandwidth σ is a parameter of the model). Background points can then be obtained according to

M_t(x, y) = 0 if p(x_t) > T, and 1 otherwise,    (2)

where T is a threshold; in M_t(x, y), 0 denotes a background point and 1 a foreground target point.
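The kernel-density background test can be written in a few lines; the bandwidth sigma and threshold T below are illustrative values I have chosen, not values stated in the patent:

```python
import math

def is_background(history, x_t, sigma=5.0, T=0.05):
    """Kernel-density estimate of p(x_t) from the last N gray values (Eq. 1),
    thresholded as in Eq. 2: True means background point (M_t = 0)."""
    norm = 1.0 / math.sqrt(2.0 * math.pi * sigma ** 2)
    p = sum(norm * math.exp(-(x_t - x_i) ** 2 / (2.0 * sigma ** 2))
            for x_i in history) / len(history)
    return p > T
```

A pixel whose new value sits close to its recent history scores a high density and is classified as background; a large jump indicates a foreground target point.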
Then extracting a background area from the current short video as a reference area; the background area should satisfy the condition:
firstly, the image is not overlapped with any pixel of the face area;
secondly, the fluctuation of the pixel change in the region is as small as possible.
The background regions are taken from the video background produced by the Gaussian-kernel-density background modeling method. Starting from the centroid of the face region, the search proceeds clockwise from the 12, 3, 6, and 9 o'clock directions, finally yielding four background regions, each of size 20 × 20 pixels; the default initial distance between the face centroid and a background-region center is max(Width_face, Height_face) + 40.
The specific steps of reference region selection are as follows:
(1) Obtain the width and height of the image, and create a dictionary CoveredMap recording the areas covered by faces;
(2) Traverse each frame; for the face in the current frame, record the centroid coordinate [c_x, c_y] and the face radius R, and add the entry [key = (c_x, c_y), value = R] to CoveredMap; the midpoint between the two temples of the face is taken as the centroid, and the straight-line distance from the centroid to the chin as the radius;
(3) Generate a coordinate [p_x, p_y] in the image by random number and compute its distance Dp to each key in CoveredMap in turn; if Dp is not greater than R, the point lies within a covered area and the current check is abandoned;
if Dp is greater than R for every key, the current coordinate is taken as a potential reference-region point, recorded as PotentialPoints[i] = [p_x, p_y], and CoveredMap is updated with [key = (p_x, p_y), value = D]. This step is repeated k times to obtain an array PotentialPoints of k potential reference-region points;
where D and k are input constants;
(4) Traverse each frame; for each potential reference point, check whether the fluctuation of the G-channel value exceeds ±10%; if so, discard the point and remove it from PotentialPoints. The coordinates remaining in PotentialPoints are taken as the center points of the reference regions.
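Steps (1) to (3) amount to rejection sampling of candidate points outside every recorded face circle. A minimal sketch, with function name and seed of my own choosing:

```python
import math
import random

def pick_reference_points(width, height, covered_map, k=10, seed=0):
    """Draw random coordinates and keep those farther than R from every
    face centroid in covered_map ({(c_x, c_y): R}), as in steps (1)-(3)."""
    rng = random.Random(seed)
    points = []
    while len(points) < k:
        p = (rng.randrange(width), rng.randrange(height))
        # a candidate survives only if it clears every covered circle
        if all(math.dist(p, centroid) > radius
               for centroid, radius in covered_map.items()):
            points.append(p)
    return points
```

The stability check of step (4), filtering out points whose G-channel value fluctuates by more than ±10% over the window, would then prune this candidate list.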
The method specifically comprises the following steps:
For each frame, the DNN model res10_300x300_ssd_iter_140000.caffemodel is loaded with OpenCV's readNetFromCaffe method to locate the face target in the frame image;
then the Dlib library's frontal face detector (class frontal_face_detector), together with its 68-point shape predictor, is used to obtain the coordinates of the 68 feature points of the face region in the frame image, denoted [face_point1(x, y), …, face_point68(x, y)].
Using the detected facial feature points, compute the Euclidean distance between the feature points of consecutive frames; if it is smaller than a preset threshold Dist, the two frames are considered to show the same face, which realizes face tracking; otherwise, end the current heart rate estimation process and repeat step S2 with the current frame as the new first frame. The regions of interest of the face (forehead (a), left cheek (b), right cheek (c), and lower lip center (d)) are then further extracted according to the facial feature points;
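The tracking criterion, a mean Euclidean distance between corresponding landmarks below a threshold Dist, can be sketched as follows; the threshold value here is illustrative, not taken from the patent:

```python
import math

def same_face(pts_prev, pts_curr, dist_threshold=20.0):
    """Treat two frames as showing the same face when the mean Euclidean
    distance between their corresponding landmarks is below the threshold."""
    d = sum(math.dist(a, b) for a, b in zip(pts_prev, pts_curr)) / len(pts_prev)
    return d < dist_threshold
```

When this returns False, the pipeline above restarts estimation with the current frame as the new first frame.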
note that the region of interest referred to in this application may also be referred to as a facial region.
S3, obtaining an original signal based on the face area, and taking the original signal after filtering processing as an original heart rate signal; obtaining a background signal based on the background reference region;
the original signal extraction is based on a face region, G and N channel values are extracted and equalized, and the obtained G channel and N channel sequences correspond to an input time axis one to obtain a group of original signals;
the background signal extraction is based on a background region, R, G, B and N channel values are extracted and equalized, and R, G, B and N channel sequences are in one-to-one correspondence with an input time axis to obtain a group of background signals;
the specific process of obtaining the original signal and the background signal is as follows:
firstly, performing space-time fusion processing on an RGB image and an NIR image at the same moment to respectively obtain an image face area before and after mixing and a reference area G and N channel mean value sequence;
then, acquiring a time axis according to a set value, and performing nonlinear least square method fitting on the sequence to obtain a one-dimensional signal curve on the time axis;
and finally, filtering curves of the original signal and the background signal by adopting a band-pass filter to eliminate obvious noise, and then taking the filtered original signal as an original heart rate signal and taking the filtered reference area signal as a background signal.
In particular, the method comprises the following steps:
for one frame image of the RGB video and the NIR video at the same time, the following processing is performed:
(1) Filter the NIR image with Bilateral Filtering (BF) and Weighted Least Squares filtering (WLS) to obtain processed images Y1 and Y2; subtract each from the original image I_nir to obtain the detail-layer images Y3 = I_nir − Y1 and Y4 = I_nir − Y2; average Y3 and Y4 to obtain the detail information Y_nir of the NIR image;
(2) Transfer the RGB image into YCbCr space and apply weighted least squares filtering to the Y layer to obtain the information Y_rgb of the RGB image;
(3) Add Y_nir and Y_rgb to obtain the Y layer of the fused image in YCbCr space, i.e. Y = Y_nir + Y_rgb;
(4) Recombine Y with the CbCr layers of the original image and convert back to RGB space to obtain the fused RGB-NIR image.
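Steps (1) to (4) boil down to adding an NIR detail layer onto the RGB luminance. The sketch below substitutes a simple box blur for the bilateral and WLS filters (which would normally come from OpenCV), so it only illustrates the detail-layer arithmetic, not the actual filters:

```python
import numpy as np

def box_blur(img, k=3):
    """Box filter used here as a crude stand-in for the BF/WLS smoothing."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + h, dx:dx + w]
    return out / (k * k)

def fuse_luminance(y_rgb, i_nir):
    """Average two NIR detail layers (Y3, Y4) and add them onto the RGB
    Y channel, as in steps (1)-(3)."""
    y3 = i_nir - box_blur(i_nir, 3)
    y4 = i_nir - box_blur(i_nir, 5)
    return y_rgb + (y3 + y4) / 2.0
```

A flat NIR image contributes no detail, so the fused luminance equals the RGB luminance in that degenerate case.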
For the fused RGB + NIR image and the original image, the following operations are performed:
(1) From the original RGB image, extract the G-channel values of the face region and the background region and average them, obtaining the sequences G_face and G_back;
(2) For the original NIR image, convert it into RGB space and apply the same processing, obtaining the sequences N_face and N_back;
(3) For the fused RGB-NIR image, obtain G-N_face and G-N_back in the same way.
Compute each frame's time relative to the first frame from the frame rate, obtaining a time sequence Time = [t(1), …, t(30)] that corresponds one-to-one with the sequence G_face = [Gf(1), …, Gf(30)], giving a coordinate point set G_face_node = [Gf-t(1), …, Gf-t(30)]. Presetting y = A·x^5 + B·x^4 + C·x^3 + D·x^2 + E·x + F, the coefficients are obtained by nonlinear least squares, yielding the best-fit curve y = func_Gf(x). The same procedure applied to the sequences G_back, N_face, N_back, G-N_face, and G-N_back yields the corresponding fitted functions. Averaging func_Gf, func_Nf, and func_GNf gives the BVP signal curve func_face; averaging func_Gb, func_Nb, and func_GNb gives the background signal curve func_back. A band-pass filter is then applied to each one-dimensional signal curve to obtain the filtered func_face_fil and func_back_fil.
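The fitting and filtering steps can be sketched with NumPy. The 0.7 to 4 Hz pass band below is a typical heart-rate range I have assumed; the patent does not state its filter bounds:

```python
import numpy as np

def fit_poly5(t, values):
    """Least-squares fit of y = A*x^5 + B*x^4 + C*x^3 + D*x^2 + E*x + F."""
    return np.poly1d(np.polyfit(t, values, 5))

def bandpass(signal, fs, lo=0.7, hi=4.0):
    """Zero FFT bins outside [lo, hi] Hz: a crude band-pass filter."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))
```

Fitting each channel-mean sequence gives a continuous curve over the time axis; the band-pass then strips slow illumination drift and high-frequency sensor noise before EMD.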
S4, obtaining intrinsic mode components through empirical mode decomposition; s5, carrying out similarity analysis on the eigenmode component by using an orthogonal projection transformation algorithm, and extracting to obtain a BVP signal curve;
the BVP signal curve extraction is to decompose an original heart rate signal and a background signal into superposition of independent components by adopting an empirical mode separation method for the original heart rate signal and the background signal to obtain a plurality of IMFs, and then obtain a BVP signal curve through orthogonal projection operation;
the specific steps for obtaining the heart rate signal curve are as follows:
Firstly, blind-source signal separation based on empirical mode decomposition is performed on the original heart rate signal and the background signal, obtaining several separated IMF components and one residual component for each signal;
And then, performing similarity analysis on the separated IMF components by using an orthogonal projection transformation algorithm, extracting an IMF component with the lowest similarity with the background signal group in the original heart rate signal group, and taking the IMF as a heart rate signal curve.
In particular, the method comprises the following steps:
Perform Empirical Mode Decomposition (EMD) on each filtered signal curve to obtain k separated IMF components plus a residual component; the sequence of separated component curves is recorded as IMFS = [IMF(1), …, IMF(k)];
where k is deterministic and the IMFS result is uniquely determined;
For each IMF(i) obtained from decomposing the face signal, compute its similarity to every IMF in the IMFS obtained from decomposing the background signal, and derive a final similarity Similar(i) based on a voting mechanism; the IMF(j) with the lowest similarity is considered to contain the most heart rate information and is recorded as the heart rate signal curve Characteristic_curve.
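One way to read the orthogonal-projection similarity step: score each face IMF by the magnitude of its normalized projection onto every background IMF and keep the least similar one. This is my interpretation of the step; the IMFs themselves would come from whatever EMD library is used:

```python
import numpy as np

def select_bvp_imf(face_imfs, back_imfs):
    """Return the index of the face IMF least similar to all background IMFs.

    Similarity is |<a, b>| / (|a| |b|), the normalized orthogonal-projection
    magnitude; each face IMF is scored by its worst (largest) match.
    """
    def sim(a, b):
        a = np.asarray(a, dtype=float)
        b = np.asarray(b, dtype=float)
        return abs(float(a @ b)) / (np.linalg.norm(a) * np.linalg.norm(b))
    scores = [max(sim(f, b) for b in back_imfs) for f in face_imfs]
    return int(np.argmin(scores))
```

An IMF shared with the background (e.g. an illumination flicker) projects strongly onto a background IMF and is rejected, leaving the pulse-dominated component.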
S6, carrying out peak value statistics on the BVP signal curve, and estimating a heart rate value;
The heart rate information estimation performs a fast Fourier transform on the BVP signal curve in the time domain, finds the maximum of the frequency-domain response, and computes the estimated heart rate from that peak value and its corresponding frequency;
Scan the heart rate signal curve Characteristic_curve and count the number of peaks P; this is the number of heartbeat fluctuations within the 10 s window, and the estimated heart rate for the window, P/10 beats per second (equivalently 6P beats per minute), is returned.
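The peak statistic can be sketched with a simple three-point local-maximum test; scaling the 10 s count by six gives beats per minute:

```python
import numpy as np

def estimate_bpm(curve, window_s=10):
    """Count strict local maxima of the heart rate curve and convert the
    per-window count to beats per minute."""
    c = np.asarray(curve, dtype=float)
    peaks = int(np.sum((c[1:-1] > c[:-2]) & (c[1:-1] > c[2:])))
    return 60.0 * peaks / window_s
```

On noisy curves a real implementation would add a minimum peak distance or prominence check; this bare version only illustrates the counting.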
And S7, collecting data in real time and repeating steps S1 to S6 to obtain the real-time heart rate.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (7)

1. A non-contact real-time heart rate detection method based on an RGB-NIR camera is characterized by comprising the following steps:
s1, collecting a face image by using an RGB-NIR camera;
s2, positioning, identifying and acquiring feature points of each face in each frame of image by using OpenCV, and dividing the feature points into a plurality of face areas; then, establishing a background model based on Gaussian kernel density estimation, and obtaining a plurality of background areas according to the face area;
s3, obtaining an original signal based on the face area, and taking the original signal after filtering processing as an original heart rate signal; obtaining a background signal based on the background reference region;
s4, obtaining intrinsic mode components through empirical mode decomposition;
s5, carrying out similarity analysis on the intrinsic mode components by using an orthogonal projection transformation algorithm, and extracting to obtain a heart rate signal curve;
s6, carrying out peak value statistics on the heart rate signal curve, and estimating a heart rate value;
and S7, collecting data in real time, and repeating the steps to obtain the real-time heart rate.
2. The non-contact real-time heart rate detection method based on the RGB-NIR camera as claimed in claim 1, wherein: in the above-mentioned S1, the method,
opening an RGB-NIR camera to collect a video section containing a human face to obtain a video section of an RGB version and a video section of an NIR version; the original video is divided into 10s short videos by adopting a slicing operation based on a sliding window on a time axis.
3. The non-contact real-time heart rate detection method based on the RGB-NIR camera as claimed in claim 1, wherein, in S2:
a face detector is built on the DNN (deep neural network) module of OpenCV, face detection is performed on each frame of the RGB-version video to obtain a face area, facial feature points are obtained with the DLib library, and the face area is divided into four face regions according to the obtained feature points: a forehead region, a left-face region, a right-face region and a lower-jaw region, each face region existing in an RGB version and an NIR version; a background model based on Gaussian kernel density estimation is established, and, taking the centroid of the face area as the center, four background areas around the face area are selected as reference areas by a search algorithm;
the Euclidean distance between the face feature points of the preceding and current frames is calculated from the detected feature points; if it is smaller than a preset threshold Dist, the two detections are regarded as the same face, thereby tracking the face;
otherwise the current heart rate estimation process is ended and step S2 is repeated with the current frame taken as a new first frame; the regions of interest of the face are then extracted from the feature points: forehead (a), left cheek (b), right cheek (c) and lower-lip center (d).
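The tracking criterion of claim 3 can be sketched as follows; using the mean per-landmark distance, and the particular threshold value, are assumptions, since the claim only fixes "Euclidean distance below Dist":

```python
import numpy as np

def same_face(prev_pts, curr_pts, dist_thresh=10.0):
    """Tracking test: mean Euclidean distance between the facial feature
    points of consecutive frames; below the threshold Dist, the two
    detections are treated as the same face."""
    d = np.linalg.norm(np.asarray(curr_pts, float) - np.asarray(prev_pts, float),
                       axis=1)
    return bool(d.mean() < dist_thresh)

prev = [[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]]
small_move = [[1.0, 0.0], [11.0, 0.0], [6.0, 8.0]]    # ~1 px drift -> same face
large_move = [[50.0, 0.0], [60.0, 0.0], [55.0, 8.0]]  # ~50 px jump -> new face
```

When `same_face` fails, the method restarts S2 with the current frame as the new first frame, exactly as the claim prescribes.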
4. The non-contact real-time heart rate detection method based on the RGB-NIR camera as claimed in claim 1, wherein, in S2:
the background extraction proceeds as follows: let $x_1, x_2, \ldots, x_N$ be the gray values of a pixel over its most recent $N$ frames; the probability density of the current value $x_t$ is

$$\Pr(x_t) = \frac{1}{N}\sum_{i=1}^{N}\frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{(x_t-x_i)^2}{2\sigma^2}\right) \tag{1}$$

formula (1) is essentially a Gaussian kernel function interpolating a continuous density through the sample points $x_1, x_2, \ldots, x_N$; background points are then obtained according to

$$M_t(x,y)=\begin{cases}0, & \Pr(x_t) > T\\ 1, & \text{otherwise}\end{cases} \tag{2}$$

where $T$ is a threshold and, in $M_t(x,y)$, 0 denotes a background point and 1 a foreground target point;
and then extracting a background area from the current short video as a reference area.
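The Gaussian kernel density background test of claim 4 can be sketched numerically; the kernel bandwidth `sigma` and the threshold `T` are illustrative assumptions, as the claim fixes neither:

```python
import numpy as np

def kde_prob(x_t, history, sigma=2.0):
    """Gaussian-kernel density estimate of pixel value x_t from the gray
    values of the most recent N frames (the background model's formula (1))."""
    h = np.asarray(history, dtype=float)
    k = (np.exp(-((x_t - h) ** 2) / (2.0 * sigma ** 2))
         / np.sqrt(2.0 * np.pi * sigma ** 2))
    return float(k.mean())

def is_background(x_t, history, T=0.05, sigma=2.0):
    """A pixel is a background point (M_t = 0) when its estimated
    density exceeds the threshold T."""
    return kde_prob(x_t, history, sigma) > T
```

A pixel whose current gray value matches its recent history gets a high density and is classified as background; a sudden jump (e.g. the face moving over it) drives the density toward zero and marks it as foreground.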
5. The non-contact real-time heart rate detection method based on the RGB-NIR camera as claimed in claim 1, wherein, in S3:
the original signal is extracted from the face regions: the G and N channel values are extracted and averaged, and the resulting G-channel and N-channel sequences are placed in one-to-one correspondence with the input time axis to obtain a group of original signals;
the background signal is extracted from the background regions: the R, G, B and N channel values are extracted and averaged, and the R, G, B and N channel sequences are placed in one-to-one correspondence with the input time axis to obtain a group of background signals;
the specific process of obtaining the original signal and the background signal is as follows:
firstly, the RGB image and the NIR image captured at the same moment are fused in space and time, yielding the G- and N-channel mean-value sequences of the face area and of the reference areas before and after mixing;
then, a time axis is generated according to the set value, and the sequences are fitted by a nonlinear least squares method to obtain one-dimensional signal curves over the time axis;
finally, the original-signal and background-signal curves are filtered with a band-pass filter to remove obvious noise; the filtered original signal is then taken as the original heart rate signal and the filtered reference-area signal as the background signal.
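The band-pass step of claim 5 can be sketched with a simple FFT-domain filter; the claim does not fix the filter design, so the 0.7–4 Hz pass band (roughly 42–240 bpm) and the 30 fps frame rate are assumptions:

```python
import numpy as np

def bandpass(signal, fs=30.0, lo=0.7, hi=4.0):
    """Suppress obvious out-of-band noise by zeroing FFT bins outside the
    plausible heart-rate band; a stand-in for the band-pass filter of S3."""
    signal = np.asarray(signal, dtype=float)
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))

# A 1 Hz pulse riding on a DC illumination offset: filtering removes the offset.
t = np.arange(300) / 30.0
raw = 5.0 + np.sin(2.0 * np.pi * 1.0 * t)
filtered = bandpass(raw)
```

Removing the DC component and slow illumination drift here is what lets the later EMD stage concentrate on the pulsatile part of the trace.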
6. The non-contact real-time heart rate detection method based on the RGB-NIR camera as claimed in claim 1, wherein, in S4 and S5:
the BVP signal curve is extracted by applying empirical mode decomposition to the original heart rate signal and the background signal, decomposing each into a superposition of independent components to obtain a plurality of IMFs, and then obtaining the BVP signal curve through an orthogonal projection operation;
the specific steps for obtaining the heart rate signal curve are as follows:
firstly, blind-source signal separation based on empirical mode decomposition is applied to the original heart rate signal and the background signal, and a number of separated IMF components and a residual component are obtained for each signal;
then, similarity analysis is performed on the separated IMF components with an orthogonal projection transformation algorithm, and the IMF component of the original heart-rate-signal group with the lowest similarity to the background-signal group is extracted and taken as the heart rate signal curve.
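The selection rule of claim 6 can be sketched as follows: each IMF from the original-signal group is projected onto the background IMFs, and the component with the smallest projection (lowest similarity to the background) is kept as the heart-rate curve. The cosine normalization is an assumption about the "orthogonal projection transformation"; the EMD itself is omitted and the IMFs are taken as given (in practice a package such as PyEMD would produce them):

```python
import numpy as np

def projection_similarity(imf, background):
    """|cosine| between a zero-mean IMF and a zero-mean background
    component: 1 = same direction, 0 = orthogonal."""
    a = np.asarray(imf, float); a = a - a.mean()
    b = np.asarray(background, float); b = b - b.mean()
    return abs(float(a @ b)) / (np.linalg.norm(a) * np.linalg.norm(b))

def pick_bvp(imfs, background_imfs):
    """Index of the IMF least similar to every background component."""
    scores = [max(projection_similarity(m, b) for b in background_imfs)
              for m in imfs]
    return int(np.argmin(scores))

# Two candidate IMFs: one mirrors a 0.3 Hz background flicker, the other
# oscillates at 1.2 Hz (72 bpm) -> the latter is selected as the BVP curve.
t = np.arange(300) / 30.0
flicker = np.sin(2.0 * np.pi * 0.3 * t)
pulse = np.sin(2.0 * np.pi * 1.2 * t)
chosen = pick_bvp([flicker, pulse], [flicker])
```

The intuition is that illumination and motion artifacts appear in both the face and the background regions, so the IMF most orthogonal to everything in the background group is the one most likely to carry the pulse.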
7. The non-contact real-time heart rate detection method based on the RGB-NIR camera as claimed in claim 1, wherein, in S6:
the heart rate is estimated by applying a fast Fourier transform to the time-domain BVP signal curve, finding the maximum of the frequency-domain response, and computing the estimated heart rate from the peak value and its corresponding frequency in the frequency domain.
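The frequency-domain estimation of claim 7 can be sketched in a few lines; the 0.7–4 Hz search band and the 30 fps frame rate are assumptions not fixed by the claim:

```python
import numpy as np

def estimate_hr(bvp, fs=30.0, lo=0.7, hi=4.0):
    """FFT the BVP curve, locate the largest frequency-domain peak within
    the plausible heart-rate band, and convert its frequency to bpm."""
    bvp = np.asarray(bvp, dtype=float)
    spec = np.abs(np.fft.rfft(bvp - bvp.mean()))
    freqs = np.fft.rfftfreq(len(bvp), d=1.0 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    peak_freq = freqs[band][np.argmax(spec[band])]
    return 60.0 * peak_freq

# A clean 1.2 Hz BVP curve over 10 s at 30 fps corresponds to 72 bpm.
t = np.arange(300) / 30.0
hr = estimate_hr(np.sin(2.0 * np.pi * 1.2 * t))
```

With a 10 s window the FFT bin spacing is 0.1 Hz, i.e. 6 bpm of resolution; the claims' peak-statistics step on the time-domain curve offers a complementary estimate between bins.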
CN202111549774.6A 2021-12-17 2021-12-17 Non-contact real-time heart rate detection method based on RGB-NIR camera Pending CN114220152A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111549774.6A CN114220152A (en) 2021-12-17 2021-12-17 Non-contact real-time heart rate detection method based on RGB-NIR camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111549774.6A CN114220152A (en) 2021-12-17 2021-12-17 Non-contact real-time heart rate detection method based on RGB-NIR camera

Publications (1)

Publication Number Publication Date
CN114220152A true CN114220152A (en) 2022-03-22

Family

ID=80703432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111549774.6A Pending CN114220152A (en) 2021-12-17 2021-12-17 Non-contact real-time heart rate detection method based on RGB-NIR camera

Country Status (1)

Country Link
CN (1) CN114220152A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115063848A (en) * 2022-05-23 2022-09-16 South China University of Technology Non-contact human face heart rate detection method, system and medium based on photoplethysmography
CN115063848B (en) * 2022-05-23 2024-03-19 South China University of Technology Non-contact human face heart rate detection method, system and medium based on photoplethysmography
CN117710242A (en) * 2023-12-20 2024-03-15 Sichuan University Non-contact physiological parameter extraction method for resisting illumination and motion interference

Similar Documents

Publication Publication Date Title
EP3405105B1 (en) Method and apparatus for estimating heart rate
CN114220152A (en) Non-contact real-time heart rate detection method based on RGB-NIR camera
Tang et al. Non-contact heart rate monitoring by combining convolutional neural network skin detection and remote photoplethysmography via a low-cost camera
CN106073729A (en) The acquisition method of photoplethysmographic signal
CN112396011B (en) Face recognition system based on video image heart rate detection and living body detection
Park et al. Remote pulse rate measurement from near-infrared videos
CN109009052A (en) The embedded heart rate measurement system and its measurement method of view-based access control model
CN111523344A (en) Human body living body detection system and method
CN111429345A (en) Method for visually calculating heart rate and heart rate variability with ultra-low power consumption
Hu et al. A novel spatial-temporal convolutional neural network for remote photoplethysmography
CN114067435A (en) Sleep behavior detection method and system based on pseudo-3D convolutional network and attention mechanism
Wang et al. VitaSi: A real-time contactless vital signs estimation system
Jaiswal et al. rPPG-FuseNet: Non-contact heart rate estimation from facial video via RGB/MSR signal fusion
Das et al. Bvpnet: Video-to-bvp signal prediction for remote heart rate estimation
CN111179454A (en) Check-in and physiological parameter detection system and control method thereof
CN113011399A (en) Video abnormal event detection method and system based on generation cooperative judgment network
JP5154461B2 (en) Moving object tracking device
CN112364329A (en) Face authentication system and method combining heart rate detection
He et al. Remote photoplethysmography heart rate variability detection using signal to noise ratio bandpass filtering
CN110321781B (en) Signal processing method and device for non-contact measurement
Comas et al. Turnip: Time-series U-Net with recurrence for NIR imaging PPG
CN113693573B (en) Video-based non-contact multi-physiological-parameter monitoring system and method
Yang et al. Heart rate estimation from facial videos based on convolutional neural network
CN113598741B (en) Atrial fibrillation evaluation model training method, atrial fibrillation evaluation method and atrial fibrillation evaluation device
Ben Salah et al. Contactless heart rate estimation from facial video using skin detection and multi-resolution analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination