US20140276104A1 - System and method for non-contact monitoring of physiological parameters - Google Patents

System and method for non-contact monitoring of physiological parameters

Info

Publication number
US20140276104A1
Authority
US
United States
Prior art keywords
subject
roi
breathing
ptt
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/213,236
Inventor
Nongjian Tao
DangDang Shao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Arizona Board of Regents of ASU
Original Assignee
Arizona Board of Regents of ASU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Arizona Board of Regents of ASU filed Critical Arizona Board of Regents of ASU
Priority to US14/213,236 priority Critical patent/US20140276104A1/en
Assigned to ARIZONA BOARD OF REGENTS, A BODY CORPORATE OF THE STATE OF ARIZONA, ACTING FOR AND ON BEHALF OF ARIZONA STATE UNIVERSITY reassignment ARIZONA BOARD OF REGENTS, A BODY CORPORATE OF THE STATE OF ARIZONA, ACTING FOR AND ON BEHALF OF ARIZONA STATE UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHAO, DANGDANG, TAO, NONGJIAN
Publication of US20140276104A1 publication Critical patent/US20140276104A1/en
Priority to US15/826,224 priority patent/US11363990B2/en
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7239 Details of waveform analysis using differentiation including higher order derivatives
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/087 Measuring breath flow
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing

Definitions

  • the breathing pattern can be determined according to one embodiment by detecting and analyzing the body movement associated with breathing. Different parts of the body move differently with breathing. For example, the chest and the abdomen expand and contract, and the shoulders and the head move vertically up and down. Additionally, a person's facial features change or move with the associated movement of the shoulders and the head.
  • Conventional methods for measuring the body movement associated with breathing use a device worn by the user. This approach, as discussed above, is inconvenient.
  • the subject's chest and abdomen may have the largest movement with breathing, but these regions are not easily accessible to camera 24 for imaging under natural and free-living conditions. For this reason, movement of the subject's face, neck and upper body is detected and analyzed to determine the breathing pattern.
  • the body movement is measured via a sequence of images or a video including a plurality of images. This approach does not involve direct physical contact with the subject, is thus less invasive, and tracks the breathing pattern with a suitable device, such as the built-in camera of a computer or of a mobile device.
  • a region of 40×40 pixels around an edge of each shoulder of subject 22 is selected to be the ROIs for breathing detection, as shown in FIG. 3(a) (left panel).
  • a derivative of the ROIs is taken along a vertical direction to obtain two differential images of the ROIs. The edges of the shoulders in the differential images are revealed as bright lines in FIG. 3(b).
  • the locations of the bright lines shown in FIG. 3(b) indicate the respective positions of the edges of the subject's left shoulder and right shoulder.
  • the differential image of each selected ROI is divided into two equal portions along the shoulder edge.
  • an intensity of the top portion is referred to as dA, and an intensity of the bottom portion is referred to as dB.
  • a vertical movement signal is computed for each frame as dI = (dA − dB)/(dA + dB) (Eq. 1). The difference dA − dB is sensitive to the vertical movement and immune to noise common to dA and dB. Dividing dA − dB by dA + dB further reduces noise associated with intensity fluctuations of light source 34.
  • dI is calculated for every frame of the video sequence, and plotted against time after applying a low-pass filter with a cut-off frequency of 2 Hz, as shown in FIG. 3(a) (right panel).
  • shown in FIG. 3(a) (right panel) is an example of breathing waveforms obtained with the method described above, wherein the downhill cycles correspond to exhalation periods, when the thoracic cavity is shrinking and the shoulders move downwards, and the uphill cycles correspond to inhalation periods, when the thoracic cavity is expanding and the shoulders move upwards.
  • the breathing patterns obtained from the left shoulder 300 and the right shoulder 302 are shown in FIG. 3(a) (right panel) and are in good agreement with each other.
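  • the per-frame computation of Eq. 1 can be illustrated with the short sketch below. This is a minimal illustration rather than the patented implementation: it assumes NumPy/SciPy, grayscale frames, a user-selected 40×40 ROI, and splits the differential image at its midline rather than at the detected shoulder edge.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def breathing_signal(frames, roi_corner, fps=30.0, size=40, cutoff_hz=2.0):
    """Per-frame differential signal dI = (dA - dB)/(dA + dB) (Eq. 1).

    frames: iterable of 2-D grayscale frames (NumPy arrays)
    roi_corner: (row, col) of the top-left corner of the shoulder ROI
    """
    r, c = roi_corner
    dI = []
    for frame in frames:
        patch = frame[r:r + size, c:c + size].astype(float)
        # Vertical derivative: the shoulder edge becomes a bright line.
        diff = np.abs(np.diff(patch, axis=0))
        half = diff.shape[0] // 2          # midline split (assumption)
        dA = diff[:half].sum()             # intensity of top portion
        dB = diff[half:].sum()             # intensity of bottom portion
        dI.append((dA - dB) / (dA + dB))   # Eq. 1
    # Low-pass filter at 2 Hz to suppress high-frequency noise.
    b, a = butter(2, cutoff_hz / (fps / 2.0), btype="low")
    return filtfilt(b, a, np.asarray(dI))
```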
  • the subject was instructed to change his/her breathing pattern intentionally. Initially, the subject breathed normally for 6 cycles, as indicated by reference number 400, followed by 4 cycles of deep breathing, as indicated by reference number 402, and then 8 cycles of rapid breathing, as indicated by reference number 404.
  • the results shown in FIG. 4 demonstrate that the described method successfully captures the breathing pattern variations.
  • referring to FIG. 5, a method 500 implements a motion-tracking algorithm, based on a phase correlation method, to correct motion artifacts caused by random body movements unrelated to breathing.
  • the motion-tracking algorithm checks a shift of the ROIs due to the body movement at a suitable time interval, for example, every two seconds, and corrects the shift of each ROI by updating a new location of the ROI.
  • an ROI is selected 502 to begin method 500 .
  • a differential method is used to detect an edge at a shoulder of the subject 504 and region dA and region dB are defined 506 .
  • Body movement is calculated every 100 frames of the video sequence, for example, by a phase correlation method 508 .
  • the body movement is calculated based on a shift in an x direction, indicated as shift_x, and a shift in a y direction, indicated as shift_y.
  • Region dA and region dB are updated 510 with shift_x and shift_y.
  • dI as calculated using Eq. 1 above is plotted 512 to generate a breathing curve.
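  • the phase-correlation step 508 can be sketched as below. This is an assumption-level illustration in NumPy (the patent publishes no code); it returns the (shift_y, shift_x) used to update regions dA and dB in step 510.

```python
import numpy as np

def phase_correlation_shift(ref, cur):
    """Translation of `cur` relative to `ref` (2-D grayscale arrays),
    estimated from the normalized cross-power spectrum (step 508)."""
    cps = np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))
    cps /= np.abs(cps) + 1e-12              # keep phase only
    corr = np.abs(np.fft.ifft2(cps))        # peaks at the displacement
    shift_y, shift_x = np.unravel_index(np.argmax(corr), corr.shape)
    # Displacements past half the frame wrap around to negative values.
    if shift_y > ref.shape[0] // 2:
        shift_y -= ref.shape[0]
    if shift_x > ref.shape[1] // 2:
        shift_x -= ref.shape[1]
    return int(shift_y), int(shift_x)

# e.g., every 100 frames: move the ROI (and regions dA and dB) by the
# returned (shift_y, shift_x) before re-evaluating Eq. 1 (step 510).
```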
  • the effectiveness of the method implementing the motion-tracking algorithm is shown in FIG. 6(a), which compares the results with and without the motion-tracking algorithm.
  • the left panel of FIG. 6(a) shows an image of subject 22 with a selected ROI on the subject's left shoulder.
  • when the motion-tracking algorithm is enabled, the ROI follows the body movement (blue box 600).
  • when the motion-tracking algorithm is disabled, the ROI is fixed in the image and the shoulder may move out of the ROI (red box 602).
  • the right panel of FIG. 6(a) shows breathing patterns with the motion-tracking algorithm (blue curve 604) and without the motion-tracking algorithm (red curve 606).
  • without the motion-tracking algorithm, the measured breathing signal was overwhelmed by the body movement; with the motion-tracking algorithm, the breathing pattern is clearly observed.
  • the algorithm works effectively at least in part because the breathing-related movement of the shoulders has a small amplitude and is primarily in the vertical direction, which distinguishes it from the relatively large body movements that may occur in all directions and at time scales different from regular breathing.
  • the breathing pattern detection method described herein is validated by comparing a breathing pattern obtained with the exemplary method (red line 610) against a breathing pattern obtained with a Zephyr device (black line 612), as shown in FIG. 6(b), and against a breathing pattern obtained with an Oxycon device (black line 614), as shown in FIG. 6(c).
  • the results obtained with the image processing method described herein are in excellent agreement with the two reference technologies, not only in breathing frequency but also in relative breathing amplitude.
  • FIG. 7 shows the correlation between the exhaled breath volume obtained from the differential detection method and the commercially available Oxycon instrument.
  • the exhaled breath volume is derived from the shoulder movement signal, dI. Data from 6 tests can be fit with a linear curve; for every unit of dI, the volume change is about 0.15 liters (L).
  • an energy expenditure based on the breathing frequency and amplitude is determined.
  • One suitable approach for determining the energy expenditure is indirect calorimetry, which estimates energy expenditure from the oxygen consumption rate and the carbon dioxide production rate using the Weir equation. In its standard form, the Weir equation is EE = 1.44×(3.94×VO2 + 1.11×VCO2), Eq. (2), where EE is the energy expenditure in kcal/day.
  • VO2 is the oxygen consumption rate (ml/min) and VCO2 is the carbon dioxide production rate (ml/min). VO2 is obtained from VE and the fraction of oxygen in the exhaled air (Eq. (3)), and
  • VCO2 = VE×(FCO2 − 0.0003), Eq. (4), where FCO2 is the fraction of carbon dioxide in the exhaled air and 0.0003 is the ambient carbon dioxide fraction.
  • VE = Vb×fb, Eq. (5)
  • where Vb is a volume of exhaled air per breathing cycle and fb denotes a breathing frequency.
  • VE can also be expressed as a total exhaled volume over a period of time.
  • Vb is linearly correlated to the breathing amplitude determined with the methods disclosed above, as shown in FIG. 12. From this relationship, Vb and, thus, VE are determined, and the energy expenditure is determined from Eq. (2).
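  • the chain from the camera-derived breathing signal to energy expenditure can be sketched as follows. The exhaled gas fractions f_o2 and f_co2 and the exact form of Eq. (3) are assumptions for illustration, not values taken from the patent.

```python
def energy_expenditure_kcal_per_day(v_b_liters, f_b_per_min,
                                    f_o2=0.17, f_co2=0.04):
    """Energy expenditure from breathing volume and frequency.

    v_b_liters: exhaled volume per breathing cycle, Vb (e.g., about
                0.15 L per unit of the shoulder signal dI, per FIG. 7)
    f_b_per_min: breathing frequency, fb (breaths/min)
    f_o2, f_co2: assumed O2 and CO2 fractions in the exhaled air
    """
    v_e = v_b_liters * 1000.0 * f_b_per_min    # Eq. 5: VE in ml/min
    v_o2 = v_e * (0.2093 - f_o2)               # assumed Eq. 3 form
    v_co2 = v_e * (f_co2 - 0.0003)             # Eq. 4
    # Eq. 2, standard Weir form: kcal/day from ml/min rates.
    return 1.44 * (3.94 * v_o2 + 1.11 * v_co2)
```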
  • a non-contact optical imaging method is also used to determine pulse transit time (PTT) related information.
  • PTT pulse transit time
  • the variations in PTT of different body sites are obtained by analyzing a time difference of PPG signals.
  • in FIG. 8(a), the PTT from the subject's heart to the subject's mouth, and from the subject's heart to the subject's left palm and right palm, are indicated by t 1, t 2, and t 3, respectively.
  • the corresponding ROI selections for the three locations in the video sample are shown in FIG. 8(b) as three rectangles having different colors, namely ROI selection 800, ROI selection 802, and ROI selection 804.
  • the PPG signals obtained from the ROIs are plotted in FIG. 8(c).
  • the PPG signal detected from the mouth area (blue curve 806) arrives earlier than the PPG signals detected from the left palm area (red curve 808) and the right palm area (green curve 810).
  • a sample of the delay is shown in FIG. 8(c) to illustrate the PTT difference between the mouth and the palms from signals obtained from the ROIs.
  • the time delay is about 30 milliseconds (ms) between the PPG signals obtained from the mouth and the palms.
  • the PTT difference between the left and right palms was not obvious.
  • a first algorithm is based on comparing peak locations of different PPG signals using a linear curve fitting method.
  • FIGS. 9(a) and 9(b) show an estimated peak location for a single cycle of a PPG signal using a linear curve fitting method.
  • FIG. 9(a) shows an original PPG signal sample including one heart beat cycle from a PPG signal obtained from one subject. The heart beat cycle is selected by a dashed red rectangle 900 for further analysis. The peak location of the selected signal is estimated by fitting two linear curves, L 1 and L 2 (black dashed lines).
  • L 1 is fitted to the rising edge (left portion of the peak) and L 2 is fitted to the falling edge (right portion of the peak) of the signal, as shown in FIG. 9(b).
  • the point of intersection (indicated by the green arrow 902) of the two linear curves L 1 and L 2 is the estimated peak location in the selected heart beat cycle. PTT differences were determined by comparing the peak locations of PPG signals obtained at different body locations (e.g., the subject's mouth and the subject's fingers).
  • Table I shows PPG delay estimation results among different sites, calculated with the linear curve fitting method. The estimated delay values from the facial area to the two palm areas are similar, about 30-31 milliseconds (ms).
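  • the two-line peak estimate can be sketched as below; fitted_peak_time is a hypothetical helper that assumes NumPy and a cycle that has already been segmented, as in the dashed rectangle 900 of FIG. 9(a).

```python
import numpy as np

def fitted_peak_time(t, ppg):
    """Peak time of one PPG cycle: fit a line L1 to the rising edge and
    a line L2 to the falling edge, and return their intersection."""
    i_max = int(np.argmax(ppg))
    m1, b1 = np.polyfit(t[:i_max + 1], ppg[:i_max + 1], 1)  # L1, rising
    m2, b2 = np.polyfit(t[i_max:], ppg[i_max:], 1)          # L2, falling
    return (b2 - b1) / (m1 - m2)  # time at which L1 meets L2

# PTT difference between two sites = difference of fitted peak times,
# e.g. fitted_peak_time(t, palm) - fitted_peak_time(t, mouth) ~ 30 ms.
```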
  • FIG. 10 shows the Bland-Altman plot for heart rate detection.
  • the differences between the heart rate measured by the non-contact method described herein and the heart rate measured by a commercial pulse oximeter (y-axis) were plotted against the average heart rate measured by the two methods (x-axis).
  • the mean difference was 0.86 beats per minute (bpm) with 95% limits of agreement ( ⁇ 1.96 standard deviation) at ⁇ 2.47 bpm and 4.19 bpm.
  • the root-mean-square error (RMSE) was 1.87 bpm and r was 0.98 (p ⁇ 0.001).
  • FIG. 11 shows the Bland-Altman plot for breathing rate detection.
  • the differences between the breathing rate measured by the non-contact method described herein and the breathing rate measured by a Zephyr device (y-axis) were plotted against the average breathing rate measured by the two methods (x-axis).
  • the mean difference was 0.02 breaths/minute (min.) with 95% limits of agreement ( ⁇ 1.96 standard deviation) at ⁇ 2.40 breaths/min. and 2.45 breaths/min.
  • RMSE was 1.20 breaths/min. and r was 0.93 (p ⁇ 0.001).
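  • the agreement statistics reported above (bias, 95% limits of agreement, RMSE) follow from a short, generic Bland-Altman computation such as the sketch below; it is not code from the study.

```python
import numpy as np

def bland_altman_stats(reference, measured):
    """Mean bias, 95% limits of agreement (bias +/- 1.96 SD), and RMSE
    between a reference device and the camera-based measurement."""
    reference = np.asarray(reference, dtype=float)
    measured = np.asarray(measured, dtype=float)
    diff = measured - reference              # y-axis of the plot
    bias = diff.mean()
    sd = diff.std(ddof=1)
    limits = (bias - 1.96 * sd, bias + 1.96 * sd)
    rmse = float(np.sqrt(np.mean(diff ** 2)))
    return bias, limits, rmse
```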
  • the embodiments described herein demonstrate exemplary optical imaging-based methods for non-contact monitoring of physiological signals, including, for example, breathing frequency, exhalation flow rate, heart rate, and pulse transit time, by detecting facial color changes associated with blood circulation and body movement associated with breathing.
  • the breathing frequency and the exhalation volume flow rate are accurately tracked, and the tracking is robust against moderate body movements unrelated to breathing.
  • the physiological signals measured by the imaging methods are in excellent agreement with those obtained using reference technologies.
  • the difference in pulse transit time can be determined by the non-contact imaging method, and the results are comparable to values reported in the literature.
  • results of a small-scale pilot study involving participants of different ethnicities, sexes and ages demonstrate the basic principle of the optical imaging methods.

Abstract

A system and method for monitoring one or more physiological parameters of a subject under free-living conditions is provided. The system includes a camera configured to capture and record a video sequence including at least one image frame of at least one region of interest (ROI) of the subject's body. A computer in signal communication with the camera to receive signals transmitted by the camera representative of the video sequence includes a processor configured to process the signals associated with the video sequence recorded by the camera and a display configured to display data associated with the signals.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. provisional Application No. 61/784,646 filed on Mar. 14, 2013, which is incorporated by reference herein in its entirety.
  • BACKGROUND
  • The subject matter disclosed herein relates generally to non-contact and non-invasive monitoring of physiological signals, such as heart rate, pulse transit time and breathing pattern, and/or other physiological parameters of a subject, such as a person.
  • Monitoring vital physiological signals, such as heart rate, pulse transit time and breathing pattern, is a basic requirement in the diagnosis and management of various diseases. Traditionally, these signals are measured only in hospital and clinical settings. An important recent trend is the development of portable devices for tracking vital physiological signals non-invasively based on optical methods known as photoplethysmography (PPG). These portable devices, when combined with cell phones, tablets or other mobile devices, provide a new opportunity for everyone to monitor one's vital signs anytime and anywhere. These mobile device based efforts can be divided into the following two approaches.
  • The first approach is optical detection of a person's finger pressed on the portable device or on a camera built into the mobile device to perform PPG. While useful, the results are affected by how hard the person presses on the camera, and also by the ambient lighting conditions. Further, the need for steady physical contact between the person's finger and the portable device makes it impractical for continuously monitoring physiological signals under free-living conditions. The second optical approach is based on a non-contact mode. For example, heart and breathing rates are obtained from images of a person's face, upper arms, and palms recorded with a digital camera, such as a smartphone camera or a webcam. In addition to heart and breathing rates, heart rate variability (HRV) has been analyzed from facial video images. More recently, a near-IR enhanced camera has been used to obtain heart rate from a person's facial area and breathing rate from the person's chest area.
  • The signals extracted from the images obtained or captured using these imaging-based non-contact approaches contain noise from various sources. To combat the noise issue, at least one conventional method used independent component analysis (ICA) to separate a multivariate signal into additive subcomponents, supposing the mutual statistical independence of the non-Gaussian source signals. Using ICA, heart rate, which typically varies between 0.8 and 3 Hertz (Hz), has been detected. Further, at least one conventional method determined a movement artifact map by averaging the powers at bandwidths around the heart rate. These efforts helped to minimize unwanted noise in the measured heart rate signals. However, it is much more challenging to track a breathing pattern, especially breath by breath, because the breathing frequency is much lower than the heart rate. In a typical ambient environment, low-frequency noise, particularly noise associated with body movement, is much greater than noise at high frequencies.
  • SUMMARY
  • In one aspect, a system for monitoring one or more physiological parameters of a subject under free-living conditions includes a camera configured to capture and record a video sequence including at least one image frame of at least one region of interest (ROI) of the subject's body. A computer is in signal communication with the camera to receive signals transmitted by the camera representative of the video sequence. The computer includes a processor configured to process the signals associated with the video sequence recorded by the camera and a display configured to display data associated with the signals.
  • In another aspect, a method for monitoring a breathing pattern of a subject includes selecting a region of pixels around an edge of each shoulder of the subject to be the regions of interest (ROIs), determining a derivative of the ROIs along a vertical direction to obtain two differential images of the ROIs, determining a position of each shoulder by dividing the differential image of each selected ROI into a top portion and an equal bottom portion along the edge of the shoulder, wherein an intensity of the top portion is dA and an intensity of the bottom portion is dB, and determining a vertical movement of each shoulder for every frame of the video sequence.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic view of an exemplary system configured for non-contact and non-invasive monitoring of physiological parameters of a subject;
  • FIG. 2(a) shows an original image with a region of interest (ROI) (blue rectangle) near a mouth of the subject (shown on the left portion of FIG. 2(a)) and a Fast Fourier Transform (FFT) spectrum of the ROI (shown on the right portion of FIG. 2(a)) illustrating red, green and blue lines representing the R, G and B color channels, respectively;
  • FIG. 2(b) shows a colormap of the FFT peak amplitude in each pixel at the heart beating frequency (heart rate), illustrating a color scale from blue to red indicating the FFT peak amplitude at the heart rate;
  • FIG. 2(c) shows a signal-to-noise ratio (SNR) colormap at the heart rate;
  • FIG. 2(d) shows a heart beat waveform obtained with an exemplary method;
  • FIG. 2(e) is a heart beat detection validation illustrating a heart beat waveform obtained from a commercial device;
  • FIG. 3(a) illustrates tracking shoulder movement of the subject to detect a breathing pattern, wherein the left panel of FIG. 3(a) shows a selected ROI on each shoulder (red box), each ROI being divided into two sub-regions, sub-region A and sub-region B, along a vertical direction, and the right panel of FIG. 3(a) shows corresponding breathing cycles from the ROIs using a detection method;
  • FIG. 3(b) shows a zoomed-in image of the subject's left shoulder in the left panel and a derivative image with respect to the vertical direction in the right panel, wherein the shoulder edge is shown as a bright line;
  • FIG. 4 shows different breathing patterns obtained by an exemplary differential method;
  • FIG. 5 shows a workflow for a method of tracking body movement using a motion-tracking algorithm;
  • FIG. 6(a) illustrates the effectiveness of an exemplary motion-tracking algorithm for detecting a breathing pattern, wherein the left panel shows an image of a subject with a selected ROI on the subject's left shoulder and the right panel shows breathing patterns with the motion-tracking algorithm (blue curve) and without the motion-tracking algorithm (red curve);
  • FIG. 6(b) illustrates a comparison of breathing patterns obtained with an exemplary method as described herein (red line) and with a Zephyr device (black line);
  • FIG. 6(c) illustrates a comparison of breathing patterns obtained with an exemplary method as described herein (red line) and an Oxycon device (black line);
  • FIG. 7 shows a correlation between exhaled breath volumes obtained from an exemplary differential detection method and an Oxycon device;
  • FIG. 8(a) shows a PTT definition for three sites of the subject's body;
  • FIG. 8(b) shows corresponding ROIs of the three sites shown in FIG. 8(a);
  • FIG. 8(c) shows PPG signals obtained from the ROIs shown in FIG. 8(b), with a time delay of about 30 milliseconds (ms) between PPG signals obtained from the mouth and the palm;
  • FIG. 9(a) shows an estimated peak location for a single cycle of a PPG signal using a linear curve fitting method, wherein one cycle of the PPG signal is taken and two linear curves (black dashed lines) are used to fit the original signal from a left part (red) and a right part (blue) independently;
  • FIG. 9(b) shows a point of intersection of the two linear curves (green arrow) at the estimated peak location in that particular heart beat cycle;
  • FIG. 10 illustrates a Bland-Altman plot showing the average of the heart rates measured by a commercial pulse oximeter and by an exemplary method as described herein, plotted against the difference between them; and
  • FIG. 11 illustrates a Bland-Altman plot showing the average of the breathing rates measured by a commercial Zephyr device and by an exemplary method as described herein, plotted against the difference between them.
  • Other aspects and advantages of certain embodiments will become apparent upon consideration of the following detailed description, wherein similar structures have similar reference numerals.
  • DETAILED DESCRIPTION
  • The embodiments described herein relate to an optical imaging-based system and associated methods for measuring one or more vital physiological signals of a subject, such as a human patient, including, without limitation, a breathing frequency, an exhalation flow rate, a heart rate, and/or a pulse transit time. In one embodiment, a breathing pattern is tracked based on detection of body movement associated with breathing using a differential signal processing approach. A motion-tracking algorithm is implemented to correct random body movements that are unrelated to breathing. In a particular embodiment, a heart beat pattern is obtained from a color change in selected regions of interest (“ROI”) near the subject's mouth, and a pulse transit time is determined by analyzing pulse patterns at different locations of the subject. The embodiments of the imaging-based methods described herein are suitable for tracking vital physiological parameters under a free-living condition. For example, a user can measure his/her vital signs during regular or routine activity, such as working on a computer, or checking a message using a mobile device, such as a cell phone or a tablet. The applications on the computer or mobile device run in the background with no or minimal attention from the user. Additionally, the user does not have to purchase, carry and/or maintain additional devices.
  • The embodiments described herein provide a method for non-contact monitoring of several physiological signals in real-time by maximizing the signals while minimizing noise due to unwanted body movement. In addition to heart rate and breathing frequency, the exhalation volume flow rate and cardiac pulse transit time (PTT) are obtained in certain embodiments. In one embodiment, movement of one or more selected regions of interest (referred to herein as an "ROI") of the body is detected and used to determine and track a breathing pattern, including a breathing frequency and an amplitude, for example, from which an exhaled volume flow rate is obtained. In a particular embodiment, one or more color changes of one or more selected regions of interest are used to determine heart rate and face paleness. Exhalation flow rate may be an important physiological parameter and is proportional to a subject's metabolic rate. PTT is related to blood pressure pulse wave velocity (PWV), reflecting cardiovascular parameters such as arterial elasticity and stiffness. Traditionally, PWV has been measured using galvanometer and ultrasound techniques. More recently, PTT has been determined by performing simultaneous ECG and PPG. For example, in certain conventional techniques contact pulse oximetry is used to determine a difference in PTT between a left index finger and a left second toe. The PTT difference is related to a change in arterial distensibility due to an epidurally induced sympathetic block. Conversely, in the embodiments described herein a non-contact optical imaging method is used to determine a PTT difference, along with a breath-by-breath breathing pattern and an exhalation flow rate.
  • Referring to FIGS. 1-11, and particularly to FIG. 1, a system 20 is configured to monitor one or more physiological parameters of a subject 22, including, for example, a heart rate (HR), a breathing frequency (BF), an exhalation flow rate and/or a pulse transit time (PTT), by processing images captured with one or more digital cameras using one or more algorithms. In the embodiment shown, the HR and the PTT are detected by tracking an image intensity change of the subject's skin, and the BF and the exhalation volume (VE) are detected by tracking subtle body movements of the subject associated with breathing.
  • Referring further to FIG. 1, system 20 includes one or more digital cameras, such as one or more digital cameras 24, in signal communication with a computer 26 having a display 28 configured to display data and parameters associated with signals received from camera 24, such as optical images captured by and acquired from camera 24. Cameras 24 are configured to capture video images and process the video images into associated signals that are then transmitted to computer 26 for further processing before the video images are displayed on display 28. In a certain embodiment, system 20 includes a Logitech color webcam (HD 720p), a Pike black-and-white camera (F-032B), and a Pike color camera (F-032C), used to capture video sequences or images of subject 22, such as video images of the subject's face, palms, and upper body, respectively. Different cameras (color or black-and-white) have different inherent noise, but suitable cameras produce satisfactory results in terms of determining the physiological parameters.
  • In alternative embodiments, a mobile device, such as a cell phone or a tablet, is used by an individual to monitor his or her vital signs at any time and/or anywhere. These mobile devices are equipped not only with wireless communication capabilities, but also with other functions and components, such as a camera, a microphone, an accelerometer, and/or a global positioning system (GPS) navigation device, as well as computational power for signal processing.
  • For example, a face color index can be determined with a mobile device. The color of a human face provides important health information. For example, if someone is sick, his/her face is often pale. Sleep deprivation often shows up as dark circles under the eyes, and liver, kidney, or thyroid disease may cause chronic dark circles. Further, a person's blood sugar level also has an impact on the color or darkness around the person's eyes. However, accurately capturing the color change under a free-living environment is difficult because of the variability in the lighting conditions in most ambient environments. In one embodiment, the method overcomes this difficulty by using light emitted from the mobile device screen (i.e., a cell phone display screen). In one embodiment, an application is downloaded on the mobile device to activate the video recorder to capture and record the video sequence. From the video sequence, a red component from selected regions of interest of the person's face is analyzed. To minimize the effect of ambient light, an image is captured before turning on the screen so that the signals from the uncontrolled ambient light can be removed from the analysis.
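  • As an illustrative sketch of this ambient-light correction (the ROI handling, RGB channel order, and function name are assumptions, not details from the patent), the red component can be baseline-corrected as follows:

```python
import numpy as np

def ambient_corrected_red(dark_frame, lit_frame, roi):
    """Red-channel face color index with ambient-light correction:
    subtract a frame captured before the screen turns on."""
    r0, r1, c0, c1 = roi                       # hypothetical ROI bounds
    red_dark = dark_frame[r0:r1, c0:c1, 0].astype(float)  # screen off
    red_lit = lit_frame[r0:r1, c0:c1, 0].astype(float)    # screen on
    # The remaining red signal is attributable to the screen light.
    return float((red_lit - red_dark).mean())
```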
  • Computer 26 includes one or more processors 30 configured to process the signals associated with the video images captured by cameras 24. In one embodiment, each processor 30 receives programmed instructions from software, firmware and data from memory 32 and performs various operations using the data and instructions. Each processor 30 may include an arithmetic logic unit (ALU) that performs arithmetic and logical operations and a control unit that extracts instructions from memory 32 and decodes and executes the instructions, calling on the ALU when necessary. Memory 32 generally includes a random-access memory (RAM) and a read-only memory (ROM). However, there may be other types of memory such as programmable read-only memory (PROM), erasable programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM). In addition, memory 32 may include an operating system, which executes on processor 30. The operating system performs basic tasks that include recognizing input, sending output to output devices, such as display 28, keeping track of files and directories and controlling various peripheral devices.
  • As used herein, references to “processor” are to be understood to refer to central processing units, microprocessors, microcontrollers, reduced instruction set circuits (RISC), application specific integrated circuits (ASIC), logic circuits and any other circuit or processor capable of executing the functions described herein. Memory 32 may include storage locations for the preset macro instructions that may be accessible using a preset switch, for example.
  • As used herein, references to “software” and “firmware” are interchangeable, and are to be understood to refer to and include any computer program stored in memory for execution by processor 30, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and non-volatile RAM (NVRAM) memory. The above memory types are exemplary only, and are thus not limiting as to the types of memory usable for storage of a computer program. In various embodiments, processor 30 and memory 32 are located external to camera 24 such as in computer 26 or another suitable standalone or mainframe computer system capable of performing the functions described herein. In one embodiment, video images are transferred to memory 32 or digitized. In an alternative embodiment, processor 30 is located within camera 24. In this embodiment, processor 30 may be in signal communication with a display of the camera 24 or mobile device in which the camera is housed or in signal communication with an external computer having a display configured to display data associated with the signals generated by camera 24 and processed by processor 30.
  • In one embodiment, the video sequences or images are taken under ambient light conditions. In a particular embodiment, one or more controlled light sources 34, such as one or more light emitting diode (LED) sources and/or a suitable desk lamp, are used. Subject 22 sits at a distance of 30 to 80 centimeters (cm), and more particularly about 50 cm, from a lens of camera 24 to ensure good quality and clear focus for the captured images and associated signals. In one embodiment, the imaging method uses only ambient light and low-cost CMOS imagers (e.g., a webcam), which is suitable for tracking physiological parameters under free-living conditions. The method can be readily adapted to a mobile platform, such as a cell phone or tablet with a built-in camera, as described above. The use of personal mobile devices reduces the privacy concern of imaging-based detection. Because the approach is noninvasive, an additional benefit is that the results truly reflect the person's health status, without the known "white coat effect," a phenomenon in which patients exhibit elevated blood pressure in a clinical setting.
  • A user interface 36, such as a Matlab-based user interface or other suitable user interface, is used to analyze the captured video sequences and data. In one embodiment, user interface 36 is capable of showing a live or real-time video sequence of subject 22, which allows selection of regions of interest (ROIs), and allows processor 30 to perform signal processing of the data in the ROIs to determine the heart beat and breathing signals independently and to display the results in real time on display 28.
  • Heart Beat Monitoring. In one embodiment, camera 24 captures and records a video sequence or video images of a subject's face for a suitable time period, such as 30 seconds. Processor 30 is configured to perform a Fast Fourier Transform (FFT) on the intensity signal averaged over all the pixels in each selected ROI to determine the frequency components of the video signal within a noisy time domain signal. In certain embodiments, a longer recording time may produce a better signal, but is less user friendly because it requires a longer testing time. The FFT spectrum of the ROI clearly reveals the heart beat signal as a peak at a frequency corresponding to the heart rate. In order to optimize the signal-to-noise ratio (SNR), the results of a red channel 200, a blue channel 202, and a green channel 204 are compared, and green channel 204 has been found to give the strongest heart beat signal, or the largest peak amplitude in the FFT spectrum, as shown in FIG. 2(a). One possible reason is that oxygenated hemoglobin absorbs green light more strongly than red light, and green light penetrates deeper into the subject's skin than blue light. Because the SNR may also depend on the selection of ROI, in one embodiment the peak amplitude in each pixel is extracted and plotted on a colormap to analyze the variation of the heart beat signal in different areas of the face, as shown in the signal colormap of FIG. 2(b). The areas around the subject's lips and nose regions have larger heart beat amplitudes, which is consistent with the fact that these regions have more blood vessels. Referring to FIG. 2(b), the eye regions and the face edges also appear to have large heart beat amplitudes, which is due to body movement rather than real heart beat signals. This conclusion is supported by the SNR colormap shown in FIG. 2(c), obtained by normalizing the peak amplitude in the FFT spectrum of each pixel with the noise level near the peak. As shown in FIG. 2(c), the SNR colormap shows that the regions around the eyes and the edges of the subject's face have rather low SNR values.
  • The signal analysis described herein leads to the conclusion that the region around the lips gives the strongest and most stable heart beat signal. For real-time determination of heart rate, the region around the lips is selected with an ROI size of 40×80 pixels, and green channel 204 of the ROI is analyzed for heart beat detection. In one embodiment, the green channel signal is first averaged within the ROI and then processed by a low-pass filter with a cut-off frequency of 2 Hz to remove high-frequency background noise. FIG. 2(d) shows a heart beat signal 206 obtained by this process. As described herein and referring to FIG. 2(e), a Zephyr wearable device or other suitable device can be used to obtain heart beat waveforms 208 as a reference to validate the results herein. The heart rate calculated from the ECG measured by the Zephyr wearable device, as shown in FIG. 2(e), is comparable to the heart rate obtained with the methods described herein.
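  • By way of illustration, the following is a minimal Python sketch of this heart rate processing chain (ROI averaging of the green channel, 2 Hz low-pass filtering, and FFT peak search), assuming the video frames are available as RGB NumPy arrays; the function name, frame format, and the 0.7-2.0 Hz search band are illustrative assumptions rather than part of the described embodiments.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def heart_rate_from_roi(frames, fps, roi):
    """Estimate heart rate (bpm) from the green channel of a facial ROI."""
    r, c, h, w = roi
    # Spatially average the green channel over the ROI in each frame,
    # e.g. a 40x80-pixel region around the lips.
    g = np.array([f[r:r + h, c:c + w, 1].mean() for f in frames])
    g = g - g.mean()
    # Low-pass filter at 2 Hz to remove high-frequency background noise.
    b, a = butter(4, 2.0 / (fps / 2.0), btype="low")
    g = filtfilt(b, a, g)
    # FFT: the heart beat shows up as a peak at the heart-rate frequency.
    spectrum = np.abs(np.fft.rfft(g))
    freqs = np.fft.rfftfreq(len(g), d=1.0 / fps)
    # Search a plausible heart-rate band (0.7-2.0 Hz, i.e. 42-120 bpm).
    band = (freqs >= 0.7) & (freqs <= 2.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]
```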
  • Breathing Pattern Monitoring. Unlike heart rate monitoring, the breathing pattern can be determined according to one embodiment by detecting and analyzing the body movement associated with breathing. Different parts of the body move with breathing differently. For example, the chest and the abdomen expand and contract, and the shoulders and the head move, such as up and down in a vertical direction. Additionally, a person's facial features will change or move with the associated movement of the shoulders and the head. Conventional methods for measuring the body movement associated with breathing use a device worn by the user. This approach, as discussed above, is inconvenient.
  • The subject's chest and abdomen may have the largest movement with breathing, but these regions are not easily accessible to camera 24 for imaging under natural and free-living conditions. For this reason, movement of the subject's face, neck and upper body is detected and analyzed to determine the breathing pattern. In this embodiment, the body movement is measured via a sequence of images or a video including a plurality of images, which does not involve direct physical contact with the subject and is thus less invasive, and tracks the breathing pattern with a suitable device, such as a built-in camera of a computer or a built-in camera of a mobile device. In one embodiment, a region of 40×40 pixels around an edge of each shoulder of subject 22 is selected as the ROIs for breathing detection, as shown in FIG. 3(a) (left panel). A derivative of the ROIs is taken along a vertical direction to obtain two differential images of the ROIs. The edges of the shoulders in the differential images are revealed as bright lines in FIG. 3(b).
  • The locations of the bright lines shown in FIG. 3( b) indicate the respective positions of the edges of the subject's left shoulder and right shoulder. To accurately determine the shoulder positions, the differential image of each selected ROI is divided into two equal portions along the shoulder edge. An intensity of a top portion is referred to as dA, and an intensity of a bottom portion is referred to as dB. When the shoulders move up and down with breathing, dA increases (or decreases) and dB decreases (or increases). The vertical movement of shoulders can be determined by:
  • dI = (dA − dB)/(dA + dB).   Eq. (1)
  • The difference, dA−dB, in Eq. 1 is sensitive to the vertical movement and is immune to noise common to dA and dB. Dividing dA−dB by dA+dB further reduces noise associated with intensity fluctuations of light source 34. dI is calculated for every frame of the video sequence and plotted against time after applying a low-pass filter with a cut-off frequency of 2 Hz, as shown in FIG. 3(a) (right panel).
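  • The computation of Eq. (1) is compact enough to sketch directly. The following Python fragment is a minimal, illustrative implementation, assuming grayscale frames as NumPy arrays and an ROI whose midline sits roughly at the shoulder edge; the function name and ROI format are assumptions for illustration only.

```python
import numpy as np

def breathing_signal(frames, roi):
    """Differential breathing signal dI of Eq. (1) from a shoulder ROI."""
    r, c, h, w = roi
    signal = []
    for frame in frames:
        patch = frame[r:r + h, c:c + w].astype(float)
        # Derivative along the vertical direction: the shoulder edge
        # appears as a bright line in the differential image.
        diff = np.abs(np.diff(patch, axis=0))
        half = diff.shape[0] // 2
        dA = diff[:half].sum()  # intensity of the top portion
        dB = diff[half:].sum()  # intensity of the bottom portion
        # Eq. (1): the normalized difference is sensitive to vertical
        # movement and robust to light-source intensity fluctuations.
        signal.append((dA - dB) / (dA + dB))
    return np.array(signal)
```

  In practice, the resulting trace would then be low-pass filtered at 2 Hz, as described above.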
  • Shown in FIG. 3(a) (right panel) is an example of breathing waveforms obtained with the method described above, wherein the downhill cycles correspond to exhalation periods, when the thoracic cavity is shrinking and the shoulders move downwards, and the uphill cycles correspond to inhalation periods, when the thoracic cavity is expanding and the shoulders move upwards. The breathing patterns obtained from the left shoulder 300 and the right shoulder 302 are both shown in FIG. 3(a) (right panel) and are in good agreement with each other.
  • To further demonstrate the reliability of the method for real-time monitoring of a breathing pattern, the subject was instructed to change his/her breathing pattern intentionally. Initially, the subject breathed normally for 6 cycles, as indicated by reference number 400, followed by 4 cycles of deep breathing, as indicated by reference number 402, and then 8 cycles of rapid breathing, as indicated by reference number 404. The results shown in FIG. 4 demonstrate that the described method successfully captures the breathing pattern variations.
  • The accuracy of a breathing pattern measurement may be affected by large body movements unrelated to breathing during these measurements. In one embodiment, a method 500 implements a motion-tracking algorithm based on a phase correlation method to correct such motion artifacts. The motion-tracking algorithm checks the shift of the ROIs due to body movement at a suitable time interval, for example, every two seconds, and corrects the shift of each ROI by updating the ROI's location. Referring to FIG. 5, an ROI is selected 502 to begin method 500. A differential method is used to detect an edge at a shoulder of the subject 504, and region dA and region dB are defined 506. Body movement is calculated every 100 frames of the video sequence, for example, by a phase correlation method 508. In this embodiment, the body movement is calculated as a shift in an x direction, indicated as shift_x, and a shift in a y direction, indicated as shift_y. Region dA and region dB are updated 510 with shift_x and shift_y. dI, as calculated using Eq. 1 above, is plotted 512 to generate a breathing curve.
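  • As an illustration of step 508, a minimal phase-correlation sketch in Python follows, assuming two equal-size grayscale patches (the ROI in a reference frame and in the current frame); the function name and the sign convention of the returned shift are assumptions.

```python
import numpy as np

def phase_correlation_shift(ref_patch, cur_patch):
    """Estimate (shift_y, shift_x) between two equal-size image patches."""
    # Normalized cross-power spectrum; its inverse FFT peaks at the
    # translation between the two patches.
    cross = np.fft.fft2(ref_patch) * np.conj(np.fft.fft2(cur_patch))
    cross /= np.abs(cross) + 1e-12  # keep phase information only
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint wrap around to negative shifts.
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```

  The returned shifts would then be used to re-center regions dA and dB, as in step 510.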
  • The effectiveness of this method implementing the motion-tracking algorithm is shown in FIG. 6( a), which compares the results with and without the motion-tracking algorithm. The left panel of FIG. 6( a) shows an image of subject 22 with a selected ROI on the subject's left shoulder. When the exemplary motion-tracking algorithm as described herein is enabled, the ROI follows the body movement (blue box 600). In contrast, when the motion-tracking algorithm is disabled, the ROI is fixed in the image and the shoulder may move out of the ROI (red box 602). The right panel of FIG. 6( a) shows breathing patterns with the motion-tracking algorithm (blue curve 604) and without the motion-tracking algorithm (red curve 606). Without applying the motion-tracking algorithm, the measured breathing signal was overwhelmed by the body movement. In contrast, the breathing pattern is clearly observed with the implementation of the motion-tracking algorithm. The algorithm worked effectively at least in part because the breathing-related body movement of the shoulders has a small amplitude and is primarily in the vertical direction, which is different from the relatively large body movement that may occur in all directions and at time scales different from the regular breathing.
  • Referring to FIGS. 6( b) and 6(c), the breathing pattern detection method as described herein is validated by comparing the results of a breathing pattern obtained with the exemplary method (red line 610) and a breathing pattern obtained with a Zephyr device (black line 612), as shown in FIG. 6( b), and a breathing pattern obtained with an Oxycon device (black line 614), as shown in FIG. 6( c). The results obtained with the image processing method described herein are in excellent agreement with the two different reference technologies, not only in a breathing frequency but also in a relative breathing amplitude.
  • Determination of Exhalation Flow Rate. The amplitude of the breathing-related shoulder movement is associated with the exhalation volume per breathing cycle, or the exhalation flow rate. The relationship was examined by plotting this amplitude against the exhalation flow rate obtained with the Oxycon instrument. Six tests were carried out, and in each test the subject changed the exhalation flow rate. FIG. 7 plots the breathing amplitude (the peak-to-peak differential signal, dI) of the tests against the exhaled breath volume obtained with the Oxycon instrument (Oxycon volume (L)), showing the correlation between the exhaled breath volume obtained from the differential detection method, in which the exhaled breath volume is taken from the shoulder movement, or dI, and that obtained from the commercially available Oxycon instrument. The data from the 6 tests can be fit with a linear curve (R2=0.81). For every unit of dI, the volume change is about 0.15 liters (L). This observation demonstrates a method to remotely determine the exhalation flow rate under free-living conditions.
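  • A calibration of this kind reduces to a one-degree polynomial fit; the sketch below is illustrative only, assuming paired camera-derived dI amplitudes and reference volumes are available as arrays (the function names are hypothetical, and the 0.15 L-per-unit default is the single-subject value reported above).

```python
import numpy as np

def calibrate_volume(dI_amplitude, reference_volume):
    """Fit the linear dI-to-volume relationship (the text reports
    R^2 = 0.81 and a slope of about 0.15 L per unit of dI)."""
    slope, intercept = np.polyfit(dI_amplitude, reference_volume, 1)
    r = np.corrcoef(dI_amplitude, reference_volume)[0, 1]
    return slope, intercept, r ** 2

def exhaled_volume(dI_amplitude, slope=0.15, intercept=0.0):
    """Convert a dI amplitude into exhaled breath volume (liters)."""
    return slope * dI_amplitude + intercept
```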
  • In one embodiment, an energy expenditure is determined based on the breathing frequency and amplitude. One suitable approach is indirect calorimetry, which determines the energy expenditure from the rates of consumed oxygen and produced carbon dioxide using the Weir equation. The equation takes the form of:

  • EE (kcal/day) = [3.9 × VO2 + 1.1 × VCO2] × 1.44,   Eq. (2)
  • where VO2 is the oxygen consumption rate (ml/min) and VCO2 is the carbon dioxide production rate (ml/min). VO2 and VCO2 can be further expressed in terms of VE, as given by:

  • VO2 = VE × (0.2093 − FO2), and   Eq. (3)

  • VCO2 = VE × (FCO2 − 0.0003),   Eq. (4)
  • where FO2 and FCO2 are a fraction of oxygen and a fraction of carbon dioxide, respectively, in an exhaled breath. For most people, FO2 and FCO2 tend to be constant, at least under the same conditions. This means that the energy expenditure is simply proportional to VE, which is given by:

  • VE = Vb × fb,   Eq. (5)
  • where Vb is the volume of exhaled air per breathing cycle, and fb denotes the breathing frequency. Alternatively, VE can be expressed as a total exhaled volume over a period of time.
  • In one embodiment, Vb is linearly correlated with the breathing amplitude determined with the methods disclosed above, as shown in FIG. 12. From this relationship, Vb and, thus, VE are determined, and the energy expenditure is determined from Equation 2.
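  • Putting Eqs. (2)-(5) together, an illustrative Python sketch of the whole chain might look as follows; the exhaled-gas fractions fo2 and fco2 are assumed typical resting values (the text states only that they are roughly constant for a given person under the same conditions), and fb is assumed to be in breaths per minute.

```python
def energy_expenditure(dI_amplitude, fb, fo2=0.17, fco2=0.04,
                       liters_per_dI=0.15):
    """Energy expenditure (kcal/day) from camera-derived breathing
    amplitude and frequency via Eqs. (2)-(5); fo2/fco2 are assumed
    typical resting exhaled fractions, not values from the text."""
    vb = liters_per_dI * dI_amplitude   # exhaled volume per cycle (L)
    ve = vb * fb * 1000.0               # Eq. (5), converted to ml/min
    vo2 = ve * (0.2093 - fo2)           # Eq. (3), ml/min
    vco2 = ve * (fco2 - 0.0003)         # Eq. (4), ml/min
    return (3.9 * vo2 + 1.1 * vco2) * 1.44  # Eq. (2), kcal/day
```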
  • Pulse Transit Time. In one embodiment, a non-contact optical imaging method is also used to determine pulse transit time (PTT) related information. The variations in PTT at different body sites are obtained by analyzing the time differences of PPG signals. In FIG. 8(a), the PTT from the subject's heart to the subject's mouth, and from the subject's heart to the subject's left palm and right palm, are indicated by t1, t2, and t3, respectively. The corresponding ROI selections of the three locations in the video sample are shown in FIG. 8(b) as three rectangles having different colors, namely ROI selection 800, ROI selection 802, and ROI selection 804. The PPG signals obtained from the ROIs are plotted in FIG. 8(c). Time differences were found between PPG signals from different regions of the subject's body in every heart cycle. The PPG signal detected from the mouth area (blue curve 806) arrived earlier than the PPG signals detected from the left palm area (red curve 808) and the right palm area (green curve 810). A sample of the delay is shown in FIG. 8(c) to illustrate the PTT difference between the mouth and the palms from signals obtained from the ROIs. The time delay between the PPG signals obtained from the mouth and the palms is about 30 milliseconds (ms). The PTT difference between the left and right palms was not obvious.
  • Several signal processing algorithms can be utilized to determine the time differences in PTT among the different body sites. In one embodiment, a first algorithm is based on comparing peak locations of different PPG signals using a linear curve fitting method. FIGS. 9(a) and 9(b) show an estimated peak location for a single cycle of a PPG signal using the linear curve fitting method. FIG. 9(a) shows an original PPG signal sample including one heart beat cycle from a PPG signal obtained from one subject. The heart beat cycle is selected by a dashed red rectangle 900 for further analysis. The peak location of the selected signal is estimated by fitting two linear curves L1 and L2 (black dashed lines). L1 is positioned on the rising edge (left portion of the peak) and L2 is positioned on the falling edge (right portion of the peak) of the signal, as shown in FIG. 9(b). The point of intersection (indicated by green arrow 902) of the two linear curves L1 and L2 is the estimated peak location in the selected heart beat cycle. PTT differences were determined by comparing the peak locations of PPG signals obtained at different body locations (e.g., the subject's mouth and the subject's fingers).
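  • The geometry of this estimate is simple; a minimal Python sketch is given below, assuming the sample indices belonging to the rising and falling edges of one cycle have already been selected (the function name and index arguments are illustrative).

```python
import numpy as np

def peak_by_line_intersection(t, ppg, rise_idx, fall_idx):
    """Estimate the peak time of one PPG cycle as the intersection of
    line L1 (fit to the rising edge) and L2 (fit to the falling edge)."""
    m1, b1 = np.polyfit(t[rise_idx], ppg[rise_idx], 1)  # L1: rising edge
    m2, b2 = np.polyfit(t[fall_idx], ppg[fall_idx], 1)  # L2: falling edge
    return (b2 - b1) / (m1 - m2)  # time at which L1 and L2 intersect
```

  PTT differences then follow by subtracting the per-cycle peak times estimated at two body sites.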
  • To validate the results, simultaneous measurements of the physiological signals were carried out with different reference technologies. For heart rate measurement, a Zephyr ECG was used. For breathing pattern measurement, two reference technologies, viz., a Zephyr wearable device and an Oxycon metabolic analysis instrument, were used. The Zephyr device used a movement sensor integrated in a belt wrapped around the subject's chest. Because the Zephyr device does not provide exhaled flow rate, the Oxycon instrument was used to measure both the breathing frequency and the exhalation volume flow rate via a turbine flow meter attached to a mask worn by the subject. To validate the PTT results, several feature extraction algorithms can be used, along with EPIC motion sensors (PS25451).
  • As shown in Table I below, nine tests taken from one subject were analyzed to obtain the average value of the PTT difference between the subject's mouth area and the subject's palm areas. Each test lasted for 30 seconds. The PTT difference for each test is an average result from all the available heart beat cycles in that time period. Table I shows the PPG delay estimation results among the different sites. The values are calculated based on the linear curve fitting method. The estimated delay values obtained from the facial area to the two palm areas are similar, about 30-31 milliseconds (ms).
  • TABLE I
    Test No.   | Heart Rate (bpm) | PTT Difference from Left Palm to Mouth (ms) | PTT Difference from Right Palm to Mouth (ms)
    1          |  72              | 22.50                                       | 22.28
    2          |  78              | 34.50                                       | 32.37
    3          |  78              | 28.36                                       | 29.35
    4          |  72              | 30.39                                       | 32.23
    5          |  72              | 23.35                                       | 27.82
    6          |  71              | 33.97                                       | 35.64
    7          | 106              | 25.99                                       | 28.26
    8          |  96              | 34.56                                       | 36.04
    9          |  96              | 34.13                                       | 35.61
    Average    |                  | 29.75                                       | 31.07
    SD/Average |                  | 16%                                         | 15%

  • Matlab functions findpeaks and xcorr were also used to estimate the value of the PTT difference. Function findpeaks provides the peak location of the input data by searching for the local maximum value of the sequence. Function xcorr estimates the phase shift between two signals by computing their cross-correlation and searching for the lag that gives the largest correlation value. However, the standard deviations of the PTT differences calculated with these two methods were higher than the standard deviation obtained with the first method (the linear curve fitting method). Therefore, the first method was used to estimate the PTT differences among the different body sites. The test results in Table I show that the difference in PTT between the palm and the mouth is about 30 ms. The results are consistent with the values of the PTT difference between ears and fingers reported by other researchers.
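  • For readers outside Matlab, an xcorr-style estimate can be reproduced with SciPy; the sketch below is illustrative, assuming two PPG traces sampled at the same frame rate (the function name is an assumption). As noted above, this estimator showed a higher standard deviation than the linear curve fitting method.

```python
import numpy as np
from scipy.signal import correlate

def ptt_delay_ms(ppg_ref, ppg_other, fps):
    """Delay (ms) of ppg_other relative to ppg_ref, taken as the lag
    that maximizes their cross-correlation."""
    a = np.asarray(ppg_ref, float) - np.mean(ppg_ref)
    b = np.asarray(ppg_other, float) - np.mean(ppg_other)
    corr = correlate(b, a, mode="full")
    lags = np.arange(-(len(a) - 1), len(b))  # lag of b relative to a
    return 1000.0 * lags[np.argmax(corr)] / fps
```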
  • Small-Scale Pilot Study. To demonstrate the robustness of the developed methods for monitoring physiological signals, a small-scale pilot study was conducted for statistical analyses. Ten participants were enrolled in the study, which was approved by the Arizona State University Institutional Review Board (IRB). The participants were of different genders (6 males, 4 females), ages (27.3±4.5 years old, mean±S.D.), ethnic profiles, and skin colors. Informed consent was obtained from all participants following the approved protocol.
  • Bland-Altman plots were used to analyze the agreement between the presented physiological signal detection methods and the reference technologies. FIG. 10 shows the Bland-Altman plot for heart rate detection. The differences between the heart rate measured by the non-contact method described herein and the heart rate measured by a commercial pulse oximeter (y-axis) were plotted against the average heart rate measured by the two methods (x-axis). The mean difference was 0.86 beats per minute (bpm) with 95% limits of agreement (±1.96 standard deviations) at −2.47 bpm and 4.19 bpm. The root-mean-square error (RMSE) was 1.87 bpm and r was 0.98 (p<0.001). FIG. 11 shows the Bland-Altman plot for breathing rate detection. The differences between the breathing rate measured by the non-contact method described herein and the breathing rate measured by a Zephyr device (y-axis) were plotted against the average breathing rate measured by the two methods (x-axis). The mean difference was 0.02 breaths/minute (min.) with 95% limits of agreement (±1.96 standard deviations) at −2.40 breaths/min. and 2.45 breaths/min. RMSE was 1.20 breaths/min. and r was 0.93 (p<0.001).
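  • The Bland-Altman statistics quoted above reduce to a few array operations; the following illustrative Python helper (name and return format assumed) computes the bias, 95% limits of agreement, and RMSE from paired measurements.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias, 95% limits of agreement, and RMSE for two paired methods."""
    a = np.asarray(method_a, float)
    b = np.asarray(method_b, float)
    diff = a - b                      # y-axis of the Bland-Altman plot
    mean = (a + b) / 2.0              # x-axis of the Bland-Altman plot
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    rmse = np.sqrt(np.mean(diff ** 2))
    return mean, diff, bias, (bias - half_width, bias + half_width), rmse
```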
  • Both the method described herein and the reference technologies can introduce error into the test results. For the statistical analyses, p<0.05 is considered to indicate a significant correlation between two compared methods, so the overall error rates were acceptable. A pilot study was also conducted for the PTT difference calculation. Ten tests were taken from 4 subjects. The average PTT difference between the mouth area and the palm areas was about 30-40 ms, as shown in Table II. Table II shows the PTT difference estimation results among the four subjects. The values were calculated based on the linear curve fitting method. The average PTT difference was about 35 ms from the left palm area to the mouth area and 37 ms from the right palm area to the mouth area.
  • TABLE II
    Test No.   | Gender | PTT Difference from Left Palm to Mouth (ms) | PTT Difference from Right Palm to Mouth (ms)
    1          | Female | 29.75                                       | 31.07
    2          | Female | 35.02                                       | 32.06
    3          | Male   | 32.96                                       | 41.67
    4          | Male   | 42.03                                       | 43.29
    Average    |        | 34.94                                       | 37.02
    SD/Average |        | 15%                                         | 17%
  • The embodiments described herein demonstrate exemplary optical imaging-based methods for non-contact monitoring of physiological signals, including, for example, breathing frequency, exhalation flow rate, heart rate, and pulse transit time, by detecting facial color changes associated with blood circulation and body movement associated with breathing. By implementing differential detection and motion-tracking algorithms, the breathing frequency and the exhalation volume flow rate are accurately tracked, and the tracking is robust against moderate body movements unrelated to breathing. The physiological signals measured by the imaging methods are in excellent agreement with those obtained using reference technologies. As demonstrated herein, the difference in pulse transit time can be determined by the non-contact imaging method, and the results are comparable to the related values reported in the literature. Furthermore, the results of a small-scale pilot study involving participants of different ethnic profiles, sexes, and ages demonstrate the basic principle of the optical imaging methods.
  • The described system and methods are not limited to the specific embodiments described herein. In addition, components of each system and/or steps of each method may be practiced independent and separate from other components and method steps, respectively, described herein. Each component and method also can be used in combination with other systems and methods.
  • The foregoing description of embodiments and examples has been presented for purposes of illustration and description. It is not intended to be exhaustive or limiting to the forms described. Numerous modifications are possible in light of the above teachings. Some of those modifications have been discussed and others will be understood by those skilled in the art. The embodiments were chosen and described for illustration of various embodiments. The scope is, of course, not limited to the examples or embodiments set forth herein, but can be employed in any number of applications and equivalent devices by those of ordinary skill in the art. Rather, it is hereby intended the scope be defined by the claims appended hereto. Additionally, the features of various implementing embodiments may be combined to form further embodiments.

Claims (20)

What is claimed is:
1. A system for monitoring one or more physiological parameters of a subject under free-living conditions, the system comprising:
a camera configured to capture and record a video sequence including at least one image frame of at least one region of interest (ROI) of the subject's body; and
a computer in signal communication with the camera to receive signals transmitted by the camera representative of the video sequence, the computer including a processor configured to process the signals associated with the video sequence recorded by the camera, and a display configured to display data associated with the signals.
2. The system of claim 1 wherein the computer is external to the camera.
3. The system of claim 1 wherein the system is configured to monitor one or more of the following physiological parameters: a heart beat, a heart rate (HR), a breathing pattern, a breathing amplitude, a breathing frequency (BF), an exhalation flow rate and/or a pulse transit time (PTT).
4. The system of claim 3 wherein the HR and the PTT are detected by tracking an image intensity change of the subject's skin.
5. The system of claim 3 wherein the BF and the exhalation flow rate are detected by tracking subtle body movements of the subject associated with breathing.
6. The system of claim 1 wherein the processor is configured to:
select the at least one ROI of the subject's body;
detect body movement of the at least one ROI in the video sequence; and
determine a breathing pattern based on the detected body movement.
7. The system of claim 6 wherein the processor is configured to:
select a region of pixels around an edge of each shoulder of the subject to be the regions of interest (ROIs);
determine a derivative of the ROIs along a vertical direction to obtain two differential images of the ROIs;
divide the differential image of each selected ROI of the ROIs into a top portion and an equal bottom portion along the edge of a respective shoulder to define an intensity of the top portion as dA, and an intensity of the bottom portion as dB; and
determine a vertical movement of the shoulders by:
dI = (dA − dB)/(dA + dB).
8. The system of claim 7 wherein the processor is configured to:
calculate dI for every frame of the video sequence; and
plot dI against time after applying a low-pass filter with a cut-off frequency of 2 Hz.
9. The system of claim 1 wherein the camera comprises a plurality of cameras configured to capture video sequences of a plurality of regions of interest of the subject's body.
10. The system of claim 1 wherein the camera is housed within a mobile device.
11. The system of claim 1 further comprising a light source configured to illuminate the at least one ROI of the subject's body.
12. The system of claim 1 further comprising a user interface configured to select at least one ROI and perform signal processing of data associated with the at least one ROI to determine a heart beat and a breathing signal, and display results in real time on the display.
13. The system of claim 1 wherein the camera records the video sequence of the subject's face, and the processor is configured to perform a Fast Fourier Transform (FFT) on an intensity signal averaged over a plurality of pixels in the at least one ROI, an FFT spectrum of the at least one ROI representing a heart beat signal as a peak at a frequency corresponding to the heart rate.
14. The system of claim 13 wherein the processor is configured to extract a peak amplitude in each pixel of the plurality of pixels and plot a peak amplitude on a colormap to analyze a variation of the heart beat signal in different areas of the subject's face.
15. The system of claim 1 wherein the processor is configured to:
determine a volume flow rate of the exhaled air from the breathing pattern; and
determine an energy expenditure based on the determined volume flow rate of the exhaled air.
16. A method for monitoring a breathing pattern of a subject, the method comprising:
selecting a region of pixels around an edge of each shoulder of the subject to be the regions of interest (ROIs);
determining a derivative of the ROIs along a vertical direction to obtain two differential images of the ROIs;
determining a position of each shoulder by dividing a differential image of each selected ROI into a top portion and an equal bottom portion along the edge of the shoulder, wherein an intensity of the top portion is dA and an intensity of the bottom portion is dB; and
determining a vertical movement of each shoulder for every frame of the video sequence.
17. The method of claim 16 further comprising implementing a motion-tracking algorithm to correct motion artifacts, comprising:
selecting at least one region of interest (ROI);
calculating body movement every 100 frames of the video sequence within the top portion and the bottom portion based on a shift in an x direction and a shift in a y direction;
updating each of the top portion and the bottom portion with the shift_x and the shift_y; and
plotting dI to generate a breathing curve.
18. The method of claim 17 further comprising determining an exhalation flow rate, wherein an exhaled breath volume is calculated from dI.
19. The method of claim 16 further comprising determining a pulse transit time comprising:
analyzing a time difference of a plurality of PPG signals including a first PTT associated with transit time from the subject's heart to the subject's mouth (t1), a second PTT associated with transit time from the subject's heart to the subject's left palm (t2), and a third PTT associated with transit time from the subject's heart to the subject's right palm (t3);
selecting a corresponding ROI selection for each of the first PTT, the second PTT, and the third PTT from the video sequence; and
plotting the plurality of PPG signals obtained from the ROI selection to find the time differences of the plurality of PPG signals from different regions of the subject's body in every heart cycle.
20. The method of claim 19 further comprising determining time differences in PTT among the different regions based on comparing peak locations of the plurality of PPG signals using a linear curve fitting method, comprising:
selecting a heart beat cycle signal for analysis, wherein a peak location of the selected heart beat cycle signal is estimated by fitting two linear curves L1 and L2, L1 being positioned on a rising edge of the peak location and L2 being positioned on a falling edge of the peak location;
determining an estimated peak location as a point of intersection of the two linear curves L1 and L2; and
determining time differences in PTT by comparing peak locations of the plurality of PPG signals obtained at different body locations.
US14/213,236 2013-03-14 2014-03-14 System and method for non-contact monitoring of physiological parameters Abandoned US20140276104A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/213,236 US20140276104A1 (en) 2013-03-14 2014-03-14 System and method for non-contact monitoring of physiological parameters
US15/826,224 US11363990B2 (en) 2013-03-14 2017-11-29 System and method for non-contact monitoring of physiological parameters

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361784646P 2013-03-14 2013-03-14
US14/213,236 US20140276104A1 (en) 2013-03-14 2014-03-14 System and method for non-contact monitoring of physiological parameters

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/826,224 Continuation US11363990B2 (en) 2013-03-14 2017-11-29 System and method for non-contact monitoring of physiological parameters

Publications (1)

Publication Number Publication Date
US20140276104A1 true US20140276104A1 (en) 2014-09-18

Family

ID=51530501

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/213,236 Abandoned US20140276104A1 (en) 2013-03-14 2014-03-14 System and method for non-contact monitoring of physiological parameters
US15/826,224 Active 2037-03-20 US11363990B2 (en) 2013-03-14 2017-11-29 System and method for non-contact monitoring of physiological parameters

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/826,224 Active 2037-03-20 US11363990B2 (en) 2013-03-14 2017-11-29 System and method for non-contact monitoring of physiological parameters

Country Status (1)

Country Link
US (2) US20140276104A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113676758A (en) * 2021-07-13 2021-11-19 北京奇艺世纪科技有限公司 Index generation method, client, server, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050234331A1 (en) * 2004-03-23 2005-10-20 Fuji Photo Film Co., Ltd. Method, apparatus and program for obtaining differential image
US7035432B2 (en) * 2003-07-22 2006-04-25 Ronjo Company Method of monitoring sleeping infant
US20100061596A1 (en) * 2008-09-05 2010-03-11 Varian Medical Systems Technologies, Inc. Video-Based Breathing Monitoring Without Fiducial Tracking
US20130218028A1 (en) * 2012-02-21 2013-08-22 Xerox Corporation Deriving arterial pulse transit time from a source video image
US20150320338A1 (en) * 2012-12-19 2015-11-12 Koninklijke Philips N.V. Detection of respiratory disorders

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5974340A (en) * 1997-04-29 1999-10-26 Cardiac Pacemakers, Inc. Apparatus and method for monitoring respiratory function in heart failure patients to determine efficacy of therapy
US6980679B2 (en) * 1998-10-23 2005-12-27 Varian Medical System Technologies, Inc. Method and system for monitoring breathing activity of a subject
JP3764949B2 (en) * 2003-06-09 2006-04-12 住友大阪セメント株式会社 Condition analysis device
US7785001B2 (en) 2004-05-10 2010-08-31 Arizona Board Of Regents Apparatus and method for sensing change in environmental conditions
JPWO2007052755A1 (en) * 2005-11-04 2009-04-30 株式会社東芝 Respiration monitoring device, respiratory monitoring system, medical processing system, respiratory monitoring method, respiratory monitoring program
JP2010528297A (en) 2007-05-23 2010-08-19 アリゾナ ボード オブ リージェンツ フォー アンド オン ビハーフ オブ アリゾナ ステイト ユニバーシティ System and method for integrated electrochemical detection and electrical detection
WO2009132262A1 (en) 2008-04-25 2009-10-29 Arizona Board Of Regents And On Behalf Of Arizona State University Surface impedance imaging methods and apparatuses
US10667727B2 (en) * 2008-09-05 2020-06-02 Varian Medical Systems, Inc. Systems and methods for determining a state of a patient
WO2010030874A1 (en) 2008-09-11 2010-03-18 Arizona Board Of Regents For And On Behalf Of Arizona State University Systems and methods for integrated detection
WO2010036940A2 (en) 2008-09-25 2010-04-01 Arizona Board Of Regents And On Behalf Of Arizona State University Apparatus and method for sensing change in environmental conditions
GB0822605D0 (en) * 2008-12-11 2009-01-21 Pneumacare Ltd Method and apparatus for monitoring an object
US20110144517A1 (en) * 2009-01-26 2011-06-16 Miguel Angel Cervantes Video Based Automated Detection of Respiratory Events
BRPI1011713A2 (en) 2009-06-05 2016-03-22 Univ Arizona integrated optoelectrochemical sensor for nitrogen oxides in gaseous samples
US8483458B2 (en) * 2009-09-10 2013-07-09 General Electric Company Method and system for measuring visceral fat mass using dual energy x-ray absorptiometry
US20110251493A1 (en) * 2010-03-22 2011-10-13 Massachusetts Institute Of Technology Method and system for measurement of physiological parameters
JP5976636B2 (en) * 2010-05-07 2016-08-23 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Motion compensation and patient feedback in medical imaging systems
JP5874636B2 (en) * 2010-08-27 2016-03-02 コニカミノルタ株式会社 Diagnosis support system and program
JP5742179B2 (en) * 2010-11-05 2015-07-01 ソニー株式会社 Imaging apparatus, image processing apparatus, image processing method, and program
US10143401B2 (en) 2011-06-13 2018-12-04 Arizona Board Of Regents Acting For And On Behalf Of Arizona State University Metabolic analyzer
JP6219952B2 (en) 2012-08-14 2017-10-25 ボルボ ラストバグナー アーベー How to determine the operating status of a driver
JP5807192B2 (en) * 2013-01-21 2015-11-10 パナソニックIpマネジメント株式会社 Measuring apparatus and measuring method
EP2948046B1 (en) 2013-01-22 2019-05-15 Arizona Board of Regents on behalf of Arizona State University Portable metabolic analyzer system
GB201302451D0 (en) * 2013-02-12 2013-03-27 Isis Innovation Method and system for signal analysis
US10209232B2 (en) 2014-01-02 2019-02-19 Arizona Board Of Regents On Behalf Of Arizona State University Specific, reversible, and wide-dynamic range sensor for real time detection of carbon dioxide
US10408757B2 (en) 2014-01-03 2019-09-10 Arizona Board Of Regents On Behalf Of Arizona State University Plasmonic imaging and detection of single DNA molecules
US10078795B2 (en) 2014-08-11 2018-09-18 Nongjian Tao Systems and methods for non-contact tracking and analysis of physical activity using imaging
US9909993B2 (en) 2014-12-15 2018-03-06 Arizona Board Of Regents On Behalf Of Arizona State University Label-free detection of small and large molecule interactions, and activities in biological systems
US10222372B2 (en) 2015-08-03 2019-03-05 Arizona Board Of Regents On Behalf Of Arizona State University Antibiotic susceptibility testing via plasmonic imaging and tracking
US10413226B2 (en) 2015-11-09 2019-09-17 Arizona Board Of Regents On Behalf Of Arizona State University Noncontact monitoring of blood oxygen saturation using camera
WO2017156084A2 (en) 2016-03-11 2017-09-14 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for non-contact monitoring of ballistocardiogram, photoplethysmogram, blood pressure and abnormal heart rhythm
WO2018057753A1 (en) 2016-09-21 2018-03-29 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for computer monitoring of remote photoplethysmography based on chromaticity in a converted color space
US11576590B2 (en) 2017-03-13 2023-02-14 Arizona Board Of Regents On Behalf Of Arizona State University Imaging-based spirometry systems and methods
US11293875B2 (en) 2017-09-27 2022-04-05 Arizona Board Of Regents On Behalf Of Arizona State University Method and apparatus for continuous gas monitoring using micro-colorimetric sensing and optical tracking of color spatial distribution
WO2019136097A1 (en) 2018-01-02 2019-07-11 Arizona Board Of Regents On Behalf Of Arizona State University Method and system for assessing metabolic rate and maintaining indoor air quality and efficient ventilation energy use with passive environmental sensors


Cited By (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170173390A1 (en) * 2004-02-06 2017-06-22 Q-Tec Systems Llc Method and apparatus for exercise monitoring combining exercise monitoring and visual data with wireless wearable devices
US20170119304A1 (en) * 2014-06-06 2017-05-04 Koninklijke Philips N.V. Device, system and method for detecting apnoea of a subject
US10524725B2 (en) * 2014-06-06 2020-01-07 Koninklijke Philips N.V. Device, system and method for detecting apnoea of a subject
US20150367780A1 (en) * 2014-06-20 2015-12-24 Robert Bosch Gmbh Method for ascertaining the heart rate of the driver of a vehicle
US10043074B2 (en) * 2014-06-20 2018-08-07 Robert Bosch Gmbh Method for ascertaining the heart rate of the driver of a vehicle
US10078795B2 (en) * 2014-08-11 2018-09-18 Nongjian Tao Systems and methods for non-contact tracking and analysis of physical activity using imaging
US10740650B2 (en) * 2014-08-11 2020-08-11 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for non-contact tracking and analysis of exercise
US20160042529A1 (en) * 2014-08-11 2016-02-11 Nongjian Tao Systems and Methods for Non-Contact Tracking and Analysis of Physical Activity
US20190325257A1 (en) * 2014-08-11 2019-10-24 Nongjian Tao Systems and Methods for Non-Contact Tracking and Analysis of Physical Activity Using Imaging
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US11221682B2 (en) 2014-08-22 2022-01-11 Google Llc Occluded gesture recognition
US10936081B2 (en) 2014-08-22 2021-03-02 Google Llc Occluded gesture recognition
US11816101B2 (en) 2014-08-22 2023-11-14 Google Llc Radar recognition-aided search
US11163371B2 (en) 2014-10-02 2021-11-02 Google Llc Non-line-of-sight radar-based gesture recognition
JP2017530815A (en) * 2014-10-13 2017-10-19 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Device and method for detecting vital sign information of a subject
JP2016082482A (en) * 2014-10-20 2016-05-16 シャープ株式会社 Image recorder
CN105698918A (en) * 2014-11-24 2016-06-22 广州汽车集团股份有限公司 Method and device for visually comparing vibration noise colormaps
WO2016094749A1 (en) * 2014-12-11 2016-06-16 Rdi, Llc Method of analyzing, displaying, organizing and responding to vital signals
US20160171168A1 (en) * 2014-12-12 2016-06-16 Optum, Inc. Computer readable storage media for remote patient management and methods and systems for utilizing same
US20170061637A1 (en) * 2015-02-11 2017-03-02 Sandia Corporation Object detection and tracking system
US9665942B2 (en) * 2015-02-11 2017-05-30 Sandia Corporation Object detection and tracking system
US11219412B2 (en) 2015-03-23 2022-01-11 Google Llc In-ear health monitoring
US20160338590A1 (en) * 2015-05-20 2016-11-24 Comprehensive Telemedicine Multipurpose Diagnostic Examination Apparatus And System
US10058247B2 (en) * 2015-05-20 2018-08-28 Comprehensive Telemedicine Multipurpose diagnostic examination apparatus and system
US20170007137A1 (en) * 2015-07-07 2017-01-12 Research And Business Foundation Sungkyunkwan University Method of estimating blood pressure based on image
US9795306B2 (en) * 2015-07-07 2017-10-24 Research & Business Foundation Sungkyunkwan University Method of estimating blood pressure based on image
US10398328B2 (en) 2015-08-25 2019-09-03 Koninklijke Philips N.V. Device and system for monitoring of pulse-related information of a subject
JP2017055949A (en) * 2015-09-16 2017-03-23 シャープ株式会社 Measurement apparatus, measurement system, measurement method, and computer program
US11771404B2 (en) 2015-10-01 2023-10-03 Fujifilm Corporation Acoustic wave diagnostic apparatus and control method thereof
US11497477B2 (en) * 2015-10-01 2022-11-15 Fujifilm Corporation Acoustic wave diagnostic apparatus and control method thereof
US11693092B2 (en) 2015-10-06 2023-07-04 Google Llc Gesture recognition using multiple antenna
US11592909B2 (en) 2015-10-06 2023-02-28 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US11481040B2 (en) 2015-10-06 2022-10-25 Google Llc User-customizable machine-learning in radar-based gesture detection
US11698438B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US11698439B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US20170127988A1 (en) * 2015-11-09 2017-05-11 Arizona Board Of Regents On Behalf Of Arizona State University Noncontact monitoring of blood oxygen saturation using camera
US10413226B2 (en) * 2015-11-09 2019-09-17 Arizona Board Of Regents On Behalf Of Arizona State University Noncontact monitoring of blood oxygen saturation using camera
US11020030B2 (en) 2015-11-09 2021-06-01 Arizona Board Of Regents On Behalf Of Arizona State University Noncontact monitoring of blood oxygen saturation, using camera
US20170196467A1 (en) * 2016-01-07 2017-07-13 Panasonic Intellectual Property Management Co., Ltd. Biological information measuring device including light source, light detector, and control circuit
US10799129B2 (en) * 2016-01-07 2020-10-13 Panasonic Intellectual Property Management Co., Ltd. Biological information measuring device including light source, light detector, and control circuit
WO2017125744A1 (en) * 2016-01-21 2017-07-27 Oxehealth Limited Method and apparatus for estimating breathing rate
US10796140B2 (en) 2016-01-21 2020-10-06 Oxehealth Limited Method and apparatus for health and safety monitoring of a subject in a room
US10806354B2 (en) 2016-01-21 2020-10-20 Oxehealth Limited Method and apparatus for estimating heart rate
US10952683B2 (en) 2016-01-21 2021-03-23 Oxehealth Limited Method and apparatus for estimating breathing rate
US10779771B2 (en) 2016-01-22 2020-09-22 Oxehealth Limited Signal processing method and apparatus
US20220257143A1 (en) * 2016-02-19 2022-08-18 Covidien Lp Systems and methods for video-based monitoring of vital signs
US11684287B2 (en) 2016-02-19 2023-06-27 Covidien Lp System and methods for video-based monitoring of vital signs
US11350850B2 (en) * 2016-02-19 2022-06-07 Covidien, LP Systems and methods for video-based monitoring of vital signs
US20170238842A1 (en) * 2016-02-19 2017-08-24 Covidien Lp Systems and methods for video-based monitoring of vital signs
US10667723B2 (en) * 2016-02-19 2020-06-02 Covidien Lp Systems and methods for video-based monitoring of vital signs
US11317828B2 (en) 2016-02-19 2022-05-03 Covidien Lp System and methods for video-based monitoring of vital signs
US10702188B2 (en) 2016-02-19 2020-07-07 Covidien Lp System and methods for video-based monitoring of vital signs
US11045095B2 (en) 2016-03-11 2021-06-29 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for non-contact monitoring of ballistocardiogram, photoplethysmogram, blood pressure and abnormal heart rhythm
WO2017163248A1 (en) * 2016-03-22 2017-09-28 Multisense Bv System and methods for authenticating vital sign measurements for biometrics detection using photoplethysmography via remote sensors
US10335045B2 (en) 2016-06-24 2019-07-02 Universita Degli Studi Di Trento Self-adaptive matrix completion for heart rate estimation from face videos under realistic conditions
US11412943B2 (en) 2016-07-16 2022-08-16 Olesya Chornoguz Methods and systems for obtaining physiologic information
US11182910B2 (en) 2016-09-19 2021-11-23 Oxehealth Limited Method and apparatus for image processing
US11229372B2 (en) 2016-09-21 2022-01-25 Arizona Board of Regents on Behalf of Arizona State of University Systems and methods for computer monitoring of remote photoplethysmography based on chromaticity in a converted color space
US10885349B2 (en) 2016-11-08 2021-01-05 Oxehealth Limited Method and apparatus for image processing
CN106580301A (en) * 2016-12-21 2017-04-26 广州心与潮信息科技有限公司 Physiological parameter monitoring method, device and hand-held device
US11576590B2 (en) 2017-03-13 2023-02-14 Arizona Board Of Regents On Behalf Of Arizona State University Imaging-based spirometry systems and methods
KR101978900B1 (en) 2017-04-11 2019-05-15 성균관대학교산학협력단 Apparatus, and method for examining pulmonary function using face image, and computer readable recording medium for recoring the same
KR20180114596A (en) * 2017-04-11 2018-10-19 성균관대학교산학협력단 Apparatus, and method for examining pulmonary function using face image, and computer readable recording medium for recoring the same
US10748016B2 (en) 2017-04-24 2020-08-18 Oxehealth Limited In-vehicle monitoring
US20180332035A1 (en) * 2017-05-15 2018-11-15 Otis Elevator Company Mobile device with continuous user authentication
US11198897B2 (en) 2017-07-25 2021-12-14 Arizona Board Of Regents On Behalf Of Arizona State University Rapid antibiotic susceptibility testing by tracking sub-micron scale motion of single bacterial cells
US10947576B2 (en) 2017-07-25 2021-03-16 Arizona Board Of Regents On Behalf Of Arizona State University Rapid antibiotic susceptibility testing by tracking sub-micron scale motion of single bacterial cells
EP3609395A4 (en) * 2017-08-01 2020-04-29 Samsung Electronics Co., Ltd. Electronic device for determining biometric information and method of operating same
CN110650678A (en) * 2017-08-01 2020-01-03 三星电子株式会社 Electronic device for determining biometric information and method of operation thereof
US11147474B2 (en) * 2017-08-25 2021-10-19 Baidu Online Network Technology (Beijing) Co., Ltd. Living body detecting method and apparatus, device and computer storage medium
US11293875B2 (en) 2017-09-27 2022-04-05 Arizona Board Of Regents On Behalf Of Arizona State University Method and apparatus for continuous gas monitoring using micro-colorimetric sensing and optical tracking of color spatial distribution
US10939824B2 (en) 2017-11-13 2021-03-09 Covidien Lp Systems and methods for video-based monitoring of a patient
US11937900B2 (en) 2017-11-13 2024-03-26 Covidien Lp Systems and methods for video-based monitoring of a patient
US11311202B2 (en) 2017-11-14 2022-04-26 Arizona Board Of Regents On Behalf Of Arizona State University Robust real-time heart rate monitoring method based on heartbeat harmonics using small-scale radar
US11712176B2 (en) 2018-01-08 2023-08-01 Covidien Lp Systems and methods for video-based non-contact tidal volume monitoring
CN111712187A (en) * 2018-02-13 2020-09-25 松下知识产权经营株式会社 Biological information display device, biological information display method, and program
US10909678B2 (en) 2018-03-05 2021-02-02 Oxehealth Limited Method and apparatus for monitoring of a human or animal subject
US11510584B2 (en) 2018-06-15 2022-11-29 Covidien Lp Systems and methods for video-based patient monitoring during surgery
US11547313B2 (en) 2018-06-15 2023-01-10 Covidien Lp Systems and methods for video-based patient monitoring during surgery
US11311252B2 (en) 2018-08-09 2022-04-26 Covidien Lp Video-based patient monitoring systems and associated methods for detecting and monitoring breathing
US11617520B2 (en) 2018-12-14 2023-04-04 Covidien Lp Depth sensing visualization modes for non-contact monitoring
US11403754B2 (en) 2019-01-02 2022-08-02 Oxehealth Limited Method and apparatus for monitoring of a human or animal subject
US11690536B2 (en) 2019-01-02 2023-07-04 Oxehealth Limited Method and apparatus for monitoring of a human or animal subject
US11563920B2 (en) 2019-01-02 2023-01-24 Oxehealth Limited Method and apparatus for monitoring of a human or animal subject
US11776146B2 (en) 2019-01-28 2023-10-03 Covidien Lp Edge handling methods for associated depth sensing camera devices, systems, and methods
US11315275B2 (en) 2019-01-28 2022-04-26 Covidien Lp Edge handling methods for associated depth sensing camera devices, systems, and methods
US11783483B2 (en) 2019-03-19 2023-10-10 Arizona Board Of Regents On Behalf Of Arizona State University Detecting abnormalities in vital signs of subjects of videos
US11771380B2 (en) 2019-03-19 2023-10-03 Arizona Board Of Regents On Behalf Of Arizona State University Vital sign monitoring system using an optical sensor
US11543345B2 (en) 2019-04-25 2023-01-03 Arizona Board Of Regents On Behalf Of Arizona State University Chemical complementary metal-oxide semiconductor (CCMOS) colorimetric sensors for multiplex detection and analysis
US11484208B2 (en) 2020-01-31 2022-11-01 Covidien Lp Attached sensor activation of additionally-streamed physiological parameters from non-contact monitoring systems and associated devices, systems, and methods
US20210393148A1 (en) * 2020-06-18 2021-12-23 Rockwell Collins, Inc. Physiological state screening system
CN112053387A (en) * 2020-07-24 2020-12-08 贵阳像树岭科技有限公司 Non-contact respiration monitoring method based on visual computing
WO2024049973A1 (en) * 2022-09-02 2024-03-07 Board Of Regents Of The University Of Nebraska Systems and methods for determination of pulse arrival time with wearable electronic devices
CN116740621A (en) * 2023-08-14 2023-09-12 中国科学院长春光学精密机械与物理研究所 Non-contact respiration detection method, equipment and medium

Also Published As

Publication number Publication date
US20180140255A1 (en) 2018-05-24
US11363990B2 (en) 2022-06-21

Similar Documents

Publication Publication Date Title
US11363990B2 (en) System and method for non-contact monitoring of physiological parameters
Shao et al. Noncontact monitoring breathing pattern, exhalation flow rate and pulse transit time
JP6727599B2 (en) Biometric information display device, biometric information display method, and biometric information display program
US11350850B2 (en) Systems and methods for video-based monitoring of vital signs
US11771381B2 (en) Device, system and method for measuring and processing physiological signals of a subject
Jeong et al. Introducing contactless blood pressure assessment using a high speed video camera
EP3073905B1 (en) Device and method for obtaining pulse transit time and/or pulse wave velocity information of a subject
US20170202505A1 (en) Unobtrusive skin tissue hydration determining device and related method
Molinaro et al. Contactless vital signs monitoring from videos recorded with digital cameras: An overview
JP6620999B2 (en) Biological information measuring device, biological information measuring program, and biological information measuring method
Bosi et al. Real-time monitoring of heart rate by processing of Microsoft Kinect™ 2.0 generated streams
Sinhal et al. Estimating vital signs through non-contact video-based approaches: A survey
US20220287592A1 (en) Behavior task evaluation system and behavior task evaluation method
Chon et al. Wearable Wireless Sensor for Multi-Scale Physiological Monitoring
US20200155008A1 (en) Biological information detecting apparatus and biological information detecting method
US11324406B1 (en) Contactless photoplethysmography for physiological parameter measurement
Shao Monitoring Physiological Signals Using Camera
King Heart Rate Estimation by Video-Based Reflectance Photoplethysmography
Berggren et al. Non-contact measurement of heart rate using a camera
Krishnamoorthy et al. Channel Intensity and Edge-Based Estimation of Heart Rate via Smartphone Recordings
Ali et al. Contactless Real-Time Vital Signs Monitoring Using a Webcam
Chu Remote Vital Signs Monitoring with Depth Cameras
Nikolic-Popovic Heart Rate and Heart Rate Variability Estimation in the Presence of Motion Artifacts
Kurylyak et al. Environmental and physiological parameters measurement in images and video

Legal Events

Date Code Title Description
AS Assignment

Owner name: ARIZONA BOARD OF REGENTS, A BODY CORPORATE OF THE STATE OF ARIZONA, ACTING FOR AND ON BEHALF OF ARIZONA STATE UNIVERSITY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAO, NONGJIAN;SHAO, DANGDANG;REEL/FRAME:032734/0866

Effective date: 20140414

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION