WO2020171554A1 - Method and apparatus for measuring body temperature using a camera - Google Patents


Info

Publication number
WO2020171554A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
features
body temperature
channel
camera
Prior art date
Application number
PCT/KR2020/002334
Other languages
French (fr)
Inventor
Poonam THAPAR
Vijay Narayan Tiwari
Ajay Kumar Jaiswal
Ashish Goyal
Harshit AGRAWAL
Tushar SIRCAR
Sandesh Raghunath SHETTY
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2020171554A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02416 Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022 Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
    • G01J5/0025 Living bodies
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/02 Constructional details
    • G01J5/026 Control of working procedures of a pyrometer, other than calibration; Bandwidth calculation; Gain control
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077 Imaging

Definitions

  • Embodiments herein relate to non-contact temperature estimation, and more particularly to methods and systems for estimating body temperature based on at least one feature identified using at least one frame captured by a camera.
  • Body temperature of a subject can be measured using devices such as thermometers, which require contact between the device and the skin of the subject. Measuring body temperature using such devices may be inconvenient, as the measurement is obtained only after a certain period of time. Besides, the use of devices requiring skin contact for temperature measurement can lead to the spread of infections.
  • products/solutions that do not require contact with the skin for measuring body temperature, and that can provide the measurement instantly, have therefore gained interest among users and clinicians.
  • however, such products or solutions can be expensive, sensitive, and/or bulky.
  • moreover, such products/solutions are likely to provide a temperature measurement only when situated within a certain range close enough to the subject, and are prevalent as standalone devices.
  • in an example, thermometers can measure the temperature of a subject only if the thermometer is situated at a distance of less than 5 cm from the subject. If the distance increases, the accuracy can be severely affected.
  • FLIR thermal imaging cameras (operating in the range of 3000nm-12000nm and sensitive to human body radiations) can be used for estimating non-contact body temperature.
  • FLIR Forward Looking Infrared
  • the principal object of the embodiments herein is to disclose methods and systems for measuring the body temperature of a subject using a Red Green Blue (RGB) and/or a Near Infrared (NIR) camera, wherein the body temperature of the subject can be measured automatically, in real time, and without requiring any contact with or proximity to the subject.
  • RGB Red Green Blue
  • NIR Near Infrared
  • Another object of the embodiments herein is to utilize at least one feature, which can be identified from at least one frame captured using the RGB and/or NIR camera, and signal processing methods, to estimate the body temperature of the subject.
  • the embodiments provide methods and systems for measuring body temperature of a subject using at least one of a Red Green Blue (RGB) and a Near Infrared (NIR) camera.
  • the RGB and the NIR camera can be included in a device.
  • the embodiments include automatically measuring the body temperature of the subject, in real time, without requiring any contact with or proximity to the subject.
  • an embodiment herein discloses a method for measuring body temperature of a subject.
  • the method comprises: performing, by at least one processor of a device, preprocessing of a plurality of frames of media captured by at least one camera in the device; extracting, by the at least one processor, a plurality of features, comprising photoplethysmography (PPG) features and statistical features, from at least one of an R channel, a G channel, a B channel, and an IR channel in each of the plurality of frames, wherein the plurality of features are extracted from within regions of interest (ROIs) in each of the plurality of frames; and measuring, by the at least one processor, the body temperature through at least one machine learning (ML) based classifier and at least one ML based regression model, based on the plurality of features.
  • an embodiment herein discloses a device for measuring body temperature of a subject.
  • the device comprises a memory, and at least one processor coupled to the memory.
  • the memory stores instructions that, when executed by the at least one processor, cause the at least one processor to: perform preprocessing of a plurality of frames of media captured by at least one camera; extract a plurality of features, comprising imaging photoplethysmography (PPG) features and statistical features, from at least one of an R channel, a G channel, a B channel, and an IR channel in each of the plurality of frames, wherein the plurality of features are extracted from within regions of interest (ROIs) in each of the plurality of frames; and measure the body temperature through at least one machine learning (ML) based classifier and at least one ML based regression model, based on the plurality of features.
  • an embodiment herein discloses a method for estimating body temperature of a subject.
  • the method comprises: capturing, by at least one processor of a device, media of the subject using at least one of a red green blue (RGB) camera and a near infrared (NIR) camera of the device; estimating, by the at least one processor, a heart rate of the subject based on a frequency domain analysis of a plurality of frames of the captured media; extracting, by the at least one processor, a plurality of features captured by at least one of the RGB camera and the NIR camera from a facial region of the subject in the captured media; and estimating, by the at least one processor, the body temperature of the subject by analyzing the plurality of features and the estimated heart rate of the subject.
  • an embodiment herein discloses a device for estimating body temperature of a subject.
  • the device comprises a memory, and at least one processor coupled to the memory.
  • the memory stores instructions that, when executed by the at least one processor, cause the at least one processor to: capture media of the subject using at least one of a red green blue (RGB) camera and a near infrared (NIR) camera of the device; estimate a heart rate of the subject based on a frequency domain analysis of a plurality of frames of the captured media; extract a plurality of features captured by at least one of the RGB camera and the NIR camera from a facial region of the subject in the captured media; and estimate the body temperature of the subject by analysis of the plurality of features and the estimated heart rate of the subject.
  • features such as statistical features, Gaussian filtering based features, and time-domain and frequency domain Photoplethysmography (PPG) features, can be identified using the captured RGB and/or NIR frames.
  • the embodiments include utilizing a Machine Learning (ML) based temperature classification model to classify the temperature into at least three classes, viz., hot, cold, and normal, based on the identified features.
  • the embodiments include utilizing an ML regression based temperature estimation model to ascertain an exact measurement of the body temperature of the subject, based on the temperature class output by the aforementioned ML based classification model.
  • ML Machine Learning
  • the embodiments include ascertaining whether the measurement is influenced by conditions such as physical activity, exposure to heat, and so on. If the measurement is influenced by such conditions, the embodiments include measuring the body temperature at periodic intervals. The embodiments include notifying the subject of the measured body temperature. The embodiments include transmitting the measured temperature to a remote device.
  • FIG. 1 depicts various units of a device for measuring body temperature of a subject using a Red Green Blue (RGB) and/or Near Infrared (NIR) camera, according to embodiments as disclosed herein;
  • FIG. 2 is a flowchart depicting a method for measuring body temperature of the subject using the RGB and/or NIR camera, according to embodiments as disclosed herein;
  • FIGS. 3a, 3b and 3c depict classification of the body temperature based on blood pulsation at heart beat frequency, according to embodiments as disclosed herein;
  • FIGS. 4a and 4b depict correlation between blood perfusion of a subject and distance between the subject and the device, according to embodiments as disclosed herein;
  • FIG. 5 is a graph depicting a variation of pixel values in different lighting conditions after application of Gaussian filtering, according to embodiments as disclosed herein.
  • Embodiments herein disclose methods and systems for measuring body temperature of a subject using at least one of a Red Green Blue (RGB) and a Near Infrared (NIR) camera.
  • the embodiments include performing a preprocessing of a plurality of frames of a media captured by at least one of the RGB camera and the NIR camera.
  • the preprocessing can include localizing a facial region of the subject in each of the plurality of frames, wherein the localized facial regions in the frames can be considered as Regions of Interest (ROIs); and applying noise removal techniques, such as Gaussian filtering, on the ROIs in each of the plurality of frames to smoothen the edges of blocks in the ROIs.
  • ROIs Regions of Interest
  • the embodiments include extracting a plurality of features comprising statistical features, and time domain and frequency domain based Photoplethysmography (PPG) features.
  • the features can be extracted from a single one, or any combination, of an R channel, a G channel, a B channel, and an IR channel in each of the ROIs, in each of the plurality of frames.
  • the embodiments include measuring the body temperature through classifiers and regression models based on the extracted plurality of features.
  • referring now to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
  • FIG. 1 depicts various units of a device 100 configured to measure body temperature of a subject using an RGB and/or NIR camera, according to embodiments as disclosed herein.
  • the device 100 comprises a camera unit 101, a preprocessing unit 102, a feature extraction unit 103, a temperature estimation unit 104, a monitoring unit 105, and a transmitter/receiver unit 106.
  • Examples of the device 100 can be, but are not limited to, a smart phone, a smart television, an Internet of Things (IoT) device, a smart robot, a wearable computing device, or any other device capable of capturing media frames.
  • the device 100 includes at least one of the RGB camera and the NIR camera, capable of capturing media such as images and videos.
  • the device 100 does not require contact with the subject to measure the body temperature of the subject.
  • the device 100 further comprises a memory and at least one processor coupled to the memory.
  • the memory stores instructions to be executed by the at least one processor.
  • the at least one processor is configured to execute instructions stored in the memory and to perform various other processes. For example, at least one of the camera unit 101, the preprocessing unit 102, the feature extraction unit 103, the temperature estimation unit 104, the monitoring unit 105 and the transmitter/receiver unit 106 may be controlled by the at least one processor, when the instructions are executed by the processor.
  • the camera unit 101 can comprise at least one of an RGB camera and an NIR camera.
  • the camera unit 101 can capture media of the subject, wherein the media comprises at least one frame.
  • the capturing can be triggered upon detecting at least one predefined event.
  • the capturing can be triggered at specific times based on preconfigured commands provided to the camera unit 101 by the subject or other users.
  • the subject can capture media, which can be used to measure the body temperature.
  • in an example, consider that the subject has a fever.
  • the subject or other users may configure the camera unit 101 to capture media of the subject at specific times for periodically measuring the body temperature of the subject.
  • in another example, consider that the subject has a fever.
  • the subject or other users may configure the camera unit 101 to capture media of the subject on detecting that the subject has stopped moving.
  • the preprocessing unit 102 can extract a plurality of frames from the captured media. Each of the plurality of frames can be processed sequentially or in parallel. Consider that the frames are processed in sequence.
  • the preprocessing unit 102 can perform facial detection followed by skin detection. The facial detection can be refined in order to remove regions that cover (a portion of) the facial skin, such as hair or cloth.
  • the facial region in each of the plurality of frames can be considered as a Region of Interest (ROI).
  • the preprocessing unit 102 can, for each of the plurality of frames, localize a facial region using a Haar cascade based face detector model or any other face detector, interpolating from the first frame.
  • the preprocessing unit 102 can detect key facial structures, i.e., facial landmarks, on the localized facial region. In an embodiment, the preprocessing unit 102 can use the facial landmarks to capture different parts of the face such as forehead, left cheek, right cheek, left chin, right chin, and so on.
  • the preprocessing unit 102 can perform de-channeling.
  • the ROI pixels in each of the plurality of frames can be segregated into an R (Red) channel, a G (Green) channel, a B (Blue) channel, and/or an Infrared (IR) channel.
  • the preprocessing unit 102 can extract signals from imaging data in each of the R, G, B, and/or IR channels.
  • the preprocessing unit 102 can subject the signals in the different channels to de-noising, motion reduction, and filtering to obtain clean versions of the signals.
  • low pass filtering can be applied to each signal (signals extracted from each of the channels in each ROI) to remove Gaussian noise in each signal using Gaussian kernels.
  • the preprocessing unit 102 can convert the signals belonging to the channels of each ROI into frequency domain, to obtain a frequency/power spectrum, by performing a Fast Fourier Transform (FFT). Thereafter, based on the frequency/power spectrum, the high frequency components can be removed (filtered).
  • the preprocessing unit 102 can perform peak detection in the frequency domain to determine the heart rate (number of heart beats per minute) and the breathing rate (number of breaths taken by the subject in a minute) of the subject.
  • a first peak located at low frequency-end of the spectrum can represent the breathing frequency.
  • the breathing frequency, multiplied by 60, gives the breathing rate in breaths per minute.
  • a second peak (with a frequency higher than that at which the first peak is located) can represent the frequency of heart beat.
  • the frequency of heart beat, multiplied by 60, gives the heart rate in beats per minute.
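  • As a rough illustration of the peak-detection step above, the following sketch (a minimal example, not the patented implementation; the function name `estimate_rates`, the band limits, and the synthetic trace are all assumptions) picks the strongest spectral peaks in typical breathing and heart-rate bands of a mean-pixel trace:

```python
import numpy as np

def estimate_rates(signal, fs):
    """Estimate breathing and heart rates (cycles per minute) from a
    mean-pixel trace via FFT peak picking."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Breathing: strongest peak in a typical 0.1-0.5 Hz band (assumed band).
    br_band = (freqs >= 0.1) & (freqs <= 0.5)
    f_breath = freqs[br_band][np.argmax(spectrum[br_band])]
    # Heart beat: strongest peak in a typical 0.7-3.0 Hz band (assumed band).
    hr_band = (freqs >= 0.7) & (freqs <= 3.0)
    f_heart = freqs[hr_band][np.argmax(spectrum[hr_band])]
    return f_breath * 60.0, f_heart * 60.0

# Synthetic trace: 0.25 Hz breathing + 1.2 Hz pulse, 30 fps for 20 s.
fs = 30.0
t = np.arange(0, 20, 1 / fs)
trace = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.sin(2 * np.pi * 1.2 * t)
br, hr = estimate_rates(trace, fs)
print(round(br), round(hr))  # 15 breaths/min, 72 beats/min
```

  • Restricting each search to a plausible physiological band is a common way to keep the two peaks from being confused when their amplitudes are similar.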
  • the preprocessing unit 102 can convert the signals in the frequency domain into the time domain by performing an Inverse Fast Fourier Transform (IFFT). Thereafter, an average value of pixels (average of the values of the R, G, B, and IR channels) of each ROI (in each frame) is computed.
  • IFFT Inverse Fast Fourier Transform
  • the feature extraction unit 103 can extract a plurality of features.
  • the feature extraction unit 103 can capture different parts of the face, such as the forehead, left cheek, right cheek, left chin, and right chin, for each of the R, G, B, and/or IR channels of each ROI. It is to be noted that the signals that were initially extracted from the R, G, B, and/or IR channels, and subjected to de-noising and filtering (preprocessing), have been converted back to obtain imaging data.
  • the feature extraction unit 103 can capture different parts of the face using the imaging data in each of the R, G, B, and/or IR channels.
  • the feature extraction unit 103 can eliminate a predefined number of frames (each of the frames including the ROIs) at the beginning and end of the media, which might be affected by adjustment or movement errors of the subject.
  • the feature extraction unit 103 can extract statistical features in each of the different parts of the facial region in each of the R, G, B, and/or IR channels of each ROI. Examples of the statistical features can be, but are not limited to, mean value, median value, skewness value, Kurtosis index value, 10-percentile value, 25-percentile value, 75-percentile value, 40-percentile value, 60-percentile value, 90-percentile value, standard deviation value, minimum value, maximum value, and mode.
  • the feature extraction unit 103 can determine a plurality of statistical features in each of the different parts of the facial region. As five parts of the facial region, viz., the forehead, left cheek, right cheek, left chin, and right chin regions, are captured, the feature extraction unit 103 can extract a plurality of statistical features for each of the R, G, B, and/or IR channels in each ROI.
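  • The per-part statistical features listed above can be sketched as follows (the helper name `statistical_features` is an assumption; the mode is omitted here for brevity):

```python
import numpy as np

def statistical_features(pixels):
    """Compute statistical features for one facial part of one channel
    (pixels: 1-D array of channel values within that part)."""
    p = np.asarray(pixels, dtype=float)
    mean, std = p.mean(), p.std()
    # Skewness and kurtosis from standardized central moments.
    skew = ((p - mean) ** 3).mean() / (std ** 3 + 1e-12)
    kurt = ((p - mean) ** 4).mean() / (std ** 4 + 1e-12)
    feats = [mean, np.median(p), skew, kurt, std, p.min(), p.max()]
    feats += list(np.percentile(p, [10, 25, 40, 60, 75, 90]))
    return feats

rng = np.random.default_rng(0)
part = rng.integers(0, 256, size=500)  # e.g. G-channel pixels of the forehead
vec = statistical_features(part)
print(len(vec))  # 13 features for this part/channel
```

  • With five facial parts and up to four channels per ROI, such a helper would be called once per part per channel and the outputs concatenated.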
  • the feature extraction unit 103 can segment the imaging data in each of the R, G, B, and/or IR channels in each preprocessed ROI into a plurality of blocks.
  • the imaging data in each of the R, G, B, and/or IR channels, of each of the preprocessed ROIs can be segmented into 9 blocks.
  • Each of the blocks can be divided into a plurality of grids, wherein each grid is of a predefined size. In an example, consider that each grid is of size 10x10.
  • the feature extraction unit 103 can perform Gaussian filtering to smoothen the edges of each of the grids (blocks).
  • the feature extraction unit 103 can determine statistical features in each of the grids once the Gaussian filtering is performed.
  • Examples of the statistical features can be, but are not limited to, mean value, median value, skewness value, Kurtosis index value, standard deviation value, minimum value, maximum value, and mode.
  • a plurality of features is extracted from each of the R, G, B, and/or IR channels of each of the preprocessed ROIs.
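  • The block/grid segmentation with Gaussian smoothing described above might look like the following sketch (the function name, default sizes, and the per-grid mean/std summary are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def grid_features(channel, n_blocks=(3, 3), grid=10, sigma=1.0):
    """Split one channel of a preprocessed ROI into 3x3 blocks, smooth each
    block with a Gaussian kernel, then compute per-grid (mean, std) features
    over 10x10 grids."""
    h, w = channel.shape
    bh, bw = h // n_blocks[0], w // n_blocks[1]
    feats = []
    for i in range(n_blocks[0]):
        for j in range(n_blocks[1]):
            block = channel[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            block = gaussian_filter(block.astype(float), sigma=sigma)
            # Walk the block in grid x grid tiles.
            for y in range(0, bh - grid + 1, grid):
                for x in range(0, bw - grid + 1, grid):
                    tile = block[y:y + grid, x:x + grid]
                    feats.append((tile.mean(), tile.std()))
    return feats

rng = np.random.default_rng(1)
roi = rng.integers(0, 256, size=(90, 90))  # one channel of a 90x90 ROI
feats = grid_features(roi)
print(len(feats))  # 9 blocks x 9 grids of 10x10 each = 81
```

  • Smoothing before the per-grid statistics suppresses pixel-level noise while preserving the slow spatial variation that the features are meant to capture.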
  • the feature extraction unit 103 can perform an FFT on each of the grids in each of the blocks in each of the R, G, B, and/or IR channels of each of the preprocessed ROIs.
  • the feature extraction unit 103 can extract features from the frequency spectrum of the grids, obtained using the FFT.
  • the features are relevant to the heart rate and the breathing rate of the subject.
  • the features can be considered as frequency domain PPG features.
  • the result of applying FFT on each grid provides values at different harmonics of a fundamental frequency of a discrete time signal obtained from grid pixel data.
  • Examples of the features can be, but are not limited to: the frequency of the peak with the highest amplitude (breathing frequency); the highest amplitude; the frequency of the peak with the second highest amplitude (heart beat frequency); the second highest amplitude; the frequencies of five higher-order harmonics of the heart beat frequency; the frequencies of five lower-order harmonics of the heart beat frequency; the amplitudes of the five higher-order and five lower-order harmonics; and the mean of the amplitudes of the first seven harmonics of the fundamental frequency of the FFT.
  • a plurality of frequency domain PPG features is extracted from each of the R, G, B, and/or IR channels of each of the preprocessed ROIs.
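  • A minimal sketch of the frequency-domain PPG feature extraction (the function name and the seven-harmonic summary follow the examples above but are illustrative; the assumption that the two strongest peaks correspond to the breathing and heart-beat frequencies is taken from the text):

```python
import numpy as np

def freq_ppg_features(grid_signal, fs):
    """Frequency-domain PPG features for one grid's mean-pixel time series:
    the two strongest spectral peaks plus harmonics of the fundamental."""
    n = len(grid_signal)
    spec = np.abs(np.fft.rfft(grid_signal - np.mean(grid_signal)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    order = np.argsort(spec[1:])[::-1] + 1  # bins sorted by amplitude, DC skipped
    f1, a1 = freqs[order[0]], spec[order[0]]  # strongest peak
    f2, a2 = freqs[order[1]], spec[order[1]]  # second strongest peak
    # Amplitudes at the first 7 harmonics of the fundamental (strongest) bin.
    fundamental_bin = order[0]
    harmonics = [spec[k * fundamental_bin] for k in range(1, 8)
                 if k * fundamental_bin < len(spec)]
    return {"f1": f1, "a1": a1, "f2": f2, "a2": a2,
            "harmonic_mean_amp": float(np.mean(harmonics))}

fs = 30.0
t = np.arange(0, 20, 1 / fs)
sig = np.sin(2 * np.pi * 1.2 * t) + 0.4 * np.sin(2 * np.pi * 0.25 * t)
f = freq_ppg_features(sig, fs)
print(round(f["f1"], 2), round(f["f2"], 2))
```

  • In practice the stronger component need not be the breathing one, which is why band-limited peak picking (as in the heart-rate step) is often preferred.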
  • the feature extraction unit 103 can perform an IFFT on each of the grids in each of the blocks to convert the signal from frequency domain to time domain.
  • the feature extraction unit 103 can extract features from time domain signals representing the grids, obtained using the IFFT.
  • the features can be considered as time domain PPG features.
  • the time domain signals can be normalized for both real and imaginary parts.
  • the features extracted from the time domain signals are the curvature index of the time domain signals, the mean of the upslope deviation of curvature points, and the mean of the downslope deviation of curvature points.
  • a plurality of time domain PPG features are extracted from each of the R, G, B, and/or IR channels of each of the preprocessed ROIs.
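  • The time-domain PPG features are named but not defined in the text, so the sketch below substitutes discrete second differences as a stand-in curvature measure; every definition here is an assumption, not the patented formula:

```python
import numpy as np

def time_ppg_features(freq_spec, n):
    """Reconstruct a time-domain signal from a (possibly filtered) spectrum
    via IFFT and compute rough curvature-style features. Second differences
    serve as a discrete curvature proxy."""
    sig = np.fft.irfft(freq_spec, n=n)
    sig = (sig - sig.min()) / (np.ptp(sig) + 1e-12)  # normalize to [0, 1]
    curv = np.diff(sig, 2)                           # discrete curvature proxy
    upslope = curv[curv > 0]
    downslope = curv[curv < 0]
    return {
        "curvature_index": float(np.abs(curv).mean()),
        "mean_upslope_dev": float(upslope.std()) if upslope.size else 0.0,
        "mean_downslope_dev": float(downslope.std()) if downslope.size else 0.0,
    }

fs, n = 30.0, 600
t = np.arange(n) / fs
spec = np.fft.rfft(np.sin(2 * np.pi * 1.2 * t))  # spectrum of a 1.2 Hz pulse
feats = time_ppg_features(spec, n)
print(sorted(feats))
```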
  • the plurality of features extracted from each grid of each ROI for the multiple frames can be averaged for each feature for each grid.
  • the average of the plurality of features can be represented in a form of a feature vector.
  • the feature vector for each ROI can be fed as input to a Machine Learning (ML) based classification model.
  • the temperature estimation unit 104 can classify the estimated temperature of the subject, based on the feature vector, into three classes, viz., hot temperature, cold temperature, and normal temperature.
  • the temperature estimation unit 104 can determine an absolute value of the body temperature using a plurality of ML based regression models.
  • the regression models can utilize probability distribution of the plurality of classes of the estimated body temperature and estimate the absolute body temperature by combining the results of the plurality of regression models.
  • the temperature estimation unit 104 can determine an absolute value of the body temperature using multiple regression models.
  • the regression models can utilize the probability distribution of the three classes, viz., hot, cold, and normal, of the body temperature and estimate the absolute body temperature by combining the results of the regression models.
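  • The classify-then-regress scheme above can be sketched with off-the-shelf models. Everything here is illustrative (model choices, class thresholds, and synthetic data are assumptions); only the probability-weighted blending of per-class regressors mirrors the text:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression

# Toy data: synthetic feature vectors with temperatures; class order is
# cold / normal / hot. All numbers are made up for the sketch.
rng = np.random.default_rng(2)
X = rng.normal(size=(120, 8))
temps = 36.5 + X[:, 0] * 1.2                # synthetic ground-truth temperature
labels = np.digitize(temps, [36.0, 37.5])   # 0=cold, 1=normal, 2=hot

clf = RandomForestClassifier(random_state=0).fit(X, labels)
# One regression model per temperature class, trained on that class's samples.
regs = [LinearRegression().fit(X[labels == c], temps[labels == c])
        for c in range(3)]

def estimate_temperature(x):
    """Blend per-class regressors, weighted by the classifier's
    class-probability distribution, to get an absolute temperature."""
    proba = clf.predict_proba(x.reshape(1, -1))[0]
    preds = np.array([r.predict(x.reshape(1, -1))[0] for r in regs])
    return float(proba @ preds)

est = estimate_temperature(X[0])
print(round(est, 1), round(temps[0], 1))
```

  • Weighting by the class probabilities, rather than committing to the single most likely class, keeps the estimate smooth near class boundaries.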
  • the monitoring unit 105 can determine a pattern of variation of the heart beat frequency, breathing frequency, and the measured body temperature. If the subject or a user of the device 100 configures the camera unit 101 to periodically capture media frames, the body temperature is measured periodically. Due to the periodic measurements, the monitoring unit 105 is able to detect variations in the heart beat frequency, breathing frequency, and body temperature. The monitoring unit 105 can correlate the variations between the heart beat frequency, breathing frequency, and the measured body temperature. The monitoring unit 105 can raise an alert if the measured body temperature is determined to be outside a predefined normal range, based on this correlation. In an example, the alert can be sent, as a notification or pop-up message, to a smart device configured by the subject or a user, or to a remote device such as a smartphone, smart watch, and so on.
  • the transmitter/receiver unit 106 can transmit the measured body temperature, heart rate, and breathing rate to a nearby device, a remote device, a cloud, a server, and so on, at periodic intervals and/or on pre-defined events occurring (such as the temperature going below a threshold, the temperature going above a threshold, and so on). This can aid in remote health monitoring.
  • FIG. 1 shows exemplary units of the device 100, but it is to be understood that other embodiments are not limited thereto.
  • the device 100 may include fewer or more units.
  • the labels or names of the units are used only for illustrative purposes and do not limit the scope of the invention.
  • One or more units can be combined to perform the same or a substantially similar function in the device 100.
  • FIG. 2 is a flowchart 200 depicting a method for measuring body temperature of a subject using an RGB and/or NIR camera, according to embodiments as disclosed herein.
  • the method includes preprocessing a plurality of frames of a media captured by an RGB camera, a NIR camera, or both RGB and NIR cameras.
  • the preprocessing comprises of localizing a facial region of the subject in each of the plurality of frames and detecting facial landmarks on the localized facial region.
  • the localized facial region in each of the plurality of frames can be considered as ROI.
  • five different parts of the ROI viz., forehead, left cheek, right cheek, left chin, and right chin, can be detected.
  • the embodiments include splitting each of the ROIs into an R channel, a G channel, a B channel, and an IR channel.
  • the embodiments include extracting signals from imaging data in each of the channels.
  • the signals can be fed to a Gaussian filter for removing Gaussian noise, in order to de-noise the signals.
  • the signals can be converted to frequency domain and high frequency components can be removed.
  • the embodiments include detecting peaks in the frequency domain representation of the signals. The lower-frequency peak corresponds to the breathing frequency. The subsequent peak corresponds to the heart beat frequency.
  • the signals can be again converted to time domain.
  • the embodiments include converting the signals into imaging data that can correspond to preprocessed R, G, B, and/or IR channels of each of the (preprocessed) ROIs.
  • the method includes extracting features from each of the R, G, B, and/or IR channels of the preprocessed frames, within the ROIs.
  • the features can comprise statistical features, features obtained after Gaussian filtering, frequency domain PPG features, and time domain PPG features.
  • the statistical features can be obtained from each of the facial regions, viz., forehead, left cheek, right cheek, left chin, and right chin, in each of the R, G, B, and/or IR channels of the preprocessed ROIs.
  • the imaging data in each of the R, G, B, and/or IR channels can be split into a predefined number of blocks. Each block can be split into grids of predefined size.
  • the embodiments include performing Gaussian filtering on the grids.
  • the embodiments include performing an FFT on each of the grids, wherein the FFT is performed after extracting time domain signals based on the imaging data in the grids. Once the FFT is performed, the frequency domain PPG features can be extracted from the FFT of the grids. The frequency domain PPG features are relevant to the breathing frequency and frequency of heart beat.
  • the embodiments include performing an IFFT to reconstruct the time domain signals representing the imaging data of the grids. Once the IFFT is performed, the time domain PPG features can be extracted from the reconstructed signals.
  • the method includes determining the body temperature based on the extracted features using ML based classification and regression models.
  • the classifier can classify the body temperature as hot, cold, or normal by determining a conditional probability distribution function.
  • the regression model can determine the absolute temperature using the extracted features.
  • the embodiments include notifying the subject or other user once the body temperature is determined (measured).
  • the embodiments include monitoring the heart rate and the breathing rate of the subject and detecting variations in the heart rate and the breathing rate.
  • the variations can be correlated with the measured body temperature, in order to ascertain whether a rise or fall in the body temperature is due to an ailment or health issue. If the rise or fall in the body temperature is due to an ailment, the subject is likely to be warned or notified. The subject may not be notified if the rise or fall in the body temperature is transient. In an example, consider that the measured body temperature is classified as high.
  • the embodiments include ascertaining whether the high value of the body temperature is due to an ailment such as fever or a transient high temperature due to performance of a rigorous exercise.
  • the embodiments may not consider transient changes in the breathing rate and the heart rate alone as a true indicator of fever, since other factors such as weather conditions, emotional changes, exercise, and so on, may also cause the heart rate to increase.
  • the embodiments can check whether the increase in the heart rate is persistent.
  • the embodiments can periodically measure the body temperature.
  • the embodiments may identify the subject as having a fever and provide an alert for fever if the breathing rate and the heart rate are higher than those when the subject is resting, and if the increase in the heart rate and breathing rate is persistent.
  • the device 100, placed in the vicinity of the subject, can be configured to periodically capture videos of the subject and measure the body temperature of the subject.
  • the device 100 can be used for health monitoring as the device 100 can extract features from the ROIs in the frames of the captured video and measure the body temperature of the subject based on the extracted features.
  • the device 100 can provide an alert if it is determined that the body temperature is not in the normal range.
  • the various actions in the flowchart 200 may be performed in the order presented, in a different order, or simultaneously. Further, in some embodiments, some actions listed in FIG. 2 may be omitted.
  • FIGS. 3a, 3b and 3c depict classification of the body temperature based on blood pulsation at the heart beat frequency, according to embodiments as disclosed herein.
  • de-noising and filtering are applied for each of the parts of the facial region in each of the R, G, B, and/or IR channels of each ROI.
  • a series of signal processing techniques is applied to obtain a correlation matrix, which depicts a correlation between the blood pulsation at the heart beat frequency and the body temperature.
  • as depicted in FIG. 3a, for normal body temperature (body temperature measured as normal), the value of correlation is low.
  • as depicted in FIG. 3b, for cold body temperature (body temperature measured as cold), the value of correlation is lower than that for normal body temperature.
  • as depicted in FIG. 3c, for hot body temperature (body temperature measured as hot), the value of correlation is high around the cheek region and the eye region.
  • the high values of correlation can indicate that the amount of blood pulsation is proportional to body temperature.
  • FIGS. 4a and 4b depict correlation between blood perfusion of a subject and distance between the subject and the device 100, according to embodiments as disclosed herein.
  • the blood pulsation correlation matrix plots for the same subject seated at different distances from the device 100 have been depicted.
  • the accuracy of measured body temperature can be robust to variations in distance between the subject and the device 100.
  • the values of correlation can be measured when the distance between the subject and the device 100 is 47 cm (FIG. 4a) and 95 cm (FIG. 4b).
  • the variation in the values of correlation with respect to the distance between the subject and the device 100 is minimal.
  • the variation in the correlation values due to the variation of distance between the subject and the device 100 is negligible.
  • FIG. 5 is a graph depicting a variation of pixel values in different lighting conditions after application of Gaussian filtering, according to embodiments as disclosed herein.
  • the accuracy of measured body temperature can be robust to variations in lighting conditions, in which the subject and the device 100 are present.
  • the device 100 captures the video in different lighting conditions.
  • the values of pixels can be different due to the variation in the lighting conditions.
  • the difference in the values of the pixels can be neutralized by the application of Gaussian filtering on the grids of the blocks in the facial region (ROI) of each of the R, G, B, and IR channels.
  • the superimposed plots, as depicted in FIG. 5, represent the pixel values in different lighting conditions.
  • the difference in the pixel values is negligible.
  • the embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements.
  • the network elements shown in FIG. 1 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
  • the embodiments disclosed herein describe methods and systems for measuring body temperature of a user using at least one of an RGB and an NIR camera. Therefore, it is understood that the scope of the protection is extended to such a program and in addition to a computer readable means having a message therein, such computer readable storage means contain program code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device.
  • the method is implemented in a preferred embodiment through or together with a software program written in, e.g., Very high speed integrated circuit Hardware Description Language (VHDL) or another programming language, or implemented by one or more VHDL or software modules being executed on at least one hardware device.
  • the hardware device can be any kind of portable device that can be programmed.
  • the device may also include means which could be e.g. hardware means like e.g. an ASIC, or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein.
  • the method embodiments described herein could be implemented partly in hardware and partly in software.
  • the invention may be implemented on different hardware devices, e.g. using a plurality of CPUs.

Abstract

Embodiments herein disclose a method and device for measuring body temperature using a camera in the device. The camera is an RGB and/or an NIR camera. A plurality of frames of a media captured by the RGB and/or NIR camera is preprocessed. A plurality of features, comprising statistical features and time domain and frequency domain based PPG features, is extracted from the preprocessed frames. The features can be extracted from an R channel, a G channel, a B channel, and/or an IR channel in each of the plurality of frames, wherein the features are extracted, from each of the R, G, B, and/or IR channels, within ROIs in each of the plurality of frames. The body temperature is measured using classifiers and regression models, based on the extracted plurality of features. The body temperature is periodically measured, monitored, and transmitted to nearby/remote devices.

Description

METHOD AND APPARATUS FOR MEASURING BODY TEMPERATURE USING A CAMERA
Embodiments herein relate to non-contact temperature estimation, and more particularly to methods and systems for estimating body temperature based on at least one feature identified using at least one frame captured by a camera.
Body temperature of a subject can be measured using devices such as thermometers, which require contact of the device with the skin of the subject. Measuring the body temperature using such devices may be inconvenient, as the measurement is received only after a certain period of time. Besides, the usage of devices requiring contact with the skin for temperature measurement can lead to the spreading of infections. In light of such problems, products/solutions which do not require contact with the skin for measurement of the body temperature, and which can provide the measurement of temperature instantly, have gained interest amongst users and clinicians. However, such products or solutions can be expensive, sensitive, and/or bulky. The products/solutions are likely to provide the measurement of temperature only if they are situated close enough to the subject, and are prevalent as standalone devices.
For example, non-contact based infrared thermometers can measure the temperature of a subject if the thermometer is situated at a distance of less than 5 cm from the subject. If the distance increases, the accuracy can be severely affected.
In another example, Forward Looking Infrared (FLIR) thermal imaging cameras (operating in the range of 3000nm-12000nm and sensitive to human body radiations) can be used for estimating non-contact body temperature. However, such FLIR cameras can be expensive and bulky.
The principal object of the embodiments herein is to disclose methods and systems for measuring body temperature of a subject using a Red Green Blue (RGB) and/or a Near Infrared (NIR) camera, wherein the body temperature of the subject can be measured automatically, in real-time, and without requirement of having any contact or proximity with the subject.
Another object of the embodiments herein is to utilize at least one feature, which can be identified from at least one frame captured using the RGB and/or NIR camera, and signal processing methods, to estimate the body temperature of the subject.
Accordingly, the embodiments provide methods and systems for measuring body temperature of a subject using at least one of a Red Green Blue (RGB) and a Near Infrared (NIR) camera. In an embodiment, the RGB and the NIR camera can be included in a device. The embodiments include automatically measuring the body temperature of the subject, in real-time, without requirement of having any contact or proximity with the subject.
In accordance with an aspect of the present disclosure, an embodiment herein discloses a method for measuring body temperature of a subject. The method comprises performing, by at least one processor of a device, a preprocessing of a plurality of frames of a media captured by at least one camera in the device; extracting, by the at least one processor, a plurality of features, comprising photoplethysmography, PPG, features and statistical features, from at least one of an R channel, a G channel, a B channel, and an IR channel, in each of the plurality of frames, wherein the plurality of features are extracted from within regions of interest, ROIs, in each of the plurality of frames; and measuring, by the at least one processor, the body temperature through at least one machine learning, ML, based classifier and at least one ML based regression model, based on the plurality of features.
In accordance with another aspect of the present disclosure, an embodiment herein discloses a device for measuring body temperature of a subject. The device comprises a memory, and at least one processor coupled to the memory. The memory stores instructions that, when executed by the at least one processor, cause the at least one processor to perform a preprocessing of a plurality of frames of a media captured by at least one camera; extract a plurality of features, comprising imaging photoplethysmography, PPG, features and statistical features, from at least one of an R channel, a G channel, a B channel, and an IR channel, in each of the plurality of frames, wherein the plurality of features are extracted from within regions of interest, ROIs, in each of the plurality of frames; and measure the body temperature through at least one machine learning, ML, based classifier and at least one ML based regression model, based on the plurality of features.
In accordance with another aspect of the present disclosure, an embodiment herein discloses a method for estimating body temperature of a subject. The method comprises capturing, by at least one processor of a device, a media of the subject using at least one of a red green blue, RGB, camera and a near infrared, NIR, camera of the device; estimating, by the at least one processor, a heart rate of the subject based on a frequency domain analysis of a plurality of frames of the captured media; extracting, by the at least one processor, a plurality of features captured by at least one of the RGB camera and the NIR camera from a facial region of the subject in the captured media; and estimating, by the at least one processor, the body temperature of the subject by analyzing the plurality of features and the estimated heart rate of the subject.
In accordance with another aspect of the present disclosure, an embodiment herein discloses a device for estimating body temperature of a subject. The device comprises a memory, and at least one processor coupled to the memory. The memory stores instructions that, when executed by the at least one processor, cause the at least one processor to capture a media of the subject using at least one of a red green blue, RGB, camera and a near infrared, NIR, camera of the device; estimate a heart rate of the subject based on a frequency domain analysis of a plurality of frames of the captured media; extract a plurality of features captured by at least one of the RGB camera and the NIR camera from a facial region of the subject in the captured media; and estimate the body temperature of the subject by analysis of the plurality of features and the estimated heart rate of the subject.
In some embodiments, features such as statistical features, Gaussian filtering based features, and time domain and frequency domain Photoplethysmography (PPG) features, can be identified using the captured RGB and/or NIR frames. The embodiments include utilizing a Machine Learning (ML) based temperature classification model to classify the temperature, measured based on the identified features, into at least three classes, viz., hot, cold, and normal. The embodiments include utilizing an ML regression based temperature estimation model to ascertain an exact measurement of the body temperature of the subject based on the temperature class obtained from the aforementioned ML based classification model. If the measured body temperature is determined (based on predefined statistical and empirical estimates) to be extremely low or high, the embodiments include ascertaining whether the measurement is influenced by conditions such as physical activity, exposure to heat, and so on. If the measurement is influenced by such conditions, the embodiments include measuring the body temperature at periodic intervals. The embodiments include notifying the subject of the measured body temperature. The embodiments include transmitting the measured temperature to a remote device.
These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
Embodiments herein are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
FIG. 1 depicts various units of a device for measuring body temperature of a subject using a Red Green Blue (RGB) and/or Near Infrared (NIR) camera, according to embodiments as disclosed herein;
FIG. 2 is a flowchart depicting a method for measuring body temperature of the subject using the RGB and/or NIR camera, according to embodiments as disclosed herein;
FIGS. 3a, 3b and 3c depict classification of the body temperature based on blood pulsation at heart beat frequency, according to embodiments as disclosed herein;
FIGS. 4a and 4b depict correlation between blood perfusion of a subject and distance between the subject and the device, according to embodiments as disclosed herein; and
FIG. 5 is a graph depicting a variation of pixel values in different lighting conditions after application of Gaussian filtering, according to embodiments as disclosed herein.
The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
Embodiments herein disclose methods and systems for measuring body temperature of a subject using at least one of a Red Green Blue (RGB) and a Near Infrared (NIR) camera. The embodiments include performing a preprocessing of a plurality of frames of a media captured by at least one of the RGB camera and the NIR camera. The preprocessing can include localizing a facial region of the subject in each of the plurality of frames, wherein the localized facial regions in the frames can be considered as Regions of Interest (ROIs); and applying noise removal techniques, such as Gaussian filtering, on the ROIs in each of the plurality of frames to smoothen edges of blocks in the ROIs. The embodiments include extracting a plurality of features comprising statistical features, and time domain and frequency domain based Photoplethysmography (PPG) features. The features can be extracted from a single channel or any combination of an R channel, a G channel, a B channel, and an IR channel, in each of the ROIs, in each of the plurality of frames. The embodiments include measuring the body temperature through classifiers and regression models based on the extracted plurality of features.
Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
FIG. 1 depicts various units of a device 100 configured to measure body temperature of a subject using an RGB and/or NIR camera, according to embodiments as disclosed herein. As depicted in FIG. 1, the device 100 comprises a camera unit 101, a preprocessing unit 102, a feature extraction unit 103, a temperature estimation unit 104, a monitoring unit 105 and a transmitter/receiver unit 106. Examples of the device 100 can be, but not limited to, a smart phone, a smart television, an Internet of Things (IoT) device, a smart-robot, a wearable computing device, or any other device capable of capturing media frames. The device 100 includes at least one of the RGB camera and the NIR camera, capable of capturing media such as images and videos. The device 100 does not require contact with the subject to measure the body temperature of the subject. In an embodiment herein, the device 100 further comprises a memory and at least one processor coupled to the memory. The memory stores instructions to be executed by the at least one processor. The at least one processor is configured to execute instructions stored in the memory and to perform various other processes. For example, at least one of the camera unit 101, the preprocessing unit 102, the feature extraction unit 103, the temperature estimation unit 104, the monitoring unit 105 and the transmitter/receiver unit 106 may be controlled by the at least one processor, when the instructions are executed by the processor.
The camera unit 101 can comprise of at least one of an RGB camera and an NIR camera. The camera unit 101 can capture a media of the subject, wherein the media comprises of at least one frame. In an embodiment herein, the capturing can be triggered on detecting at least one predefined event occurring. In an embodiment herein, the capturing can be triggered at specific times based on preconfigured commands provided to the camera unit 101 by the subject or other users.
For example, if the subject perceives any issues with health, the subject can capture a media, which can be used to measure the temperature of the body. In another example, consider that the subject is having a fever ailment. In such an instant, the subject or the other users may configure the camera unit 101 to capture media of the subject at the specific times for periodically measuring the body temperature of the subject. In another example, consider that the subject is having a fever ailment. In such an instant, the subject or the other users may configure the camera unit 101 to capture media of the subject on detecting that the subject has stopped moving.
The preprocessing unit 102 can extract a plurality of frames from the captured media. Each of the plurality of frames can be processed sequentially or in parallel. Consider that the frames are processed in sequence. In an embodiment, the preprocessing unit 102 can perform a facial detection followed by skin detection. The facial detection can be refined in order to remove regions that cover (a portion of) the facial skin (such as hair or a cloth). The facial region in each of the plurality of frames can be considered as a Region of Interest (ROI). In an embodiment, the preprocessing unit 102 can, for each of the plurality of frames, localize a facial region by interpolating from the first frame, using a Haar cascade based or any other face detector model. The preprocessing unit 102 can detect key facial structures, i.e., facial landmarks, on the localized facial region. In an embodiment, the preprocessing unit 102 can use the facial landmarks to capture different parts of the face such as forehead, left cheek, right cheek, left chin, right chin, and so on.
Once the skin detection has been performed, the preprocessing unit 102 can perform de-channeling. The ROI pixels in each of the plurality of frames can be segregated into an R (Red) channel, a G (Green) channel, a B (Blue) channel, and/or an Infrared (IR) channel. The preprocessing unit 102 can extract signals from imaging data in each of the R, G, B, and/or IR channels. The preprocessing unit 102 can subject the signals in the different channels to de-noising, motion reduction and filtering to obtain clean versions of the signals. In an embodiment, low pass filtering can be applied to each signal (signals extracted from each of the channels in each ROI) to remove Gaussian noise in each signal using Gaussian kernels.
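The Gaussian low pass filtering described above can be sketched as follows. This is a minimal illustration rather than the disclosed implementation: the kernel width `sigma`, the sampling rate, and the synthetic 1.2 Hz pulsation are assumptions made for the example.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """Discrete Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def denoise(signal, sigma=2.0):
    """Low pass filter a per-channel signal to suppress Gaussian noise."""
    kernel = gaussian_kernel(sigma, radius=int(3 * sigma))
    return np.convolve(signal, kernel, mode="same")

# Illustrative use: a slow pulsation corrupted by sensor noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 300)
clean = np.sin(2 * np.pi * 1.2 * t)            # ~72 bpm pulsation
noisy = clean + 0.3 * rng.standard_normal(t.size)
smoothed = denoise(noisy)
```

The kernel radius of three standard deviations is a common truncation choice; it keeps the smoothing window short relative to the pulsation period so the PPG waveform itself is largely preserved.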
The preprocessing unit 102 can convert the signals belonging to the channels of each ROI into frequency domain, to obtain a frequency/power spectrum, by performing a Fast Fourier Transform (FFT). Thereafter, based on the frequency/power spectrum, the high frequency components can be removed (filtered). The preprocessing unit 102 can perform peak detection in the frequency domain to determine the heart rate (number of heart beats per minute) and the breathing rate (number of breaths taken by the subject in a minute) of the subject. A first peak located at the low frequency end of the spectrum can represent the breathing frequency. The breathing frequency, expressed in breaths per minute, corresponds to the breathing rate. A second peak (at a frequency higher than that of the first peak) can represent the frequency of heart beat. The frequency of heart beat, expressed in beats per minute, corresponds to the heart rate.
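The frequency domain peak detection described above can be sketched as follows. The 30 fps frame rate and the synthetic breathing (0.25 Hz) and heart beat (1.2 Hz) components are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

fs = 30.0                       # camera frame rate (assumed), Hz
t = np.arange(0, 20, 1 / fs)    # 20 s of per-frame channel averages
# Synthetic channel signal: breathing at 0.25 Hz, heart beat at 1.2 Hz.
signal = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.5 * np.sin(2 * np.pi * 1.2 * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

spectrum[0] = 0.0                         # discard the DC component
first_peak = int(np.argmax(spectrum))     # breathing frequency bin
breathing_hz = freqs[first_peak]

# Mask the breathing peak and its neighbourhood, then find the next peak.
masked = spectrum.copy()
masked[: first_peak + 3] = 0.0
heart_hz = freqs[int(np.argmax(masked))]

breathing_rate = breathing_hz * 60        # breaths per minute
heart_rate = heart_hz * 60                # beats per minute
```

For the synthetic input above this recovers roughly 15 breaths per minute and 72 beats per minute, matching the ordering in the text: the lowest-frequency dominant peak is breathing and the next peak is the heart beat.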
The preprocessing unit 102 can convert the signals in the frequency domain into the time domain, by performing an Inverse Fast Fourier Transform (IFFT). Thereafter, an average value of pixels (average of the values of the R, G, B, and IR channels) of each ROI (in each frame) can be computed.
The feature extraction unit 103 can extract a plurality of features. The feature extraction unit 103 can capture different parts of the face such as forehead, left cheek, right cheek, left chin and right chin for each of the R, G, B, and/or IR channels of each ROI. It is to be noted that the signals that have been initially extracted from the R, G, B, and/or IR channels, and subjected to de-noising and filtering (preprocessing), have been converted to obtain imaging data. The feature extraction unit 103 can capture different parts of the face using the imaging data in each of the R, G, B, and/or IR channels.
The feature extraction unit 103 can eliminate a predefined number of frames (each of the frames including the ROIs) at the beginning and end of the media that might be subjected to adjustment or movement errors of the subject. The feature extraction unit 103 can extract statistical features in each of the different parts of the facial region in each of the R, G, B, and/or IR channels of each ROI. Examples of the statistical features can be, but not limited to, mean value, median value, skewness value, Kurtosis index value, 10-percentile value, 25-percentile value, 75-percentile value, 40-percentile value, 60-percentile value, 90-percentile value, standard deviation value, minimum value, maximum value, and mode. Thus, the feature extraction unit 103 can determine a plurality of statistical features in each of the different parts of the facial region. As five parts of the facial region, viz., the forehead region, left cheek region, right cheek region, left chin region and right chin region, are captured, the feature extraction unit 103 can extract a plurality of statistical features for each of the R, G, B, and/or IR channels in each ROI.
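The per-part statistical features listed above can be computed as in the following sketch. The moment-based skewness and kurtosis definitions are one common convention, and mode is omitted here for brevity; the synthetic forehead patch is an illustrative assumption.

```python
import numpy as np

def statistical_features(pixels):
    """Statistical features for one facial part in one channel."""
    p = np.asarray(pixels, dtype=float).ravel()
    mu, sd = p.mean(), p.std()
    centred = p - mu
    return {
        "mean": mu,
        "median": np.median(p),
        "std": sd,
        "min": p.min(),
        "max": p.max(),
        "skewness": (centred ** 3).mean() / sd ** 3,   # third standardized moment
        "kurtosis": (centred ** 4).mean() / sd ** 4,   # fourth standardized moment
        **{f"p{q}": np.percentile(p, q) for q in (10, 25, 40, 60, 75, 90)},
    }

# Illustrative use: features of a hypothetical forehead patch in the G channel.
rng = np.random.default_rng(0)
forehead_g = rng.integers(80, 160, size=(40, 60))
features = statistical_features(forehead_g)
```

The same function would be applied to each of the five facial parts in each of the R, G, B, and/or IR channels, yielding one feature dictionary per part per channel.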
The feature extraction unit 103 can segment the imaging data in each of the R, G, B, and/or IR channels in each preprocessed ROI into a plurality of blocks. In an embodiment, the imaging data in each of the R, G, B, and/or IR channels, of each of the preprocessed ROIs, can be segmented into 9 blocks. Each of the blocks can be divided into a plurality of grids, wherein each grid is of a predefined size. In an example, consider that each grid is of size 10x10. The feature extraction unit 103 can perform Gaussian filtering to smoothen the edges of each of the grids (blocks). The feature extraction unit 103 can determine statistical features in each of the grids once the Gaussian filtering is performed. Examples of the statistical features can be, but not limited to, mean value, median value, skewness value, Kurtosis index value, standard deviation value, minimum value, maximum value, and mode. Thus, a plurality of features is extracted from each of the R, G, B, and/or IR channels of each of the preprocessed ROIs.
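The block-and-grid segmentation described above can be sketched as follows; the 90x90 channel size is an assumption chosen so that the 9-block, 10x10-grid layout divides evenly. Per-grid Gaussian smoothing, as described earlier for the channel signals, would be applied before the per-grid statistics are computed.

```python
import numpy as np

def split_into_grids(channel, n_blocks=3, grid=10):
    """Segment one channel of an ROI into n_blocks x n_blocks blocks,
    then tile each block into grid x grid cells."""
    h, w = channel.shape
    bh, bw = h // n_blocks, w // n_blocks
    grids = []
    for by in range(n_blocks):
        for bx in range(n_blocks):
            block = channel[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw]
            for gy in range(0, bh - grid + 1, grid):
                for gx in range(0, bw - grid + 1, grid):
                    grids.append(block[gy:gy + grid, gx:gx + grid])
    return grids

# Illustrative use: a 90x90 channel -> 9 blocks of 30x30 -> 81 grids of 10x10.
roi_channel = np.random.default_rng(1).random((90, 90))
grids = split_into_grids(roi_channel)
per_grid_mean = [g.mean() for g in grids]
```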
The feature extraction unit 103 can perform an FFT on each of the grids in each of the blocks in each of the R, G, B, and/or IR channels of each of the preprocessed ROIs. The feature extraction unit 103 can extract features from the frequency spectrum of the grids, obtained using the FFT. The features are relevant to the heart rate and the breathing rate of the subject. The features can be considered as frequency domain PPG features. The result of applying FFT on each grid provides values at different harmonics of a fundamental frequency of a discrete time signal obtained from grid pixel data.
Examples of the features can be, but not limited to, frequency of a peak with highest amplitude (breathing frequency), the highest amplitude, frequency of a peak with a second highest amplitude (heart beat frequency), the second highest amplitude, frequencies of five higher order harmonics of the heart beat frequency, frequencies of five lower order harmonics of the heart beat frequency, amplitudes of the five higher order harmonics and the five lower order harmonics, and mean of amplitudes of first 7 harmonics of the fundamental frequency of the FFT. Thus, a plurality of frequency domain PPG features is extracted from each of the R, G, B, and/or IR channels of each of the preprocessed ROIs.
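A subset of the frequency domain PPG features above can be extracted per grid as in the following sketch. The 30 fps rate, the synthetic grid time series, and the restriction to five harmonics of the heart beat frequency are illustrative assumptions.

```python
import numpy as np

def frequency_domain_ppg_features(grid_signal, fs=30.0, n_harmonics=5):
    """Frequency domain PPG features of one grid's time series."""
    spec = np.abs(np.fft.rfft(grid_signal))
    freqs = np.fft.rfftfreq(len(grid_signal), d=1 / fs)
    spec[0] = 0.0
    i1 = int(np.argmax(spec))                  # highest peak (breathing)
    spec2 = spec.copy()
    spec2[max(0, i1 - 2): i1 + 3] = 0.0
    i2 = int(np.argmax(spec2))                 # second peak (heart beat)
    f_hb = freqs[i2]
    df = freqs[1]                              # frequency resolution per bin
    # Amplitudes at harmonics of the heart beat frequency.
    harmonic_amps = [
        spec[min(int(round(k * f_hb / df)), len(spec) - 1)]
        for k in range(1, n_harmonics + 1)
    ]
    return {
        "breathing_freq": freqs[i1], "breathing_amp": spec[i1],
        "heartbeat_freq": f_hb, "heartbeat_amp": spec[i2],
        "harmonic_amps": harmonic_amps,
    }

# Illustrative use on a synthetic grid time series.
fs = 30.0
t = np.arange(0, 20, 1 / fs)
grid_signal = np.sin(2 * np.pi * 0.25 * t) + 0.5 * np.sin(2 * np.pi * 1.2 * t)
feats = frequency_domain_ppg_features(grid_signal)
```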
The feature extraction unit 103 can perform an IFFT on each of the grids in each of the blocks to convert the signal from frequency domain to time domain. The feature extraction unit 103 can extract features from time domain signals representing the grids, obtained using the IFFT. The features can be considered as time domain PPG features. The time domain signals can be normalized for both real and imaginary parts. In an embodiment, the features extracted from the time domain signals are curvature index of the time domain signals, mean of upslope deviation of curvature points and mean of downslope deviation of curvature points. Thus, a plurality of time domain PPG features are extracted from each of the R, G, B, and/or IR channels of each of the preprocessed ROIs.
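The disclosure does not define the curvature index or the slope deviation features precisely, so the sketch below is one plausible reading: curvature from the second derivative of the normalized waveform, and slope deviations as the spread of the positive and negative first differences.

```python
import numpy as np

def time_domain_ppg_features(waveform):
    """One plausible reading of the time domain PPG features:
    curvature index and upslope/downslope deviation statistics."""
    w = np.asarray(waveform, dtype=float)
    w = (w - w.min()) / (w.max() - w.min())      # normalize to [0, 1]
    slope = np.gradient(w)                       # first derivative
    curvature = np.gradient(slope)               # second derivative
    return {
        "curvature_index": float(np.mean(np.abs(curvature))),
        "upslope_dev": float(np.std(slope[slope > 0])),
        "downslope_dev": float(np.std(slope[slope < 0])),
    }

# Illustrative use on a proxy for an IFFT-reconstructed pulse waveform.
t = np.linspace(0, 2, 120)
reconstructed = np.sin(2 * np.pi * 1.2 * t)
feats = time_domain_ppg_features(reconstructed)
```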
The plurality of features extracted from each grid of each ROI for the multiple frames can be averaged for each feature for each grid. The average of the plurality of features can be represented in a form of a feature vector. There can be either a feature vector for each grid or there can be a combined feature vector for each grid in each ROI. In the embodiment, the feature vector for each ROI can be fed as input to a Machine Learning (ML) based classification model.
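The averaging and feature vector assembly described above can be sketched as follows; the frame, grid, and feature counts are illustrative assumptions.

```python
import numpy as np

def roi_feature_vector(per_frame_grid_features):
    """per_frame_grid_features: array of shape (frames, grids, features).
    Average each feature over the frames, then flatten the per-grid
    vectors into one combined feature vector for the ROI."""
    avg = np.asarray(per_frame_grid_features).mean(axis=0)
    return avg.ravel()

# Illustrative use: 150 frames, 81 grids, 12 features per grid.
frames, n_grids, n_feat = 150, 81, 12
rng = np.random.default_rng(2)
vec = roi_feature_vector(rng.random((frames, n_grids, n_feat)))
```

The resulting vector is what would be fed to the ML based classification model.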
In an example herein, the temperature estimation unit 104 can classify the estimated temperature of the subject, based on the feature vector, into three classes, viz., hot temperature, cold temperature and normal temperature.
The temperature estimation unit 104 can determine an absolute value of the body temperature using a plurality of ML based regression models. The regression models can utilize probability distribution of the plurality of classes of the estimated body temperature and estimate the absolute body temperature by combining the results of the plurality of regression models.
In an example herein, the temperature estimation unit 104 can determine an absolute value of the body temperature using multiple regression models. The regression models can utilize the probability distribution of the three classes, viz., hot, normal and cold, of the body temperature and estimate the absolute body temperature by combining the results of the regression models.
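One way to combine class probabilities with per-class regressors, as described above, is a probability-weighted sum of the per-class temperature estimates. Everything in this sketch is hypothetical: the class probabilities, the linear regressor weights, and the per-class bias temperatures are made-up values for illustration, not trained models from the disclosure.

```python
import numpy as np

# Hypothetical classifier output: P(cold), P(normal), P(hot) for one ROI.
class_probs = np.array([0.05, 0.15, 0.80])

def predict_temperature(features, weights, bias):
    """Hypothetical linear regressor mapping a feature vector to a
    temperature estimate in degrees Celsius."""
    return float(features @ weights + bias)

rng = np.random.default_rng(3)
features = rng.random(8)
regressors = [
    (rng.normal(0, 0.01, 8), 36.0),   # cold-class regressor (made-up weights)
    (rng.normal(0, 0.01, 8), 36.8),   # normal-class regressor
    (rng.normal(0, 0.01, 8), 38.5),   # hot-class regressor
]
per_class = [predict_temperature(features, w, b) for w, b in regressors]
# Probability-weighted combination of the per-class estimates.
body_temp = float(np.dot(class_probs, per_class))
```

With the dominant "hot" probability above, the combined estimate lands near the hot-class regressor's output, which mirrors the intent of conditioning the regression on the classifier's distribution.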
The monitoring unit 105 can determine a pattern of variation of the frequency of heart beat, breathing frequency and the measured body temperature. If the subject or a user of the device 100 configures the camera unit 101 to periodically capture media frames, the body temperature is measured periodically. Due to the periodic measurements, the monitoring unit 105 will be able to detect the variations in the heart beat, breathing frequency and the body temperature. The monitoring unit 105 can correlate the variations between the frequency of heart beat, breathing frequency and the measured body temperature. The monitoring unit 105 can raise an alert if the measured body temperature is determined as not within a predefined normal range, based on the correlation between the frequency of heart beat, breathing frequency and the measured body temperature. In an example, the alert can be sent to the subject or a user-configured smart device or a remote device such as a smartphone, smart watch, and so on, as a notification or pop-up message.
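The persistence check used to separate a genuine fever from a transient rise (e.g. after exercise) can be sketched as follows. The normal temperature range, resting heart rate, and the three-reading persistence threshold are illustrative assumptions, not values specified in the disclosure.

```python
from collections import deque

NORMAL_RANGE = (36.1, 37.2)   # assumed normal range, degrees Celsius

def should_alert(temp_history, hr_history, resting_hr, persist=3):
    """Alert only when the latest temperature is out of range AND the
    heart rate has stayed above the resting level for `persist`
    consecutive readings, filtering out transient spikes."""
    out_of_range = not (NORMAL_RANGE[0] <= temp_history[-1] <= NORMAL_RANGE[1])
    persistent_hr = len(hr_history) >= persist and all(
        hr > resting_hr for hr in list(hr_history)[-persist:]
    )
    return out_of_range and persistent_hr

# Illustrative use: rising temperature with persistently elevated heart rate.
temps = deque([36.8, 37.9, 38.1], maxlen=10)
heart_rates = deque([72, 95, 98, 101], maxlen=10)
alert = should_alert(temps, heart_rates, resting_hr=75)
```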
The transmitter/receiver unit 106 can transmit the measured body temperature, heart rate, and breathing rate to a nearby device, a remote device, cloud, a server, and so on at periodic intervals and/or on pre-defined events occurring (such as the temperature going below a threshold, the temperature going above a threshold, and so on). This can aid in remote health monitoring.
FIG. 1 shows exemplary units of the device 100, but it is to be understood that other embodiments are not limited thereon. In other embodiments, the device 100 may include fewer or more units. Further, the labels or names of the units are used only for illustrative purposes and do not limit the scope of the invention. One or more units can be combined together to perform the same or a substantially similar function in the device 100.
FIG. 2 is a flowchart 200 depicting a method for measuring body temperature of a subject using an RGB and/or NIR camera, according to embodiments as disclosed herein. At step 201, the method includes preprocessing a plurality of frames of a media captured by an RGB camera, an NIR camera, or both RGB and NIR cameras. The preprocessing comprises localizing a facial region of the subject in each of the plurality of frames and detecting facial landmarks on the localized facial region. The localized facial region in each of the plurality of frames can be considered as the ROI. In an embodiment, five different parts of the ROI, viz., forehead, left cheek, right cheek, left chin, and right chin, can be detected.
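A simple way to derive the five facial parts is from a face bounding box returned by any face detector. The sketch below is illustrative only: the geometric fractions are assumptions chosen for demonstration, not values specified in this disclosure.

```python
def facial_rois(face_box):
    """Given a face bounding box (x, y, w, h), return approximate ROIs
    for forehead, left/right cheek and left/right chin as (x, y, w, h)
    tuples. The fractional offsets below are illustrative assumptions.
    """
    x, y, w, h = face_box
    return {
        "forehead":    (x + w // 4,      y,                w // 2, h // 5),
        "left_cheek":  (x + w // 8,      y + 2 * h // 5,   w // 4, h // 5),
        "right_cheek": (x + 5 * w // 8,  y + 2 * h // 5,   w // 4, h // 5),
        "left_chin":   (x + w // 8,      y + 7 * h // 10,  w // 4, h // 6),
        "right_chin":  (x + 5 * w // 8,  y + 7 * h // 10,  w // 4, h // 6),
    }
```

In practice, landmark-based ROIs (e.g. from a facial-landmark detector) would be more robust than fixed fractions of the bounding box.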
The embodiments include splitting each of the ROIs into an R channel, a G channel, a B channel, and an IR channel. The embodiments include extracting signals from the imaging data in each of the channels. The signals can be fed to a Gaussian filter for removing Gaussian noise, in order to de-noise the signals. The signals can be converted to the frequency domain and high-frequency components can be removed. The embodiments include detecting peaks in the frequency domain representation of the signals. The lowest-frequency peak corresponds to the breathing frequency; the next subsequent peak corresponds to the heart beat frequency. The signals can then be converted back to the time domain. The embodiments include converting the signals into imaging data that can correspond to preprocessed R, G, B, and/or IR channels of each of the (preprocessed) ROIs.
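The frequency-domain peak detection described above can be sketched as follows. This is a minimal sketch under assumed physiological bands (breathing approximately 0.1-0.5 Hz, heart beat approximately 0.8-3 Hz); the band limits and function name are illustrative, not taken from the disclosure.

```python
import numpy as np

def breathing_and_heart_freq(signal, fs):
    """Estimate breathing and heart-beat frequencies from a mean-pixel
    time series sampled at fs Hz: take the FFT, then pick the strongest
    peak in each assumed physiological band."""
    signal = np.asarray(signal, dtype=float)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    # Assumed bands: breathing ~0.1-0.5 Hz, heart beat ~0.8-3.0 Hz.
    breath_band = (freqs >= 0.1) & (freqs <= 0.5)
    heart_band = (freqs >= 0.8) & (freqs <= 3.0)
    f_breath = freqs[breath_band][np.argmax(spectrum[breath_band])]
    f_heart = freqs[heart_band][np.argmax(spectrum[heart_band])]
    return f_breath, f_heart
```

A 20-second clip at 30 frames per second gives a frequency resolution of 0.05 Hz, which is adequate to separate the two peaks.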
At step 202, the method includes extracting features from each of the R, G, B, and/or IR channels of the preprocessed frames, within the ROIs. The features can comprise statistical features, features obtained after Gaussian filtering, frequency-domain PPG features and time-domain PPG features. The statistical features can be obtained from each of the facial regions, viz., forehead, left cheek, right cheek, left chin, and right chin, in each of the R, G, B, and/or IR channels of the preprocessed ROIs. The imaging data in each of the R, G, B, and/or IR channels can be split into a predefined number of blocks. Each block can be split into grids of predefined size. The embodiments include performing Gaussian filtering on the grids. Once the filtering is performed, the features obtained after Gaussian filtering can be extracted. The embodiments include performing an FFT on each of the grids, wherein the FFT is performed after extracting time domain signals based on the imaging data in the grids. Once the FFT is performed, the frequency-domain PPG features can be extracted from the FFT of the grids. The frequency-domain PPG features are relevant to the breathing frequency and the frequency of heart beat. The embodiments include performing an IFFT to reconstruct the time domain signals representing the imaging data of the grids. Once the IFFT is performed, the time-domain PPG features can be extracted from the reconstructed signals.
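The block-and-grid statistical feature extraction can be sketched as below. The grid size, the particular statistics computed, and the function name are illustrative assumptions; the Gaussian filtering step is noted but omitted here for brevity.

```python
import numpy as np

def grid_statistical_features(channel, grid=8):
    """Split a single-channel ROI into grid x grid cells and compute
    simple statistical features (mean, std, min, max, median) per cell.
    A Gaussian smoothing pass (e.g. scipy.ndimage.gaussian_filter)
    could be applied to each cell before computing the statistics."""
    h, w = channel.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            cell = channel[i * h // grid:(i + 1) * h // grid,
                           j * w // grid:(j + 1) * w // grid]
            feats.extend([cell.mean(), cell.std(), cell.min(),
                          cell.max(), np.median(cell)])
    return np.array(feats)
```

The per-cell feature vectors from all five facial parts and all channels would then be concatenated into the input for the classifier and regression models.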
At step 203, the method includes determining the body temperature based on the extracted features using ML based classification and regression models. The classifier can classify the body temperature as hot, cold, or normal by determining a conditional probability distribution function. The regression model can determine the absolute temperature using the extracted features. The embodiments include notifying the subject or another user once the body temperature is determined (measured).
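The classify-then-regress step can be illustrated with a toy softmax classifier and a linear regressor. All weights here are illustrative placeholders, not trained models, and the function name is an assumption.

```python
import numpy as np

def classify_and_estimate(features, W, b, reg_w, reg_b):
    """Toy sketch of step 203: a softmax classifier over
    {cold, normal, hot} plus a linear regression for the absolute
    temperature. W, b, reg_w, reg_b stand in for trained parameters."""
    logits = W @ features + b
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    probs = exp / exp.sum()
    temp = float(reg_w @ features + reg_b)
    labels = ("cold", "normal", "hot")
    return labels[int(np.argmax(probs))], probs, temp
```

In a real system the parameters would come from models trained on labeled feature vectors; here they merely demonstrate the data flow from features to a class label and an absolute temperature.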
The embodiments include monitoring the heart rate and the breathing rate of the subject and detecting variations in the heart rate and the breathing rate. The variations can be correlated with the measurement of body temperature, in order to ascertain whether a rise or fall in the body temperature is due to an ailment or health issue. If the rise or fall in the body temperature is due to an ailment, the subject is likely to be warned or notified. The subject may not be notified if the rise or fall in the body temperature is transient. In an example, consider that the measured body temperature is classified as high. The embodiments include ascertaining whether the high value of the body temperature is due to an ailment such as fever or is a transient high temperature due to performance of a rigorous exercise. The embodiments may not consider transient changes in the breathing rate and the heart rate alone as a true indicator of fever, since other factors such as weather conditions, emotional changes, exercise, and so on, may also cause the heart rate to increase. The embodiments can check whether the increase in the heart rate is persistent. The embodiments can periodically measure the body temperature. The embodiments may identify the subject as having a fever and provide an alert for fever if the breathing rate and the heart rate are higher than when the subject is resting and if the increase in the heart rate and breathing rate is persistent.
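The persistence check described above can be sketched as a simple rule. The normal temperature range and resting-rate thresholds below are illustrative defaults, not values specified in the disclosure.

```python
def should_alert(temp_c, heart_rates, breathing_rates,
                 normal_range=(36.1, 37.2), resting_hr=70, resting_br=16):
    """Raise a fever alert only when an out-of-range temperature
    coincides with a persistent (every recent sample) elevation of
    heart rate and breathing rate above the resting baselines.
    Thresholds are illustrative assumptions."""
    out_of_range = not (normal_range[0] <= temp_c <= normal_range[1])
    hr_elevated = all(hr > resting_hr for hr in heart_rates)
    br_elevated = all(br > resting_br for br in breathing_rates)
    return out_of_range and hr_elevated and br_elevated
```

A transient spike (e.g. one elevated heart-rate sample after exercise) fails the persistence test and so does not trigger the alert.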
In an example, the device 100, placed in vicinity of the subject, can be configured to periodically capture videos of the subject and measure the body temperature of the subject. The device 100 can be used for health monitoring as the device 100 can extract features from the ROIs in the frames of the captured video and measure the body temperature of the subject based on the extracted features. The device 100 can provide an alert if it is determined that the body temperature is not in the normal range.
The various actions in the flowchart 200 may be performed in the order presented, in a different order, or simultaneously. Further, in some embodiments, some actions listed in FIG. 2 may be omitted.
FIGS. 3a, 3b and 3c depict classification of the body temperature based on blood pulsation at the heart beat frequency, according to embodiments as disclosed herein. For each of the parts of the facial region in each of the R, G, B, and/or IR channels of each ROI, de-noising and filtering is applied. A series of signal processing techniques is applied to obtain a correlation matrix, which depicts a correlation between the blood pulsation at the heart beat frequency and the body temperature. As depicted in FIG. 3a, for normal body temperature (body temperature measured as normal), the value of correlation is low. As depicted in FIG. 3b, for cold body temperature (body temperature measured as cold), the value of correlation is lower than that for normal body temperature. As depicted in FIG. 3c, for hot body temperature (body temperature measured as hot), the value of correlation is high around the cheek region and the eye region. The high values of correlation can indicate that the amount of blood pulsation is proportional to body temperature.
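One plausible way to compute such a correlation matrix is the pairwise Pearson correlation between the band-passed pixel signals of the facial regions; the sketch below assumes this interpretation and is not the exact computation used in the figures.

```python
import numpy as np

def pulsation_correlation_matrix(region_signals):
    """Pairwise Pearson correlation between per-region pulsation
    signals (one 1-D time series per facial region). Rows and columns
    of the returned matrix follow the input order."""
    data = np.vstack([np.asarray(s, dtype=float) for s in region_signals])
    return np.corrcoef(data)
```

Regions pulsating in phase at the heart-beat frequency show entries near 1, while uncorrelated regions stay near 0.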
FIGS. 4a and 4b depict correlation between blood perfusion of a subject and distance between the subject and the device 100, according to embodiments as disclosed herein. As depicted in FIGS. 4a and 4b, there is a variation of the blood perfusion based on the distance of the subject from the device 100. The blood pulsation correlation matrix plots for the same subject seated at different distances from the device 100 have been depicted. The accuracy of the measured body temperature can be robust to variations in distance between the subject and the device 100. The values of correlation can be measured when the distance between the subject and the device 100 is 47 cm (FIG. 4a) and 95 cm (FIG. 4b). As depicted in FIGS. 4a and 4b, the variation in the values of correlation with respect to the distance between the subject and the device 100 is minimal. Thus, the variation in the correlation values due to the variation of distance between the subject and the device 100 is negligible.
FIG. 5 is a graph depicting a variation of pixel values in different lighting conditions after application of Gaussian filtering, according to embodiments as disclosed herein. The accuracy of measured body temperature can be robust to variations in lighting conditions, in which the subject and the device 100 are present. Consider that the device 100 captures the video in different lighting conditions. The values of pixels can be different due to the variation in the lighting conditions. The difference in the values of the pixels can be neutralized by the application of Gaussian filtering on the grids of the blocks in the facial region (ROI) of each of the R, G, B, and IR channels. The superimposed plots, as depicted in FIG. 5, represent the pixel values in different lighting conditions. The difference in the pixel values is negligible.
The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in FIG. 1 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
The embodiments disclosed herein describe methods and systems for measuring body temperature of a user using at least one of an RGB and an NIR camera. Therefore, it is understood that the scope of the protection is extended to such a program and, in addition, to a computer readable means having a message therein; such computer readable storage means contain program code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The method is implemented in a preferred embodiment through or together with a software program written in, e.g., Very high speed integrated circuit Hardware Description Language (VHDL) or another programming language, or implemented by one or more VHDL modules or several software modules being executed on at least one hardware device. The hardware device can be any kind of portable device that can be programmed. The device may also include means which could be, e.g., hardware means like an ASIC, or a combination of hardware and software means, e.g., an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. The method embodiments described herein could be implemented partly in hardware and partly in software. Alternatively, the invention may be implemented on different hardware devices, e.g., using a plurality of CPUs.
The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

Claims (15)

  1. A method for measuring body temperature of a subject, the method comprising:
    performing, by at least one processor of a device, a preprocessing of a plurality of frames of a media captured by at least one camera in the device;
    extracting, by the at least one processor, a plurality of features, comprising photoplethysmography, PPG, features and statistical features, from at least one of an R channel, a G channel, a B channel, and an IR channel, in each of the plurality of frames, wherein the plurality of features are extracted from within regions of interest, ROIs, in each of the plurality of frames; and
    measuring, by the at least one processor, the body temperature through at least one machine learning, ML, based classifier and at least one ML based regression model, based on the plurality of features.
  2. The method of claim 1, wherein the ROIs comprise a localized facial region of the subject, wherein the localized facial region comprises a forehead region of the subject, a left cheek region of the subject, a right cheek region of the subject, a left chin region of the subject, and a right chin region of the subject.
  3. The method of claim 2, wherein the statistical features are extracted from at least one of R channel, G channel, B channel, and IR channel, of each of the ROIs.
  4. The method of claim 3, wherein the statistical features comprise at least one of a mean value, a median value, a skewness value, a Kurtosis index value, a 10-percentile value, a 25-percentile value, a 75-percentile value, a 40-percentile value, a 60-percentile value, a 90-percentile value, a standard deviation value, a minimum value, a maximum value, and a mode.
  5. The method of claim 1, wherein the preprocessing comprises at least one of:
    localizing a facial region of the subject; and
    de-noising the ROIs in each of the plurality of frames.
  6. The method of claim 1, wherein the statistical features are extracted by:
    splitting imaging data in the at least one of R channel, G channel, B channel, and IR channel, of each of the ROIs into a plurality of blocks, wherein each of the blocks is split into a plurality of grids of predefined size;
    applying a Gaussian filter of predefined size for smoothening each of the grids in each of the blocks; and
    extracting the statistical features comprising at least one of a mean value, a median value, a skewness value, a Kurtosis index value, a standard deviation value, a minimum value, a maximum value, and a mode from each of the grids.
  7. The method of claim 6, wherein the PPG features are extracted from frequency domain representation of the grids, wherein the extracted PPG features are relevant to a frequency of heart beat of the subject and breathing frequency of the subject.
  8. The method of claim 6, wherein the PPG features are extracted from a time domain representation of the grids, wherein the extracted imaging PPG features comprise at least one of a curvature index of the time domain representation of the grids, a mean of upslope deviation of curvature points, and a mean of downslope deviation of curvature points.
  9. The method of claim 1, wherein the method further includes:
    monitoring the heart rate and the breathing rate to determine a pattern of variation in the heart rate and the breathing rate;
    correlating the pattern of variation in the heart rate and the breathing rate with a measured body temperature, wherein the measured body temperature is not within a normal range of body temperature; and
    notifying that the measured body temperature is not within a normal range of body temperature based on the correlation between the measured body temperature and the pattern of variation in the heart rate and the breathing rate.
  10. The method of claim 1, wherein the method further comprises transmitting the measured body temperature to a remote device.
  11. The method of claim 1, wherein the camera is at least one of a Red Green Blue (RGB) camera and a Near Infrared (NIR) camera.
  12. A device (100) for measuring body temperature of a subject, the device (100) comprising:
    a memory; and
    at least one processor coupled to the memory,
    wherein the memory stores instructions that, when executed by the at least one processor, cause the at least one processor to:
    perform a preprocessing of a plurality of frames of a media captured by at least one camera;
    extract a plurality of features, comprising imaging photoplethysmography, PPG, features and statistical features, from at least one of an R channel, a G channel, a B channel, and an IR channel, in each of the plurality of frames, wherein the plurality of features are extracted from within regions of interest, ROIs, in each of the plurality of frames; and
    measure the body temperature through at least one machine learning, ML, based classifier and at least one ML based regression model, based on the plurality of features.
  13. The device of claim 12, wherein the at least one processor is further configured to perform any one of the methods of claims 2 to 11.
  14. A method for estimating body temperature of a subject, the method comprising:
    capturing, by at least one processor of a device, a media of the subject using at least one of a red green blue, RGB, camera and a near infrared, NIR, camera of the device;
    estimating, by the at least one processor, a heart rate of the subject based on a frequency domain analysis of a plurality of frames of the captured media;
    extracting, by the at least one processor, a plurality of features captured by at least one of the RGB camera and the NIR camera from a facial region of the subject in the captured media; and
    estimating, by the at least one processor, the body temperature of the subject by analysing the plurality of features and the estimated heart rate of the subject.
  15. A device for estimating body temperature of a subject, the device comprising:
    a memory; and
    at least one processor coupled to the memory,
    wherein the memory stores instructions that, when executed by the at least one processor, cause the at least one processor to:
    capture a media of the subject using at least one of a red green blue, RGB, camera and a near infrared, NIR, camera of the device;
    estimate a heart rate of the subject based on a frequency domain analysis of a plurality of frames of the captured media;
    extract a plurality of features captured by at least one of the RGB camera and the NIR camera from a facial region of the subject in the captured media; and
    estimate the body temperature of the subject by analysis of the plurality of features and the estimated heart rate of the subject.
PCT/KR2020/002334 2019-02-19 2020-02-18 Method and apparatus for measuring body temperature using a camera WO2020171554A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201941006430 2019-02-19
IN201941006430 2019-10-17

Publications (1)

Publication Number Publication Date
WO2020171554A1 true WO2020171554A1 (en) 2020-08-27

Family

ID=72145162

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/002334 WO2020171554A1 (en) 2019-02-19 2020-02-18 Method and apparatus for measuring body temperature using a camera

Country Status (1)

Country Link
WO (1) WO2020171554A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220087536A1 (en) * 2020-09-21 2022-03-24 Radiant Innovation Inc. Temperature measurement method and measurement device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030065256A1 (en) * 2001-10-01 2003-04-03 Gilles Rubinstenn Image capture method
US20140243683A1 (en) * 2006-09-25 2014-08-28 Song Xiao System and method for health evaluation
KR20160015785A (en) * 2014-07-31 2016-02-15 삼성전자주식회사 Apparatus and method for improving accuracy of contactless thermometer module
US9282896B2 (en) * 2014-07-04 2016-03-15 Arc Devices Limited Thermometer having a digital infrared sensor
WO2017084428A1 (en) * 2015-11-17 2017-05-26 努比亚技术有限公司 Information processing method, electronic device and computer storage medium


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20758721; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20758721; Country of ref document: EP; Kind code of ref document: A1)