CN107403407A - Breathing tracking method based on thermal imaging - Google Patents
- Publication number: CN107403407A (application CN201710660817.5, China)
- Legal status: Withdrawn (the legal status is an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T5/70: Image enhancement or restoration; denoising; smoothing
- G06T3/04: Geometric image transformations in the plane of the image; context-preserving transformations, e.g. by using an importance map
- G06T3/08: Geometric image transformations; projecting images onto non-planar surfaces, e.g. geodetic screens
- G06V10/40: Image or video recognition or understanding; extraction of image or video features
- G06T2207/30004: Indexing scheme for image analysis or image enhancement; biomedical image processing
Abstract
The present invention proposes a breathing tracking method based on thermal imaging. Its main components are: optimal quantization for high thermal dynamic range scenes, nostril region tracking, and respiration estimation by thermal voxel integration. The process is as follows: first, the heat distribution sequence is optimally quantized by searching for a thermal range of interest that contains the whole facial temperature distribution of each frame; by comparison with normal thermal images under combined artifacts, different nostril shapes are obtained from thermal-gradient-based images, and the transformed image set is used to collect feature points representing the motion trajectory of the nostrils; finally, features are extracted by computing the mean of the heat distribution within the bounding box, and respiration is estimated by thermal voxel integration. The proposed thermal imaging system is unaffected by lighting conditions, improving the stability and accuracy of nostril region tracking; meanwhile, thermal imaging is low in cost, opening new possibilities for smartphone-based respiratory rate monitoring devices.
Description
Technical Field
The invention relates to the field of respiratory tracking, in particular to a respiratory tracking method based on a thermal imaging technology.
Background
With the improvement of living standards, people's health consciousness and health care requirements are continuously growing, and more attention is paid to early detection and early intervention of disease: the health of the human body is monitored in a real-time, dynamic way so that illness can be prevented at an early stage. Respiratory monitoring plays an increasingly important role in the medical, health care and fitness fields. It can be applied to diagnosing and treating lung-related diseases, to neonatal care and sleep research, and to detecting breathing during exercise and sleep in wearable devices, which analyze a user's health by recording motion and sleep data. Research shows that thermal imaging technology can detect infrared energy in a non-contact way, convert it into an electrical signal, and generate a thermal image and temperature values on a display. Its advantages are that it is non-invasive, can accurately track thermal targets at a distance, and enables all-weather monitoring, so it can be applied to monitoring human respiration. Conventional methods generally require wearing a respiratory belt or a nasal probe to track and measure the respiratory rate, but these are often affected by motion artifacts and respiratory-dynamics noise, resulting in erroneous sensor readings; using thermal imaging for respiratory monitoring therefore promises better performance.
The invention provides a respiration tracking method based on thermal imaging technology: first, a thermal range of interest containing the whole facial temperature distribution of each frame is searched and the heat distribution sequence is optimally quantized; by comparison with normal thermal images under combined artifacts, different nostril morphologies are obtained from thermal-gradient-based images, and the transformed images are used to collect feature points representing the motion trajectories of the nostrils; finally, features are extracted by computing the mean of the heat distribution in the bounding box, and respiration is estimated by thermal voxel integration. The thermal imaging system provided by the invention is not influenced by illumination conditions, improving the accuracy and stability of nostril region tracking; meanwhile, thermal imaging technology is low in cost and opens up new possibilities for developing smartphone-based respiratory rate monitoring equipment.
Disclosure of Invention
The invention aims to provide a respiratory tracking method based on thermal imaging technology that addresses the problem of motion artifacts and respiratory-dynamics noise: first, a thermal range of interest containing the whole facial temperature distribution of each frame is searched and the heat distribution sequence is optimally quantized; different nostril morphologies are obtained from thermal-gradient-based images by comparison with normal thermal images under combined artifacts, and the transformed images are used to collect feature points representing the motion trajectories of the nostrils; finally, features are extracted by computing the mean of the heat distribution in the bounding box, and respiration is estimated by thermal voxel integration.
In order to solve the above problems, the present invention provides a respiration tracking method based on thermal imaging technology, which mainly comprises:
(I) optimal quantization for high thermal dynamic range scenes;
(II) nasal region tracking;
(III) respiration estimation by thermal voxel integration.
The optimal quantization for high thermal dynamic range scenes is characterized in that, in thermal image processing, quantization denotes the conversion between continuous temperature values and their equivalent digital color mapping. The transformation scaling the original temperature range $[T_0, T_{k-1}]$ (i.e., the temperature range of interest) to the color scale $[u_0, u_{k-1}]$ is defined as a mapping $u = Q(T)$; only linear relationships are considered here. For quantization in a time-varying thermal dynamic range scene, the heat distribution sequence is optimally quantized by searching for a thermal range of interest that encompasses the entire facial temperature distribution of each frame.
Further, in quantizing the heat distribution sequence, statistical extremes are first removed to reduce accidental noise (e.g., sunlight reflected on a person's glasses) and extreme temperature points (e.g., fogged lenses) caused by temperature calculation errors of a moving thermal camera. To accomplish this, the initial candidate thermal range $[T'_{min}, T'_{max}]$ is set by removing thermal signals beyond 1.96 standard deviations, which corresponds to a 95% confidence interval, as shown in the following equation:

$$T'_{min} = \bar{c} - 1.96\frac{\sigma_c}{\sqrt{n \cdot m}}, \qquad T'_{max} = \bar{c} + 1.96\frac{\sigma_c}{\sqrt{n \cdot m}} \qquad (1)$$

where $\bar{c}$ is the sample mean of $c(x)$, the one-dimensional temperature distribution; $n \times m$ is the spatial resolution of the collected heat distribution matrix; and $\sigma_c$ is the standard deviation of $c(x)$.
Further, optimal threshold selection finds the threshold separating the subject from the background by iteratively analyzing the color histogram; this helps search the time-varying temperature histogram, which distinguishes the thermal values of a person's skin from non-skin areas and shows various dynamic ranges. Optimal quantization is accomplished by iteratively computing the optimal threshold $T_{opt}$:

$$T_{opt}(0) = T'_{min} \qquad (2)$$

$$T_{opt}(t+1) = \frac{\mu_1(t) + \mu_2(t)}{2} \qquad (3)$$

where $\mu_1(t)$ and $\mu_2(t)$ are the mean temperatures of the pixels with $c(x) \le T_{opt}(t)$ and $c(x) > T_{opt}(t)$, respectively. This process is repeated until $T_{opt}(p) - T_{opt}(p-1) \approx 0$. The final selected temperature range of interest is:

$$T_0 = T_{opt}(p), \qquad T_{k-1} = T'_{max} \qquad (4)$$

The lower bound is optimal only under the premise that the average temperature of the background, including hair and air, is lower than the average temperature of human skin.
For nasal region tracking, given a color-mapped image produced by the optimal quantization method, a thermal gradient magnitude matrix is computed from each frame and supplied to an adapted median flow algorithm, a tracker that uses forward-backward error estimation of the tracked points. The loss of feature points is compensated by gradient-based two-dimensional normalized correlation.
Further, regarding the thermal gradient magnitude map: because of human thermal metabolism and the relatively low thermal conductivity of human skin, the heat distributions of adjacent skin areas are very similar (e.g., the thermal map of the nasal region), and the shape of the nostrils and the nostril region tends to blur, resulting in weak differences between key surface point features. To obtain clearer features, the boundary between the nostrils can be enhanced by converting the quantized thermal image $u$ into a thermal gradient magnitude map $\Phi$:

$$\Phi(x,y) = \sqrt{\left(\frac{\partial u(x,y)}{\partial x}\right)^2 + \left(\frac{\partial u(x,y)}{\partial y}\right)^2} \qquad (5)$$
by comparing with normal thermal images under combined artifacts (i.e. motion and respiratory dynamics), different morphologies of the nostrils can be obtained from thermal gradient based images; the transformed images may then be used to collect feature points representing motion trajectories for the nostrils.
Further, tracking is performed by selecting a nostril in the first frame as a region of interest (ROI) of size $N \times N$, yielding $\Phi_{ROI}$; this can be done manually or automatically. The tracker implements a median flow algorithm, which defines the forward-backward error as:

$$e(T_f^k \mid S) = \|x_t - \hat{x}_t\| \qquad (6)$$

where, over the thermal gradient image sequence $S = (\Phi_t, \Phi_{t+1}, \ldots, \Phi_{t+k})$ in time $t$, the forward trajectory $T_f^k = (x_t, x_{t+1}, \ldots, x_{t+k})$ is generated by tracking forward and the backward trajectory $T_b^k = (\hat{x}_t, \hat{x}_{t+1}, \ldots, \hat{x}_{t+k})$ by tracking back to the first frame, and $\|\cdot\|$ is the Euclidean distance between two points. In this algorithm, points are tracked based on the displacement update given by:

$$h_{k+1} = h_k + \frac{\sum_x w(x)\,F(x+h_k)\left[G(x) - F(x+h_k)\right]}{\sum_x w(x)\,F(x+h_k)^2} \qquad (7)$$

Similar to the concept of back-propagation in artificial neural networks, point features are used to compute errors backwards and predict point locations by observing past trajectories.
Further, the two-dimensional normalized correlation enhances tracking performance for the nostrils, which are highly deformable on the thermal surface, by searching for the new ROI location that maximizes the gradient-based normalized correlation, where $x$ is at the center of the $N \times N$ window. When the number of tracked points falls below a threshold, the ROI is reset and new gradient-based point features are found. Given a large set of nostril images, this method can also be applied to automatic ROI selection in the first frame.
For respiration estimation by thermal voxel integration: heat exchange at the nostrils during inhalation and exhalation is indicative of a person's breathing pattern, and can be characterized by computing the mean of the heat distribution in the bounding box (i.e., the nostril ROI). Projecting the two-dimensional heat matrix into three-dimensional space, each unit heat element can be regarded as a voxel. The breathing signal is then obtained by integrating $\Lambda_t(T)$, the thermal voxel integral of the nostril cross-section at temperature $T$, over the absolute temperature distribution of the tracked area up to $\bar{T}_t$, the upper bound of the integrated concave volume; this bound is set to a moving average ($n = 2$) of the mean temperature values, which both stabilizes the boundary and accounts for global thermal variation. However, once the ROI tracker is erroneously placed in a different region at some time, the moving average must be reset from the next frame, removing the values from the misplaced bounding box. Such misalignment can be detected by examining a sudden change (i.e., the differential) of the statistical skewness of the heat distribution.
Further, for respiration rate estimation over the breathing cycle, both frequency-domain and time-domain methods can be used; here, a short-time power spectral density is implemented to analyze the self-similarity of the voxel feature. This is a Fourier transform $F_f$ based on a short-time autocorrelation function. To reduce ripple in the frequency domain caused by the truncated short time window, a Gaussian window $w_i(k)$ is used, whose length is determined by the respiration rate of interest and the sampling frequency $f_s$. $w_i(k)$ is normalized by feature scaling and filtered by a third-order elliptic filter (here with 3 dB passband ripple and 6 dB stopband attenuation) with passband cut-off frequencies of 0.1 Hz and 0.85 Hz. Finally, the breathing rate is estimated by finding the frequency that maximizes the power spectral density:

$$S_V(f) = F_f(R_{ww}) = \sum_k R_{ww}(k)\,e^{-j2\pi fk} \qquad (12)$$

where $R_{ww}$ is the short-time autocorrelation of the filtered $w_i(k)$.
Drawings
Fig. 1 is a system flow chart of a respiratory tracking method based on thermal imaging technology.
Fig. 2 illustrates nasal region tracking in the respiration tracking method based on thermal imaging technology.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application can be combined with each other without conflict, and the present invention is further described in detail with reference to the drawings and specific embodiments.
Fig. 1 is a system flow chart of a respiratory tracking method based on thermal imaging technology. The method mainly comprises optimal quantization for high thermal dynamic range scenes, nasal region tracking, and respiration estimation by thermal voxel integration.
The method begins with optimal quantization of a high thermal dynamic range scene. In thermal image processing, quantization denotes the conversion between continuous temperature values and their equivalent digital color mapping. The transformation scaling the original temperature range $[T_0, T_{k-1}]$ (i.e., the temperature range of interest) to the color scale $[u_0, u_{k-1}]$ is defined as a mapping $u = Q(T)$; only linear relationships are considered here. For quantization in a time-varying thermal dynamic range scene, the heat distribution sequence is optimally quantized by searching for a thermal range of interest that encompasses the entire facial temperature distribution of each frame.
Statistical extremes are first removed to reduce accidental noise (e.g., sunlight reflected on a person's glasses) and extreme temperature points (e.g., fogged lenses) caused by temperature calculation errors of a moving thermal camera. To accomplish this, the initial candidate thermal range $[T'_{min}, T'_{max}]$ is set by removing thermal signals beyond 1.96 standard deviations, which corresponds to a 95% confidence interval, as shown in the following equation:

$$T'_{min} = \bar{c} - 1.96\frac{\sigma_c}{\sqrt{n \cdot m}}, \qquad T'_{max} = \bar{c} + 1.96\frac{\sigma_c}{\sqrt{n \cdot m}} \qquad (1)$$

where $\bar{c}$ is the sample mean of $c(x)$, the one-dimensional temperature distribution; $n \times m$ is the spatial resolution of the collected heat distribution matrix; and $\sigma_c$ is the standard deviation of $c(x)$.
The optimal threshold separating the subject from the background is found by iteratively analyzing the color histogram; this helps search the time-varying temperature histogram, which distinguishes the thermal values of a person's skin from non-skin areas and shows various dynamic ranges. The optimal quantization is accomplished by iteratively computing the optimal threshold $T_{opt}$:

$$T_{opt}(0) = T'_{min} \qquad (2)$$

$$T_{opt}(t+1) = \frac{\mu_1(t) + \mu_2(t)}{2} \qquad (3)$$

where $\mu_1(t)$ and $\mu_2(t)$ are the mean temperatures of the pixels with $c(x) \le T_{opt}(t)$ and $c(x) > T_{opt}(t)$, respectively. This process is repeated until $T_{opt}(p) - T_{opt}(p-1) \approx 0$. The final selected temperature range of interest is:

$$T_0 = T_{opt}(p), \qquad T_{k-1} = T'_{max} \qquad (4)$$

The lower bound is optimal only under the premise that the average temperature of the background, including hair and air, is lower than the average temperature of human skin.
Respiration estimation is then performed by thermal voxel integration: heat exchange at the nostrils during inhalation and exhalation is indicative of a person's breathing pattern, and can be characterized by computing the mean of the heat distribution in the bounding box (i.e., the nostril ROI). Projecting the two-dimensional heat matrix into three-dimensional space, each unit heat element can be regarded as a voxel. The breathing signal is obtained by integrating $\Lambda_t(T)$, the thermal voxel integral of the nostril cross-section at temperature $T$, over the absolute temperature distribution of the tracked area up to $\bar{T}_t$, the upper bound of the integrated concave volume; this bound is set to a moving average ($n = 2$) of the mean temperature values, which both stabilizes the boundary and accounts for global thermal variation. However, once the ROI tracker is erroneously placed in a different region at some time, the moving average must be reset from the next frame, removing the values from the misplaced bounding box. Such misalignment can be detected by examining a sudden change (i.e., the differential) of the statistical skewness of the heat distribution.
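A minimal sketch of this step follows. It assumes a per-frame "concave volume" computed as the summed deficit of each ROI pixel below the moving-average upper bound, and returns the skewness differential for misalignment detection; the exact integral form and the reset policy are not fully spelled out in the text, so this is one illustrative reading:

```python
import numpy as np
from scipy.stats import skew

def voxel_breathing_signal(roi_frames, n_avg=2):
    """Sketch of respiration estimation by thermal voxel integration.

    Each ROI frame is integrated below a moving-average upper bound of
    the mean temperature; a jump in skewness flags tracker misalignment.
    """
    means = np.array([f.mean() for f in roi_frames])
    signal, skews = [], []
    for t, frame in enumerate(roi_frames):
        # Upper bound: moving average (n = n_avg) of mean ROI temperature
        upper = means[max(0, t - n_avg + 1): t + 1].mean()
        # Integrate thermal voxels of the concave volume below the bound
        deficit = np.clip(upper - frame, 0, None)
        signal.append(deficit.sum())
        skews.append(skew(frame.ravel()))
    return np.array(signal), np.abs(np.diff(skews))
```

In use, a large value in the returned skewness differential would trigger the moving-average reset described above before the signal is passed on to rate estimation.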
For the estimation of the breathing cycle, both frequency-domain and time-domain methods can be used; here, a short-time power spectral density is implemented to analyze the self-similarity of the voxel feature. This is a Fourier transform $F_f$ based on a short-time autocorrelation function. To reduce ripple in the frequency domain caused by the truncated short time window, a Gaussian window $w_i(k)$ is used, whose length is determined by the respiration rate of interest and the sampling frequency $f_s$. $w_i(k)$ is normalized by feature scaling and filtered by a third-order elliptic filter (here with 3 dB passband ripple and 6 dB stopband attenuation) with passband cut-off frequencies of 0.1 Hz and 0.85 Hz. Finally, the breathing rate is estimated by finding the frequency that maximizes the power spectral density:

$$S_V(f) = F_f(R_{ww}) = \sum_k R_{ww}(k)\,e^{-j2\pi fk} \qquad (8)$$

where $R_{ww}$ is the short-time autocorrelation of the filtered $w_i(k)$.
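The frequency-domain estimate above can be sketched as below. The elliptic-filter parameters (order 3, 3 dB passband ripple, 6 dB stopband attenuation, 0.1 to 0.85 Hz passband) follow the text, while the Gaussian window width and the in-band peak search are illustrative assumptions:

```python
import numpy as np
from scipy.signal import ellip, filtfilt, windows

def estimate_breathing_rate(v, fs, band=(0.1, 0.85)):
    """Sketch of the respiration-rate estimate: feature-scale the voxel
    signal, band-limit it with a 3rd-order elliptic filter, apply a
    Gaussian window, and take the peak of the PSD of its autocorrelation.
    """
    v = (v - v.min()) / (v.max() - v.min())           # feature scaling
    b, a = ellip(3, 3, 6, [2 * f / fs for f in band], btype="bandpass")
    v = filtfilt(b, a, v)                             # elliptic band-pass
    v = v * windows.gaussian(len(v), std=len(v) / 6)  # reduce spectral ripple
    r = np.correlate(v, v, mode="full")[len(v) - 1:]  # short-time autocorrelation
    psd = np.abs(np.fft.rfft(r))                      # S_V(f) = F_f(R_ww), Eq. (8)
    freqs = np.fft.rfftfreq(len(r), d=1 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[in_band][np.argmax(psd[in_band])]    # breaths per second
```

The returned value is in Hz; multiplying by 60 gives breaths per minute, and the 0.1 to 0.85 Hz band corresponds to roughly 6 to 51 breaths per minute.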
Fig. 2 illustrates nasal region tracking in the respiration tracking method based on thermal imaging technology. Given a color-mapped image produced by the optimal quantization method, a thermal gradient magnitude matrix is computed from each frame and supplied to an adapted median flow algorithm, a tracker that uses forward-backward error estimation of the tracked points. The loss of feature points is compensated by gradient-based two-dimensional normalized correlation.
Because of human thermal metabolism and the relatively low thermal conductivity of human skin, the heat distributions of adjacent skin areas are very similar (e.g., the thermal map of the nasal region), and the shape of the nostrils and the nostril region tends to blur, resulting in weak differences between key surface point features. To obtain clearer features, the boundary between the nostrils can be enhanced by converting the quantized thermal image $u$ into a thermal gradient magnitude map $\Phi$:

$$\Phi(x,y) = \sqrt{\left(\frac{\partial u(x,y)}{\partial x}\right)^2 + \left(\frac{\partial u(x,y)}{\partial y}\right)^2}$$
by comparing with normal thermal images under combined artifacts (i.e. motion and respiratory dynamics), different morphologies of the nostrils can be obtained from thermal gradient based images; the transformed images may then be used to collect feature points representing motion trajectories for the nostrils.
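The gradient magnitude conversion above can be sketched with finite differences; `np.gradient` is used here as an assumed discretization of the partial derivatives:

```python
import numpy as np

def gradient_magnitude(u):
    """Sketch of the thermal gradient magnitude map: convert a quantized
    thermal image u into Phi(x, y) to sharpen the nostril boundary."""
    gy, gx = np.gradient(u.astype(float))   # partial derivatives du/dy, du/dx
    return np.sqrt(gx ** 2 + gy ** 2)       # Phi(x, y)
```

Flat skin regions map to near-zero magnitude, while the nostril rim, where quantized values change sharply, stands out as a high-magnitude contour suitable for feature-point selection.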
A nostril is selected in the first frame as a region of interest (ROI) of size $N \times N$, yielding $\Phi_{ROI}$; this can be done manually or automatically. The tracker implements a median flow algorithm, which defines the forward-backward error as:

$$e(T_f^k \mid S) = \|x_t - \hat{x}_t\|$$

where, over the thermal gradient image sequence $S = (\Phi_t, \Phi_{t+1}, \ldots, \Phi_{t+k})$ in time $t$, the forward trajectory $T_f^k = (x_t, x_{t+1}, \ldots, x_{t+k})$ is generated by tracking forward and the backward trajectory $T_b^k = (\hat{x}_t, \hat{x}_{t+1}, \ldots, \hat{x}_{t+k})$ by tracking back to the first frame, and $\|\cdot\|$ is the Euclidean distance between two points. In this algorithm, points are tracked based on the displacement update given by:

$$h_{k+1} = h_k + \frac{\sum_x w(x)\,F(x+h_k)\left[G(x) - F(x+h_k)\right]}{\sum_x w(x)\,F(x+h_k)^2}$$

Similar to the concept of back-propagation in artificial neural networks, point features are used to compute errors backwards and predict point locations by observing past trajectories.
The tracking performance for the nostrils, which are highly deformable on the thermal surface, is enhanced by searching for the new ROI location that maximizes the gradient-based normalized correlation, where $x$ is at the center of the $N \times N$ window. When the number of tracked points falls below a threshold, the ROI is reset and new gradient-based point features are found. Given a large set of nostril images, this method can also be applied to automatic ROI selection in the first frame.
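The ROI re-localization can be sketched as a brute-force correlation scan over the gradient map; the patent's exact normalization is not reproduced in the text, so zero-mean normalized cross-correlation is an assumption here:

```python
import numpy as np

def relocate_roi(phi, template):
    """Sketch of the 2-D normalized correlation search: slide the
    gradient-based ROI template over the gradient map phi and return
    the top-left corner maximizing the zero-mean normalized correlation."""
    n = template.shape[0]
    t = template - template.mean()
    best, best_pos = -np.inf, (0, 0)
    for i in range(phi.shape[0] - n + 1):
        for j in range(phi.shape[1] - n + 1):
            w = phi[i:i + n, j:j + n]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum() * (t ** 2).sum())
            score = (w * t).sum() / denom if denom > 0 else -np.inf
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos, best
```

A brute-force scan is used for clarity; in practice the search would be restricted to a neighborhood of the last known ROI position, or computed with an FFT-based correlation for speed.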
It will be appreciated by persons skilled in the art that the invention is not limited to details of the foregoing embodiments and that the invention can be embodied in other specific forms without departing from the spirit or scope of the invention. In addition, various modifications and alterations of this invention may be made by those skilled in the art without departing from the spirit and scope of this invention, and such modifications and alterations should also be viewed as being within the scope of this invention. It is therefore intended that the following appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
Claims (10)
1. A respiration tracking method based on thermal imaging technology, characterized by mainly comprising: (I) optimal quantization for high thermal dynamic range scenes; (II) nasal region tracking; (III) respiration estimation by thermal voxel integration.
2. The optimal quantization for high thermal dynamic range scenes (I) according to claim 1, characterized in that, in thermal image processing, quantization denotes the conversion between continuous temperature values and their equivalent digital color mapping; the transformation scaling the original temperature range $[T_0, T_{k-1}]$ (i.e., the temperature range of interest) to the color scale $[u_0, u_{k-1}]$ is defined as a mapping $u = Q(T)$; only linear relationships are considered here; for quantization in a time-varying thermal dynamic range scene, the heat distribution sequence is optimally quantized by searching for a thermal range of interest that encompasses the entire facial temperature distribution of each frame.
3. The quantization of the heat distribution sequence according to claim 2, characterized in that, in a first step, statistical extremes are removed to reduce accidental noise (e.g., sunlight reflected on a person's glasses) and extreme temperature points (e.g., fogged lenses) caused by temperature calculation errors of a moving thermal camera; to accomplish this, the initial candidate thermal range $[T'_{min}, T'_{max}]$ is set by removing thermal signals beyond 1.96 standard deviations, which corresponds to a 95% confidence interval, as shown in the following equation:

$$T'_{min} = \bar{c} - 1.96\frac{\sigma_c}{\sqrt{n \cdot m}}, \qquad T'_{max} = \bar{c} + 1.96\frac{\sigma_c}{\sqrt{n \cdot m}} \qquad (1)$$

where $\bar{c}$ is the sample mean of $c(x)$, the one-dimensional temperature distribution; $n \times m$ is the spatial resolution of the collected heat distribution matrix; and $\sigma_c$ is the standard deviation of $c(x)$.
4. The optimal threshold selection according to claim 2, characterized in that the optimal threshold separating the subject from the background is found by iteratively analyzing the color histogram; this helps search the time-varying temperature histogram, which distinguishes the thermal values of a person's skin from non-skin areas and shows various dynamic ranges; the optimal quantization is accomplished by iteratively computing the optimal threshold $T_{opt}$:

$$T_{opt}(0) = T'_{min} \qquad (2)$$

$$T_{opt}(t+1) = \frac{\mu_1(t) + \mu_2(t)}{2} \qquad (3)$$

where $\mu_1(t)$ and $\mu_2(t)$ are the mean temperatures of the pixels with $c(x) \le T_{opt}(t)$ and $c(x) > T_{opt}(t)$, respectively; this process is repeated until $T_{opt}(p) - T_{opt}(p-1) \approx 0$; the final selected temperature range of interest is:

$$T_0 = T_{opt}(p), \qquad T_{k-1} = T'_{max} \qquad (4)$$

The lower bound is optimal only under the premise that the average temperature of the background, including hair and air, is lower than the average temperature of human skin.
5. The nasal region tracking (II) according to claim 1, characterized in that, given a color-mapped image produced by the optimal quantization method, a thermal gradient magnitude matrix is computed from each frame and supplied to an adapted median flow algorithm, a tracker that uses forward-backward error estimation of the tracked points; the loss of feature points is compensated by gradient-based two-dimensional normalized correlation.
6. The thermal gradient magnitude map according to claim 5, characterized in that, because of human thermal metabolism and the relatively low thermal conductivity of human skin, the heat distributions of adjacent skin areas are very similar (e.g., the thermal map of the nasal region), and the shape of the nostrils and the nostril region tends to blur, resulting in weak differences between key surface point features; to obtain clearer features, the boundary between the nostrils can be enhanced by converting the quantized thermal image $u$ into a thermal gradient magnitude map $\Phi$:

$$\Phi(x,y) = \sqrt{\left(\frac{\partial u(x,y)}{\partial x}\right)^2 + \left(\frac{\partial u(x,y)}{\partial y}\right)^2} \qquad (5)$$

By comparing with normal thermal images under combined artifacts (i.e., motion and respiratory dynamics), different nostril morphologies can be obtained from thermal-gradient-based images; the transformed images can then be used to collect feature points representing the motion trajectories of the nostrils.
7. Tracking according to claim 6, characterized in that a nostril region of size N × N is selected from the first frame of Φ as the region of interest (ROI); this may be done by manual selection or automatically; the tracker implements the median flow algorithm, whose forward-backward error is defined as:
$$e(T_f^k \mid S)=\lVert x_t-\hat{x}_t\rVert\qquad(6)$$
where S = (Φ_t, Φ_{t+1}, …, Φ_{t+k}) is the thermal-gradient image sequence starting at time t, the forward trajectory is T_f^k = (x_t, x_{t+1}, …, x_{t+k}), and the backward trajectory (x̂_{t+k}, …, x̂_t) is generated by tracking back to the first frame, ‖·‖ being the Euclidean distance between two points; in this algorithm, points are tracked based on the displacement update given by:
$$h_{k+1}=h_k+\frac{\sum_x w(x)\,F(x+h_k)\left[G(x)-F(x+h_k)\right]}{\sum_x w(x)\,F(x+h_k)^2}\qquad(7)$$
Similar in spirit to back-propagation in artificial neural networks, the point features are used to compute the error and to predict point locations backwards by observing past trajectories.
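A toy sketch of the forward-backward consistency check of Eq. (6): a point is tracked forward through a frame sequence and then backward, and the distance between the start point and the returned point is the error. For brevity the tracker here is an exhaustive template search rather than the iterative update of Eq. (7); all names and the synthetic frames are hypothetical:

```python
import numpy as np

def track_point(src, dst, x, win=3):
    """Track the pixel column x from frame `src` to frame `dst`
    by exhaustive search of a small template (toy stand-in for
    the median-flow point tracker)."""
    h, w = src.shape
    patch = src[:, max(x - win, 0):x + win + 1]
    best, best_err = x, np.inf
    for cx in range(win, w - win):
        cand = dst[:, cx - win:cx + win + 1]
        if cand.shape != patch.shape:
            continue
        err = np.sum((cand - patch) ** 2)
        if err < best_err:
            best, best_err = cx, err
    return best

def forward_backward_error(frames, x0):
    """Eq. (6): track x0 forward through `frames`, then backward,
    and return the distance between start and return point."""
    x = x0
    for a, b in zip(frames, frames[1:]):              # forward pass
        x = track_point(a, b, x)
    rev = frames[::-1]
    for a, b in zip(rev, rev[1:]):                    # backward pass
        x = track_point(a, b, x)
    return abs(x - x0)
```

On a sequence of pure horizontal translations, a reliable point returns exactly to its starting position, giving a forward-backward error of zero; points with large error would be rejected by the median-flow filtering step.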
8. The two-dimensional normalized correlation of claim 7, wherein the tracking performance on a nostril with a highly deformable thermal appearance is enhanced by searching for the new ROI location at the maximum of the gradient-based normalized correlation, expressed as:
$$\gamma(x)=\frac{\sum_i\left(\Phi_{ROI}(x_i)-\mu_{\Phi_{ROI}(x_i)}\right)\left(\Phi(x_i-x)-\mu_{\Phi(x_i-x)}\right)}{\sqrt{\sum_i\left(\Phi_{ROI}(x_i)-\mu_{\Phi_{ROI}(x_i)}\right)^2\sum_i\left(\Phi(x_i-x)-\mu_{\Phi(x_i-x)}\right)^2}}\qquad(8)$$
where x is the centre of the N × N window; when the number of tracked points falls below a given threshold, the ROI is reset and new gradient-based point features are found; the same method may also be applied to automatic ROI selection in the first frame when a large set of nostril images is available.
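A minimal sketch of the ROI re-localization of Eq. (8): the zero-mean normalized correlation between the ROI template and every candidate patch of Φ is evaluated, and the maximizing centre is taken as the new ROI location. Function names are hypothetical:

```python
import numpy as np

def normalized_correlation(roi, phi, cy, cx):
    """Eq. (8): zero-mean normalized correlation between the ROI
    template and the patch of phi centred at (cy, cx)."""
    half = roi.shape[0] // 2
    patch = phi[cy - half:cy + half + 1, cx - half:cx + half + 1]
    a = roi - roi.mean()
    b = patch - patch.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def relocate_roi(roi, phi):
    """Scan phi and return the centre with maximal correlation."""
    half = roi.shape[0] // 2
    best, best_g = (half, half), -np.inf
    for cy in range(half, phi.shape[0] - half):
        for cx in range(half, phi.shape[1] - half):
            g = normalized_correlation(roi, phi, cy, cx)
            if g > best_g:
                best, best_g = (cy, cx), g
    return best, best_g
```

Cutting a template out of a gradient image and searching for it recovers its own centre with correlation 1, which is the degenerate sanity case of the search.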
9. Respiration estimation by thermal voxel integration (part three) according to claim 1, characterized in that the heat exchange at the nostrils during inhalation and exhalation indicates the person's breathing pattern; this is characterized by computing the mean of the heat distribution within the bounding box (i.e. the nostril ROI); the two-dimensional heat matrix is projected into three-dimensional space, where each unit heat element is regarded as a voxel; the thermal volume is then:
$$\hat{v}(t)=\int_{T_{\min}(t)}^{T_\delta(t)}\Lambda_t(T)\,dT\approx\sum_i\sum_j\left[T_\delta(t)-\hat{u}_{ij}(t)\right],\quad T_\delta(t)-\hat{u}_{ij}(t)>0\qquad(9)$$
where Λ_t(T) is the cross-sectional area of the nostril thermal voxels at temperature T, û_{ij}(t) is the absolute temperature distribution over the tracked region, and T_δ(t) is the upper boundary of the integrated concave volume, set to a moving average (n = 2) of the mean temperature values, which not only yields a stable boundary but also accounts for global thermal variation; however, if the ROI tracker is erroneously placed in a different region at some time, the moving average must be reset from the next frame, discarding the value from the misplaced bounding box; the misplacement can be detected by examining sudden changes (i.e. the differential) of the statistical skewness of the heat distribution:
$$\Delta\gamma_1^t=E\!\left[\left(\frac{\hat{u}_{ij}(t)-\mu_t}{\sigma_t}\right)^3\right]-E\!\left[\left(\frac{\hat{u}_{ij}(t-1)-\mu_{t-1}}{\sigma_{t-1}}\right)^3\right]\qquad(10)$$
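The thermal-voxel volume of Eq. (9) and the skewness differential of Eq. (10) reduce to a few lines of NumPy; this is an illustrative sketch with hypothetical names, assuming û is given as a 2-D array of ROI temperatures:

```python
import numpy as np

def thermal_volume(u, t_delta):
    """Eq. (9): sum of the positive voxel heights T_delta - u_ij,
    i.e. the concave volume below the boundary temperature."""
    d = t_delta - u
    return float(d[d > 0].sum())  # strict inequality, as in the claim

def skewness(u):
    """Standardized third moment of the heat distribution."""
    z = (u - u.mean()) / u.std()
    return float((z ** 3).mean())

def skewness_jump(u_t, u_prev):
    """Eq. (10): skewness differential between consecutive frames;
    a large jump flags a misplaced ROI."""
    return skewness(u_t) - skewness(u_prev)
```

With an unchanged heat distribution the skewness differential is zero; a sudden jump between frames indicates the tracker has drifted to a different region and the moving average of T_δ should be reset.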
10. The respiration rate of claim 9, wherein both frequency-domain and time-domain methods can be used to estimate the respiration cycle; here, a short-time power spectral density is implemented to analyse the self-similarity of the voxel feature v̂(t); this is the Fourier transform F_f of a short-time autocorrelation function; to reduce the ripple in the frequency domain caused by the truncated short time window, a Gaussian window is used:
$$w_i(k)=\hat{v}(n_i+k)\,e^{-\frac{1}{2}\left(\frac{k}{\sigma}\right)^2},\quad k\in\{-\hat{t}_{\max}f_s,\ldots,\hat{t}_{\max}f_s\}\qquad(11)$$
where the window w_i(k) has a length determined by t̂_max, the maximum respiration period of interest, and the sampling frequency f_s; w_i(k) is normalized by feature scaling and filtered with a third-order elliptic filter (here with 3 dB of passband ripple and 6 dB of stopband attenuation) whose passband cut-off frequencies are 0.1 Hz and 0.85 Hz; finally, the respiration rate is estimated by locating the frequency that maximizes the power spectral density:
$$S_V(f)=F_f(R_{ww})=\sum_k R_{ww}(k)\,e^{-j2\pi fk}\qquad(12)$$
where R_ww is the short-time autocorrelation of the filtered w_i(k).
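The rate-estimation pipeline of Eqs. (11)-(12) can be sketched as follows, assuming v is the thermal-voxel time series. The elliptic pre-filter of the claim is replaced here by restricting the spectral peak search to the 0.1-0.85 Hz respiratory band, and the Gaussian width is a hypothetical choice:

```python
import numpy as np

def respiration_rate(v, fs, f_lo=0.1, f_hi=0.85):
    """Gaussian-window the voxel signal (Eq. 11), form its short-time
    autocorrelation, and pick the peak of the power spectral density
    (Eq. 12) inside the respiratory passband."""
    v = np.asarray(v, float)
    v = v - v.mean()                                   # remove DC offset
    k = np.arange(len(v)) - len(v) // 2
    w = v * np.exp(-0.5 * (k / (len(v) / 6)) ** 2)     # Gaussian window
    r = np.correlate(w, w, mode="full")                # autocorrelation R_ww
    psd = np.abs(np.fft.rfft(r))                       # Eq. (12)
    freqs = np.fft.rfftfreq(len(r), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(freqs[band][np.argmax(psd[band])])
```

For example, a 0.3 Hz sinusoid sampled at 10 Hz (18 breaths per minute) yields an estimate close to 0.3 Hz, limited only by the frequency resolution of the windowed segment.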
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710660817.5A CN107403407A (en) | 2017-08-04 | 2017-08-04 | A kind of breathing tracking based on thermal imaging |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107403407A true CN107403407A (en) | 2017-11-28 |
Family
ID=60401846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710660817.5A Withdrawn CN107403407A (en) | 2017-08-04 | 2017-08-04 | A kind of breathing tracking based on thermal imaging |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107403407A (en) |
2017-08-04: CN CN201710660817.5A patent CN107403407A (en), not active (withdrawn)
Non-Patent Citations (1)
Title |
---|
YOUNGJUN CHO et al.: "Robust respiration tracking in high-dynamic range scenes using mobile thermal imaging", published online at https://arxiv.org/abs/1705.06628v1 *
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108198154A (en) * | 2018-03-19 | 2018-06-22 | 中山大学 | Image de-noising method, device, equipment and storage medium |
CN108198154B (en) * | 2018-03-19 | 2020-06-26 | 中山大学 | Image denoising method, device, equipment and storage medium |
CN109034179A (en) * | 2018-05-30 | 2018-12-18 | 河南理工大学 | A kind of rock stratum classification method based on mahalanobis distance IDTW |
WO2021077515A1 (en) * | 2019-10-25 | 2021-04-29 | 苏州大学 | Voxel model-based characterization method for respiratory characteristics |
US11373367B2 (en) | 2019-10-25 | 2022-06-28 | Soochow University | Method for characterization of respiratory characteristics based on voxel model |
CN111008622B (en) * | 2020-03-11 | 2020-06-12 | 腾讯科技(深圳)有限公司 | Image object detection method and device and computer readable storage medium |
CN111008622A (en) * | 2020-03-11 | 2020-04-14 | 腾讯科技(深圳)有限公司 | Image object detection method and device and computer readable storage medium |
CN111507268A (en) * | 2020-04-17 | 2020-08-07 | 浙江大华技术股份有限公司 | Alarm method and device, storage medium and electronic device |
CN111507268B (en) * | 2020-04-17 | 2024-02-20 | 浙江华感科技有限公司 | Alarm method and device, storage medium and electronic device |
CN112255141A (en) * | 2020-10-26 | 2021-01-22 | 光谷技术股份公司 | Thermal imaging gas monitoring system |
CN112255141B (en) * | 2020-10-26 | 2021-05-11 | 光谷技术有限公司 | Thermal imaging gas monitoring system |
CN115115737A (en) * | 2022-08-29 | 2022-09-27 | 深圳市海清视讯科技有限公司 | Method, device, equipment, medium and program product for identifying artifacts in thermal imaging |
CN116701695A (en) * | 2023-06-01 | 2023-09-05 | 中国石油大学(华东) | Image retrieval method and system for cascading corner features and twin network |
CN116701695B (en) * | 2023-06-01 | 2024-01-30 | 中国石油大学(华东) | Image retrieval method and system for cascading corner features and twin network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107403407A (en) | A kind of breathing tracking based on thermal imaging | |
Janssen et al. | Video-based respiration monitoring with automatic region of interest detection | |
US9597016B2 (en) | Activity analysis, fall detection and risk assessment systems and methods | |
CN105636505B (en) | For obtaining the device and method of the vital sign of object | |
Monkaresi et al. | A machine learning approach to improve contactless heart rate monitoring using a webcam | |
US9697599B2 (en) | Determining a respiratory pattern from a video of a subject | |
US20200138337A1 (en) | Non-Contact Breathing Activity Monitoring And Analyzing System Through Thermal On Projection Medium Imaging | |
Hu et al. | Synergetic use of thermal and visible imaging techniques for contactless and unobtrusive breathing measurement | |
KR101738278B1 (en) | Emotion recognition method based on image | |
Koolen et al. | Automated respiration detection from neonatal video data | |
Lin et al. | Image-based motion-tolerant remote respiratory rate evaluation | |
Procházka et al. | Machine learning in rehabilitation assessment for thermal and heart rate data processing | |
Basu et al. | Infrared imaging based hyperventilation monitoring through respiration rate estimation | |
Pereira et al. | Robust remote monitoring of breathing function by using infrared thermography | |
CN111127511B (en) | Non-contact heart rate monitoring method | |
Ganfure | Using video stream for continuous monitoring of breathing rate for general setting | |
CN105869144A (en) | Depth image data-based non-contact respiration monitoring method | |
Chatterjee et al. | Real-time respiration rate measurement from thoracoabdominal movement with a consumer grade camera | |
Alkali et al. | Facial tracking in thermal images for real-time noncontact respiration rate monitoring | |
Lukáč et al. | Contactless recognition of respiration phases using web camera | |
CN107616795A (en) | A kind of contactless respiratory rate detection method in real time based on camera | |
KR101796871B1 (en) | Apparatus for diagnosing sleep apnea using video and respiration sound | |
CN112363139A (en) | Human body breathing time length detection method and device based on amplitude characteristics and storage medium | |
Tan et al. | Lightweight video-based respiration rate detection algorithm: An application case on intensive care | |
Jagadev et al. | Contactless monitoring of human respiration using infrared thermography and deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
WW01 | Invention patent application withdrawn after publication ||
Application publication date: 20171128 |