WO2021044150A1 - Systems and methods for analysing breathing


Info

Publication number
WO2021044150A1
Authority
WO
WIPO (PCT)
Prior art keywords
breathing
user
classified
data
sensor data
Application number
PCT/GB2020/052112
Other languages
English (en)
Inventor
George Edward WINFIELD
Yasin COTUR
Francesco GUAGLIARDO
Original Assignee
Spyras Ltd
Application filed by Spyras Ltd filed Critical Spyras Ltd
Publication of WO2021044150A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/0803 Recording apparatus specially adapted therefor
    • A61B 5/0816 Measuring devices for examining respiratory frequency
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/725 Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters

Definitions

  • the present techniques generally relate to systems, apparatus and methods for monitoring health, and in particular relate to monitoring health by sensing and analysing breathing.
  • breathing rate has been described as one of the most sensitive and important indicators of the deterioration of patient health.
  • breathing rate is monitored by occasional visual assessment, e.g. by observing the rise and fall of a patient's chest for 30 seconds every 12 hours.
  • some medical cases where breathing rate, and changes in breathing rate, have not been observed have led to avoidable patient death.
  • GB2550833 which describes techniques for analysing breathing data of the type obtained using structured light plethysmography (i.e. by projecting a pattern of light onto a patient) to identify breaths representing a breathing pattern over time
  • WO2012/007719 which describes a method of identifying a breath by sensing a signal generated by a human
  • EP0956820 which describes breath analysis apparatus that comprises a spirometer and breath tube
  • US2013/079656 which describes extracting respiratory information from a photoplethysmograph signal
  • JP2019141597 which describes a signal processing device
  • WO2014/128090 which describes a respiration monitoring system that can be adhered to a patient's torso.
  • a health monitoring system comprising: an apparatus comprising: a sensor for sensing breathing of a user using the apparatus, and a communication module for transmitting sensor data; and at least one remote processor for: receiving sensor data from the apparatus; and smoothing the received sensor data to generate a breathing pattern.
  • the sensor data collected by the sensor may be noisy and may need to be processed (i.e. smoothed) in order to generate a breathing pattern that accurately reflects the user's breathing.
  • a method for health monitoring comprising: sensing, using an apparatus, breathing of a user wearing the apparatus; generating a breathing pattern from the sensed data; and determining from the breathing pattern at least one breathing characteristic.
  • At least one breathing characteristic may be determined or derived from the breathing pattern.
  • the at least one remote processor may be located in any component in the system that is remote to the apparatus used/worn by the user. Multiple remote processors may be used to perform the smoothing of the sensor data (and to thereby generate a breathing pattern from the sensor data, and perform any other processing such as determining breathing characteristics), which may be located in the same or different components in the system.
  • the at least one remote processor may be in a user device and/or in a remote server.
  • one or more of the steps to smooth the sensor data may be performed by the processor(s) in the user device, and one or more steps may be performed by the processor(s) of the remote server.
  • the at least one remote processor may determine an indication of the health of the user from the at least one breathing characteristic.
  • the health monitoring system may be used in a variety of contexts.
  • the health monitoring system may be used by a user to monitor their own health.
  • a user may monitor their breathing during exercise, in which case the indication of the health of the user may comprise information on the user's fitness.
  • the indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example.
  • the fitness information could be used by a personal trainer to devise or modify an exercise regime for the user.
  • a user may monitor their breathing while resting or stationary to determine their health, lung capacity or lung health.
  • the indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months).
  • This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following an illness, a respiratory illness (such as that caused by COVID-19), disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes), or to monitor the health of a user with a chronic condition such as cystic fibrosis or chronic obstructive pulmonary disease (COPD).
  • a user admitted to a hospital may wear/use the apparatus so that doctors and nurses in the hospital may monitor the user's breathing more regularly, remotely and without human involvement. This advantageously increases the chances that changes in the user's breathing are identified and actioned early, and reduces the risk of human error.
  • the system may further comprise at least one user interface for displaying the indication of the health of the user.
  • the user interface may be provided in any suitable manner or on any suitable device.
  • the user interface may be the display screen of an electronic device used by the user, such as a computer or smartphone.
  • the user interface may be the display screen on hospital equipment, such as a computer at a nurses' station or a tablet holding an electronic patient record.
  • the raw sensor data and/or the processed data (e.g. the breathing pattern) may be anonymised.
  • the anonymised data may be associated with a Unique Identifier (UID), which may link the processed data to a patient's personal health records, which may be stored by the hospital in the hospital's own secure servers for patient data privacy.
  • the at least one user interface may be on a user device, and the indication of the health of the user may comprise information on the user's fitness. Additionally or alternatively, the at least one user interface may be on a device in a hospital, and the indication of the health of the user may comprise a warning that the user's health is deteriorating. This may advantageously enable hospital staff to take action sooner than if breathing is monitored by infrequent observation.
  • breathing rate may be monitored through visual assessment every 12 hours, while in intensive care units (ICUs), specialist capnography devices may be used to monitor the concentration or volume of carbon dioxide exhaled by a patient.
  • Electronic techniques for measuring breathing are limited to piezoelectric sensors that measure chest wall movement, impedance measurements that measure changes in chest conductivity during inhalation and exhalation, and microphones to detect the sound of the lungs expanding and contracting.
  • these techniques suffer from a low signal to noise ratio and may require significant filtering to accurately determine breathing rate.
  • the remote processor(s) may smooth the received sensor data to generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum or a local minimum. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining if a data point N is greater or lesser in value than the data point immediately before it (i.e. N-1) and the data point immediately after it (i.e. N+1).
  • In the case where data point N is greater than both data points N-1 and N+1, data point N is identified as an inflection point and is classified as a local maximum (or peak). In the case where data point N is lesser than both data points N-1 and N+1, data point N is identified as an inflection point and is classified as a local minimum (or trough). Each inflection point that has been identified and classified may be saved in storage.
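The neighbour-comparison rule described above can be sketched in Python (a minimal illustration; the function name and the list-of-tuples output format are assumptions, not taken from the patent):

```python
def find_inflection_points(samples):
    """Scan digitised sensor values and classify each interior data point N
    as a peak (local maximum) or trough (local minimum) by comparing it with
    its immediate neighbours N-1 and N+1."""
    points = []
    for n in range(1, len(samples) - 1):
        if samples[n] > samples[n - 1] and samples[n] > samples[n + 1]:
            points.append((n, "peak"))    # greater than both neighbours
        elif samples[n] < samples[n - 1] and samples[n] < samples[n + 1]:
            points.append((n, "trough"))  # lesser than both neighbours
    return points
```

Points equal to a neighbour are classified as neither, which matches the strict greater/lesser test described above.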
  • the at least one remote processor may determine whether each identified inflection point is indicative of a breathing pattern or of noise. If an inflection point is indicative of noise, it may need to be removed or ignored in order to smooth the sensor data and generate a breathing pattern that accurately reflects the user's breathing. (Preferably, the data point is simply ignored in subsequent processing). For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called “peak prominence detection” or "peak prominence threshold”.
  • the remote processor(s) may determine whether a distance between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. the distance between a peak and a trough), is above a threshold distance. If the distance is below the threshold distance, the remote processor(s) may remove both of the two adjacent inflection points.
  • the threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath. For example, in cases where analogue sensor values (e.g. voltages) have been converted into digital values, the threshold distance may be 1000.
  • the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000.
  • the distance between a successive peak and trough must be more than 1000. This may be a reasonable threshold distance, since in a normal breathing pattern successive peaks and troughs are usually separated by 10,000, and even if a user takes shallow breaths, the distance is more than 1000.
  • the threshold distance may vary depending on, for example, the sensor and external/environmental conditions.
  • the threshold distance value may be calculated based on an initial calibration of the sensor (to account for variability between sensors), and then modified/adjusted based on sensor data to account for environmental changes. That is, the threshold distance may be based on a calibration performed for each sensor, such that the threshold distance may vary for individual sensors.
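The peak prominence threshold described above might be implemented as follows. This is a sketch using the 16-bit ADC example and the threshold of 1000 mentioned earlier; the function name and the list-of-tuples data layout are illustrative assumptions:

```python
def apply_prominence_threshold(points, samples, threshold=1000):
    """Remove adjacent peak/trough pairs whose amplitude difference is below
    the threshold distance, treating such low-prominence pairs as noise
    rather than breaths. `points` is a list of (index, kind) tuples."""
    kept = list(points)
    changed = True
    while changed:
        changed = False
        for i in range(len(kept) - 1):
            (idx_a, kind_a), (idx_b, kind_b) = kept[i], kept[i + 1]
            # Only peak/trough (or trough/peak) pairs are amplitude-tested.
            if kind_a != kind_b and abs(samples[idx_a] - samples[idx_b]) < threshold:
                del kept[i:i + 2]  # remove both points of the pair
                changed = True
                break
    return kept
```

The scan restarts after each removal so that newly adjacent pairs are also tested.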
  • the remote processor(s) may also check whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern, as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote processor(s) may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time.
  • one of the two inflection points may be removed by the remote processor(s) so that it is not used to generate the breathing pattern.
  • This process may be called “peak separation analysis” or “peak distance analysis”.
  • the predetermined time may be 0.6 seconds, which would equate to 100 breaths in a minute, or 0.7 seconds, which would equate to 86 breaths per minute.
  • Such high breathing rates do not occur in humans, and therefore these values may be sufficient to remove peaks that are not representative of a real breathing pattern while still capturing rapid breathing/hyperventilation/tachypnoea.
  • the peak separation analysis may take into account information on the activity that the user is performing/undertaking while using the apparatus. That is, knowledge of this activity (e.g. sitting, running, walking, etc.) could be used to determine the peak separation analysis, as different activities may be associated with different peak separations (or number of breaths per minute). This means knowledge of different breathing rates associated with different activities could be used to perform the peak separation analysis.
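A minimal sketch of the peak separation analysis, assuming timestamped, classified inflection points; keeping the earlier of the two too-close points is one possible reading of "one of the two inflection points may be removed":

```python
def peak_separation_filter(points, min_gap=0.6):
    """Remove the later of two same-kind inflection points (peak-peak or
    trough-trough) occurring less than min_gap seconds apart; the default
    0.6 s corresponds to an implausible 100 breaths per minute.
    `points` is a list of (time_seconds, kind) tuples in time order."""
    kept = []
    last_seen = {}  # most recent kept timestamp per kind ("peak"/"trough")
    for t, kind in points:
        if kind in last_seen and t - last_seen[kind] < min_gap:
            continue  # too close to the previous peak/trough of the same kind
        last_seen[kind] = t
        kept.append((t, kind))
    return kept
```

As noted above, `min_gap` could be made activity-dependent (e.g. a smaller gap while the user is running).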
  • a breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation.
  • the remote processor(s) may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points. This process may be called "consecutive peaks or troughs detection or elimination".
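The consecutive peaks or troughs elimination step can be sketched as follows; keeping the first of the two same-kind points is an assumed choice, since the patent only says one of the two is removed:

```python
def deduplicate_consecutive(points):
    """Enforce an alternating peak/trough sequence (alternating inhalation
    and exhalation): when two consecutive inflection points share a
    classification, keep only the first."""
    kept = []
    for point in points:
        if kept and kept[-1][1] == point[1]:
            continue  # same kind as the previous kept point: drop it
        kept.append(point)
    return kept
```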
  • the breathing pattern may be generated in real time. That is, the breathing pattern may be generated as soon as at least two peaks and a trough are received, or two troughs and one peak are received. In some cases, a single peak and trough (or trough and peak) may be sufficient to generate the breathing pattern. It will be understood that this enables real-time analysis to be performed. More data (e.g. more peaks and troughs) may enable a more accurate breathing pattern to be produced, and more accurate breathing characteristics (see below) to be determined. Thus, a breathing pattern and breathing characteristic may be generated/determined in real-time but may also change in real-time as more data is received.
  • the remote processor(s) may determine at least one breathing characteristic from the generated breathing pattern.
  • There are several breathing characteristics that may be derived from the breathing pattern, and which may be used to provide feedback on the user's health or fitness.
  • the at least one breathing characteristic may be any one or more of: inhalation speed, exhalation speed, inhalation to exhalation ratio, number of breaths per minute (which could be used to detect hyperventilation, hypocapnia, hypoventilation, hypercapnia, etc.), average breathing rate when wearing a resistive sports mask or resistive respiratory muscle training device (which may depend on the restriction level of the resistive sports mask), exertion score, and depth or volume of inhalation or exhalation (which may be indicative of lung capacity or fitness).
  • The breathing pattern may be used to determine inhalation/exhalation speed, inhalation time and exhalation time, and flow rate, and therefore the depth or volume of inhalation/exhalation.
  • the remote processor(s) may determine the inhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum; and dividing a distance between the inflection point classified as a local maximum and the subsequent inflection point classified as a local minimum by the measured time.
  • the sensor measures conductivity as a function of time.
  • the conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out.
  • the decrease in conductivity over time between a peak and a trough may be indicative of a decrease in humidity over time as the user takes a breath (i.e. inhales).
  • the conductivity may represent, in some cases, the changes in temperature and/or change in pressure as a user breathes in and out. For example, an increase in pressure or temperature may occur on exhalation, and a decrease in pressure or temperature may occur on inhalation. Exactly what the conductivity represents may depend on the type of sensor used.
  • the remote processor(s) may determine the exhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum; and dividing a distance between the inflection point classified as a local minimum and the subsequent inflection point classified as a local maximum by the measured time.
  • the sensor measures conductivity as a function of time.
  • the conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out.
  • the increase in conductivity over time between a trough and a peak may be indicative of an increase in humidity over time as the user exhales (i.e. breathes out).
  • the conductivity may represent, in some cases, the changes in temperature and/or change in pressure as a user breathes in and out. Exactly what the conductivity represents may depend on the type of sensor used.
  • the remote processor(s) may determine the ratio from the breathing pattern by: dividing the inhalation speed by the exhalation speed, or preferably by dividing the inhalation time by the exhalation time.
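The speed and ratio calculations above reduce to simple divisions; a sketch, assuming timestamps in seconds and amplitudes in digitised sensor counts (both assumptions, as is the function naming):

```python
def segment_speed(t1, v1, t2, v2):
    """Amplitude change between two adjacent inflection points divided by the
    time between them. A peak-to-trough segment gives inhalation speed and a
    trough-to-peak segment gives exhalation speed, for a humidity-type sensor
    whose conductivity falls on inhalation and rises on exhalation."""
    return abs(v2 - v1) / (t2 - t1)

def inhalation_exhalation_ratio(t_peak, t_trough, t_next_peak):
    """Inhalation time (peak to trough) divided by exhalation time (trough to
    next peak): the preferred formulation of the ratio described above."""
    return (t_trough - t_peak) / (t_next_peak - t_trough)
```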
  • the remote processor(s) may determine the breathing rate from the breathing pattern by: determining the number of inflection points classified as a local maximum in a minute. Determining the number of inflection points classified as a local maximum in a minute may comprise determining, when the breathing pattern is longer than a minute, an average number of inflection points classified as a local maximum in a minute. Alternatively, determining the number of inflection points classified as a local maximum in a minute may comprise extrapolating, when the breathing pattern is less than a minute, the breathing rate based on the number of inflection points in a duration of the breathing pattern.
  • the breathing rate may be extrapolated at a resolution of half-breath.
  • a half-breath resolution may not be useful for calculating breathing rate since inhalation and exhalation time are often different.
  • a half breath resolution may be useful to detect rapid changes in breathing rate that may be clinically relevant (e.g. coughing or a panic attack). These rapid changes would be averaged out if using a BPM calculated over a one-minute window. For example, if a person is coughing, their BPM may be 60 or more using a half breath resolution, but that is not indicative of their average BPM - rather, it is indicative of a specific, short-time event.
  • a resolution of one breath may be the smallest resolution from which an accurate breathing rate can be accurately calculated.
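The rate calculation above, counting local maxima and scaling to a one-minute window, might look like the following sketch (one formula covers both cases: for patterns shorter than a minute it extrapolates, for longer patterns it averages):

```python
def breaths_per_minute(points, duration_s):
    """Breathing rate from a breathing pattern: count the inflection points
    classified as peaks and scale the count to a one-minute window."""
    peaks = sum(1 for _, kind in points if kind == "peak")
    return peaks * 60.0 / duration_s
```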
  • the remote processor(s) may determine the inhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum.
  • the remote processor(s) may determine the exhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum. Exhalation depth may enable short breaths and long breaths to be identified, and may enable shallow breaths and deep breaths to be identified.
  • the at least one user interface may display information in addition to the breathing characteristic or indication of the health or fitness of the user.
  • the user interface may display one or more of: a total number of hours the user has worn the apparatus, an exertion score, an indication of the user's lung function, and information on whether the sensor needs to be replaced.
  • the remote processor(s) may be arranged to: compare sensor data received from the apparatus over a predetermined time period; and determine whether the accuracy of the sensor has changed over the predetermined time period.
  • the remote server may be able to identify any changes in the sensitivity or accuracy of the sensor over time, by considering whether, for example, the maximum and minimum values sensed by the sensor have changed over time.
  • the remote server may be able to send a message to a user device to indicate to the user (or to a hospital staff member or administrator) that the sensor needs to be replaced.
  • the apparatus may further comprise an accelerometer to sense movement of the user while the user is wearing the apparatus.
  • the accelerometer data may be transmitted to the remote server along with the sensor data.
  • the accelerometer data may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic.
  • the accelerometer data may be mapped to or matched to the generated breathing pattern, which may enable analysis of the user's health or fitness while sedentary, walking and/or exercising to be determined. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance.
  • the remote processor(s) may use data from the accelerometer to generate the breathing pattern and determine the at least one breathing characteristic.
  • the remote server may use additional input data to generate the breathing pattern and determine the at least one breathing characteristic.
  • the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level.
  • the additional input data may be obtained by the remote server from external sources or third party sources.
  • the weather data may be obtained from a national weather service provider (such as the Met Office in the UK), while altitude data may be obtained from a map provider (such as via open APIs).
  • the remote processor(s) may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately. Similarly, other data such as pressure and external/environmental temperature may be used.
  • the geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate or taking deeper breaths. By knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. In contrast, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
  • the sensor of the apparatus may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor/detector, and a sensor comprising a porous material. It will be understood that this is an example, non-exhaustive list of possible sensors that are suitable for sensing breathing. It will also be understood that an apparatus may comprise more than one sensor, and that the sensors may be the same or different.
  • the apparatus may be any one of: a wearable apparatus, a resistive sports mask, an oxygen deprivation mask, an apparatus worn over a user's mouth and/or nose, a medical breath monitoring apparatus, a face mask, a disposable face mask, a personal protection equipment face mask, a surgical mask, an oxygen mask, an inhaler, an asthma inhaler, an e-cigarette, a heat moisture exchanger, and a nasal cannula. It will be understood that this is an example, non-exhaustive list of possible types of apparatus that could be used to sense breathing and monitor user health.
  • the apparatus may be any device which is able to be placed in the proximity of exhaled air or which can receive exhaled air (e.g. via tubes that direct exhaled air from the user to the apparatus).
  • the sensor and communication module may be removably attached to the apparatus. This may be useful if the apparatus is a disposable device such as a disposable mask, or a reusable device that is washed for reuse, such as a washable mask.
  • the sensor and communication module may be removed before the apparatus is disposed of (enabling reuse of the sensor and communication module), or before the apparatus is washed.
  • the sensor and communication module may be irremovably attached to the apparatus. This may be achieved by integrating the sensor and communication module into the apparatus, by any suitable means.
  • the remote processor may: determine an indication of the health of the user from the at least one breathing characteristic; and transmit the indication of the health of the user to any one or more of: a user device, a third party device, and a third party server.
  • the remote processor may be configured to: transmit the received sensor data to a third party device or third party server.
  • the remote processor may be configured to: transmit the generated breathing pattern to a third party device or third party server.
  • the transmission in any case to any device may be in real-time. This may advantageously enable the user or a third party to see real-time data for a user, which could be particularly useful for clinicians or in a hospital.
  • the remote processor may be configured to: receive sensor data from a third party server for analysis; smooth the received sensor data from the third party server to generate a breathing pattern; and transmit the generated breathing pattern to the third party server. That is, third parties may send sensor data to be processed by the remote processor.
  • the method may further comprise: providing, to a user device, breathing exercises for the user to follow while using the apparatus; and analysing, using the received sensor data, user performance while undertaking the breathing exercises.
  • the method may further comprise determining an exertion score by: calculating, using the breathing characteristic, a distribution profile of the breathing characteristic; determining a scaling factor; scaling the distribution profile using the scaling factor to generate a scaled distribution profile; and determining the exertion score using the scaled distribution profile.
  • the breathing characteristic may be breathing rate (i.e. breaths per minute or BPM).
  • the breathing characteristic may be a breath depth.
  • the method may comprise determining the breath depth from the breathing pattern by: determining an average breath depth for a predetermined number of breaths; comparing a depth of a subsequent breath to the average breath depth; and classifying the subsequent breath as a shallow breath when the depth of the subsequent breath is below the average breath depth by a threshold value, or as a deep breath when the depth of the subsequent breath is above the average breath depth by a threshold value.
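The depth classification above can be sketched as follows; the relative 20% threshold is an assumed illustration, since the patent does not fix a threshold value or state whether it is absolute or relative:

```python
def classify_breath_depth(previous_depths, new_depth, threshold=0.2):
    """Compare a new breath's depth against the average depth of a
    predetermined number of previous breaths; flag it as shallow or deep
    when it deviates from that average by more than the threshold fraction."""
    average = sum(previous_depths) / len(previous_depths)
    if new_depth < average * (1 - threshold):
        return "shallow"
    if new_depth > average * (1 + threshold):
        return "deep"
    return "normal"
```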
  • the method may further comprise providing the user, using a total number of breaths recorded using the apparatus, with an indication of when to replace the sensor of the apparatus. This may be advantageous as it may indicate, ahead of the end of the lifetime of the sensor, when the sensor needs to be replaced (if it is replaceable), or when a new apparatus needs to be used.
  • a (non-transitory) computer readable medium carrying processor control code which when implemented in a system causes the system to carry out any of the methods, processes and techniques described herein.
  • present techniques may be embodied as a system, method or computer program product. Accordingly, present techniques may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
  • the present techniques may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations of the present techniques may be written in any combination of one or more programming languages, including object oriented programming languages and conventional procedural programming languages. Code components may be embodied as procedures, methods or the like, and may comprise sub-components which may take the form of instructions or sequences of instructions at any of the levels of abstraction, from the direct machine instructions of a native instruction set to high-level compiled or interpreted language constructs. Embodiments of the present techniques also provide a non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out any of the methods described herein.
  • the techniques further provide processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP).
  • the techniques also provide a carrier carrying processor control code to, when running, implement any of the above methods, in particular on a non-transitory data carrier.
  • the code may be provided on a carrier such as a disk, a microprocessor, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier.
  • Code (and/or data) to implement embodiments of the techniques described herein may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (RTM) or VHDL (Very high speed integrated circuit Hardware Description Language).
  • a controller which includes a microprocessor, working memory and program memory coupled to one or more of the components of the system.
  • a logical method may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the above-described methods, and such logic elements may comprise components such as logic gates in, for example, a programmable logic array or application-specific integrated circuit.
  • Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media.
  • the present techniques may be implemented using multiple processors or control circuits. The present techniques may be adapted to run on, or integrated into, the operating system of an apparatus.
  • the present techniques may be realised in the form of a data carrier having functional data thereon, said functional data comprising functional computer data structures to, when loaded into a computer system or network and operated upon thereby, enable said computer system to perform all the steps of the above-described method.
  • Figure 1 shows a schematic diagram of a health monitoring system
  • Figure 2 shows a flowchart of example steps performed by the health monitoring system of Figure 1;
  • Figure 3A shows example sensor data sensed by the health monitoring system of Figure 1 after peak detection has been performed
  • Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed
  • Figure 3C shows the sensor data of Figure 3B after peak separation analysis has been performed
  • Figure 3D shows an example generated breathing pattern
  • Figures 4A and 4B show, respectively, data used to determine a technique for generating an exertion score from sensor data in real-time (synchronously) or asynchronously;
  • Figure 4C shows data used to determine another technique for generating an exertion score from sensor data asynchronously; and Figure 5 shows a flowchart of example steps to determine an exertion score.
  • embodiments of the present techniques provide a health monitoring system which uses user breathing data to determine a user's or patient's health.
  • FIG. 1 shows a schematic diagram of a health monitoring system 100.
  • the system 100 comprises an apparatus 102 and a remote processor(s) 120.
  • the apparatus 102 may be a wireless apparatus or a wired apparatus. In other words, the apparatus 102 may be capable of wirelessly transmitting data to another device, or may need a wired connection to transmit data to another device.
  • the apparatus 102 may comprise at least one sensor 104 for sensing breathing of a user wearing the apparatus 102.
  • the at least one sensor 104 may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor, and a sensor comprising a porous material.
  • An example of a sensor comprising a porous material can be found in International Patent Publication No.
  • the apparatus 102 may comprise a communication module 106 for transmitting sensor data.
  • the data collected by the sensor 104 may be transmitted to an external device or server for storage and analysis. This may be advantageous because the apparatus 102 may not have the processing power or capacity to analyse the data, and/or the storage capacity to store large quantities of data.
  • the data collected by the sensor 104 may be transmitted periodically to an external device/server, such as every second, or every few seconds, or at any suitable frequency such as, but not limited to, 12.5Hz or 20Hz. Alternatively, data collected by the sensor 104 may be transmitted at longer intervals or at irregular times in certain circumstances, for example if the apparatus 102 is not within range to be able to communicate with an external device.
  • the apparatus 102 may have storage or memory 132 to temporarily store sensor data collected by the sensor 104 when real-time communication to an external device is not possible.
  • the storage 132 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory.
  • the storage 132 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data, programs, or instructions, for example.
  • the data collected by the sensor 104 may be communicated/transmitted to the remote server 120 directly, or via an intermediate device. Communicating the sensor data to an intermediate device may be preferable due to the communication capability of the communication module 106. For example, if the communication module 106 is only capable of short range communication, then the sensor data may need to be transmitted to an intermediate device which is capable of transmitting the sensor data to the remote server. In some cases, the communication module 106 may be able to communicate directly with the remote server. In some cases, the intermediate device may process the sensor data into a format or into processed data that the remote server can handle.
  • the communication module 106 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6LoWPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow.
  • the communication module 106 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc.
  • the communication module 106 may use a wired communication technique to transfer sensor data to an intermediate/external device, such as via metal cables (e.g. a USB cable) or fibre optic cables.
  • the communication module 106 may use more than one communication technique to communicate with other components in the system 100.
  • the apparatus 102 may comprise a processor or processing circuitry 108.
  • the processor 108 may control various processing operations performed by the apparatus 102, such as communicating with other components in system 100.
  • the processor 108 of the apparatus 102 may simply control the operation of the sensor 104, communication module 106 and storage 132.
  • the processor 108 may have some further processing capability.
  • the processor 108 may comprise processing logic to process data (e.g. the sensor data collected by sensor 104), and generate output data/signals/messages in response to the processing.
  • the processor 108 may be able to compress the sensor data for example, to reduce the size of the data that is being transmitted to another device.
  • the processor 108 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
  • the apparatus 102 may optionally comprise an accelerometer 133 to sense movement of the user while the user is wearing the apparatus 102.
  • the accelerometer data may be transmitted to an external device along with the sensor data.
  • Apparatus 102 may optionally comprise an interface 138 for providing feedback on a user's breathing.
  • the interface 138 may be one or more LEDs or other lights which may turn on and off according to the generated breathing pattern. This may provide a visual indicator to a user of the apparatus or a third party (e.g. a doctor or personal trainer) of the generated breathing pattern.
  • the system 100 may comprise a remote server 120 for performing one or more of the steps involved in smoothing the sensor data received from the apparatus 102.
  • the apparatus 102 may transmit sensor data to the remote server 120.
  • the remote server 120 may then generate a breathing pattern from the sensor data, and determine from the breathing pattern at least one breathing characteristic.
  • the remote server 120 may comprise at least one processor 123 and storage 122.
  • Storage 122 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory.
  • the storage 122 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data (such as the sensor data received from the apparatus 102), programs, or instructions, for example.
  • the remote server 120 may create visualisations and plots of sensor data, and may send these to a user device or any third party device for visualisation purposes.
  • the system 100 may comprise a user device 110.
  • the user device 110 may be any type of electronic device, such as, for example, a smartphone, a mobile computing device, a laptop, tablet or desktop computer, or a mobile or portable electronic device.
  • the user device 110 may be a dedicated user device that is specifically for use with the apparatus 102.
  • the user device 110 may be a non-dedicated user device, such as a smartphone.
  • the user device 110 may comprise a software application ('app') 112 which is associated with the system 100.
  • the app 112 may be launched or run when the user puts on the apparatus 102. For example, when the user is about to begin exercising, the user may put on the apparatus 102 and run the app 112.
  • the app 112 may comprise a 'record' or 'start' function, which the user may press/engage when they want to start measuring their breathing using the apparatus 102.
  • the app 112 may communicate with the apparatus 102 to instruct the sensor 104 to begin sensing and/or to instruct the communication module 106 to begin transmitting sensor data. Additionally or alternatively, when the user presses 'record' or 'start' on the app 112, the app 112 may prepare to receive sensor data from the apparatus 102.
  • the app 112 may display the sensor data as it is received from the apparatus 102. Additionally or alternatively, the app 112 may display the generated breathing pattern produced by the remote server 120.
  • the user device 110 may comprise a user interface 114 to display, for example, the app 112, sensor data, generated breathing pattern, and/or any other information.
  • the user interface 114 may be the display screen of a smartphone for example.
  • the user device 110 may comprise a processor 116 to control various processing operations performed by the user device, such as communicating with other components in system 100.
  • the processor 116 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
  • the user device 110 may comprise a communication module 118.
  • the communication module 118 may receive the sensor data from the communication module 106 of the apparatus 102.
  • the communication module 118 may be able to communicate with the remote server 120, i.e. to transmit the received sensor data to the remote server 120 for processing/analysis.
  • the communication module 118 may be able to receive data from the remote server 120.
  • the communication module 118 may receive a generated breathing pattern in real-time, near real-time or after the sensing performed by sensor 104 has ended.
  • the generated breathing pattern may be displayed on the user interface 114 (e.g. via app 112). That is, in some cases, sensor data may be transmitted from the apparatus 102 in real-time.
  • the user device 110 may transmit the sensor data to the remote server, and receive a generated breathing pattern back from the remote server 120, which the user device 110 may display.
  • the user device 110 may also receive, for example, the at least one breathing characteristic from the remote server 120 as it is determined, and may also display the at least one breathing characteristic. It may be possible for the user of user device 110 to see the raw sensed data on the user device and see how, in real-time, the remote server is generating the breathing pattern from the raw data.
  • the communication module 118 may have at least the same communication capability as the communication module 106 of the apparatus 102, and the remote server 120.
  • the communication module 118 may use the same or different communication techniques or protocols to communicate with the communication module 106 and remote server 120.
  • the communication module 118 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6LoWPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow.
  • the communication module 118 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc.
  • the communication module 118 may use a wired communication technique to receive sensor data from the apparatus 102, such as via metal cables (e.g. a USB cable) or fibre optic cables.
  • the communication module 118 may use more than one communication technique to communicate with other components in the system 100.
  • Figure 1 shows system 100 as having a single remote server 120. It will be understood that the system 100 may have multiple servers 120. One or more of the servers 120 may be used to collect, process and store data collected from multiple apparatuses 102. One or more of the servers 120 may be private servers or dedicated servers to ensure the sensor data is stored securely. For example, if apparatuses 102 are used in a hospital, it may be preferable for the sensor data to be collected, processed and stored by a dedicated server within the hospital, to ensure patient privacy and data confidentiality. In this case, the system 100 may comprise a router 128 for receiving sensor data from each apparatus 102 within the hospital and transmitting this to the dedicated server 120.
  • the system 100 may comprise a user interface 126 (e.g. a display screen) on hospital equipment or a third party device 124, such as a computer at a nurses' station or a tablet holding an electronic patient record, or a device belonging to a clinician or physiotherapist, or a device belonging to a personal trainer.
  • This may again ensure that the patient data is kept secure within the hospital itself.
  • System 100 may be used by a personal trainer to monitor patient health, and in this case, the third party device 124 may belong to a personal trainer.
  • the personal trainer may be able to see the user's breathing pattern during a personal training session.
  • the third party device 124 may be able to display, in real-time, group workout data, i.e. breathing patterns for multiple users in a group exercise session.
  • the group workout data could be used to provide live dashboards showing a ranking, based on the breathing patterns, of each user in the group.
  • raw sensor data collected by sensor 104 may be transmitted to the user device 110, and the user device 110 may transmit the sensor data to the remote server 120 for processing.
  • Algorithms, code, routines or similar for smoothing the raw sensor data to generate a breathing pattern may be stored in the remote server 120 (e.g. in storage 122) and run by processor 123.
  • the remote server 120 may also use the generated breathing pattern to determine one or more breathing characteristics, and the algorithms or techniques to determine the breathing characteristics may also be stored on the remote server 120.
  • the results of the analysis (e.g. the breathing pattern and/or the breathing characteristics) may be transmitted by the remote server 120 back to the user device 110 for display via a user interface 114 (e.g. via an app on a display screen).
  • the remote server 120 may use additional input data 130 to generate the breathing pattern and determine the at least one breathing characteristic.
  • the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level.
  • the additional input data 130 may be received or pulled in from public sources or websites, such as openweathermap.org.
  • the remote server 120 may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately.
  • the geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate. However, by knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. However, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
  • the accelerometer data collected by accelerometer 133 in the apparatus 102 may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic.
  • the accelerometer data may be mapped to or matched to the generated breathing pattern by the remote server 120, which may enable analysis of the user's health or fitness while sedentary, walking and/or exercising to be determined. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance.
  • the remote server 120 may use data from the accelerometer 133 to generate the breathing pattern and determine the at least one breathing characteristic.
  • Figure 2 shows a flowchart of example steps performed by the at least one remote processor of the health monitoring system 100.
  • the method performed by the at least one remote processor may comprise receiving sensor data from an apparatus 102, the sensor data being the sensed breathing of a user wearing the apparatus 102 (step S100).
  • the remote processor(s) may smooth the sensor data to generate a breathing pattern.
  • the remote processor may determine at least one breathing characteristic from the breathing pattern.
  • the remote processor may use the at least one breathing characteristic to determine an indication of user health (step S106).
  • the indication of the health of the user may comprise information on the user's fitness.
  • the indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example.
  • the fitness information could be used by a personal trainer to devise or modify an exercise regime for the user.
  • the indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months).
  • This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following a respiratory illness, disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes).
  • the indication of the health of the user may comprise information on whether the user's breathing has changed suddenly or unexpectedly (e.g. increase or decrease in breathing rate, or an increase or decrease in inhalation/exhalation depth or volume - e.g. deeper or shallower breaths). This may be useful in a hospital, as it may enable changes in the user's breathing to be identified and actioned early.
  • the remote processor may transmit data on the user's fitness (step S108).
  • the remote processor may transmit a message to a hospital device 124 warning of the deteriorating health or condition of the user (step S110) if the user's breathing has changed suddenly.
  • FIG 3A shows example sensor data 300 sensed by the health monitoring system of Figure 1 after peak detection has been performed.
  • the sensor data may be conductivity changes over time, but this may depend on the type of sensor 104 in the apparatus 102.
  • the remote server may generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum 302 or a local minimum 304. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining whether a data point N is greater or lesser in value than the data point immediately before it (i.e. data point N-1) and the data point immediately after it (i.e. data point N+1).
  • each inflection point 302, 304 that has been identified and classified may be saved in storage 122.
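The peak-detection step described above can be sketched in Python as a simple neighbour comparison. This is a minimal illustration under the description above, not the patent's actual implementation; the function name and the sample trace are invented for the example.

```python
def detect_inflection_points(signal):
    """Classify each interior sample as a local maximum (peak) or local
    minimum (trough) by comparing it with the data points immediately
    before (N-1) and after (N+1) it."""
    maxima, minima = [], []
    for n in range(1, len(signal) - 1):
        if signal[n] > signal[n - 1] and signal[n] > signal[n + 1]:
            maxima.append(n)  # local maximum
        elif signal[n] < signal[n - 1] and signal[n] < signal[n + 1]:
            minima.append(n)  # local minimum
    return maxima, minima

# Illustrative raw trace (e.g. conductivity samples)
peaks, troughs = detect_inflection_points([0, 2, 5, 3, 1, 4, 7, 6, 2, 0, 3])
```

Every turning point is kept at this stage, including those caused by noise; the prominence and separation steps that follow are what filter them down to real breaths.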
  • the remote server 120 may determine whether each identified inflection point 302, 304 is indicative of a breathing pattern or of noise.
  • If an inflection point is indicative of noise, it needs to be removed or ignored in order to generate a breathing pattern that accurately reflects the user's breathing. For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called "peak prominence detection".
  • Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed.
  • the remote server 120 may determine whether a distance 312 between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. an adjacent peak and trough), is less than a threshold distance.
  • the threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath.
  • Where analogue sensor values (e.g. voltages) are converted to digital values, the threshold distance may be 1000. More specifically, if a sensor's values range from 0V to 3.3V, the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000. In other words, the distance between a successive peak and trough must be more than 1000.
  • the peaks and troughs in region 314 of the sensor data have been determined to not be representative of breathing. As shown in Figure 3B, the peaks and troughs in region 314 are no longer marked/tagged as inflection points, and so will not be used to generate a breathing pattern.
  • the threshold distance may vary depending on, for example, the sensor and external/environmental conditions.
  • the threshold distance value may be calculated based on an initial calibration of the sensor (to account for variability between sensors), and then modified/adjusted based on sensor data to account for environmental changes. That is, the threshold distance may be based on a calibration performed for each sensor, such that the threshold distance may vary for individual sensors.
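The peak prominence detection step can be sketched as follows: adjacent peak/trough pairs whose amplitude difference falls below the threshold distance (e.g. 1000 on 16-bit ADC values, as above) are discarded as noise. This is a minimal, assumed implementation; in practice the threshold would come from the per-sensor calibration just described.

```python
def remove_low_prominence(points, signal, threshold=1000):
    """points: time-ordered list of (sample_index, 'max' or 'min').
    Scan adjacent peak/trough pairs and discard both points of any pair
    whose amplitude difference is below the threshold, treating the pair
    as noise rather than a breath."""
    kept, i = [], 0
    while i < len(points):
        if i + 1 < len(points):
            (a, ta), (b, tb) = points[i], points[i + 1]
            if ta != tb and abs(signal[a] - signal[b]) < threshold:
                i += 2  # low-prominence pair: skip both points
                continue
        kept.append(points[i])
        i += 1
    return kept

# 16-bit ADC values; the first peak/trough pair differs by only 500
signal = [5000, 30000, 29500, 60000, 1000]
points = [(1, "max"), (2, "min"), (3, "max"), (4, "min")]
cleaned = remove_low_prominence(points, signal)
```

In this example the pair at indices 1 and 2 is removed (amplitude difference 500 < 1000), mirroring how region 314 in Figure 3B loses its inflection-point tags.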
  • the remote server may also consider whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern, as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote server may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time.
  • FIG. 3C shows the sensor data of Figure 3B after peak separation analysis has been performed.
  • the two adjacent troughs or local minima 304a, 304b are considered to be too close together and not representative of a real breathing pattern. Consequently, as shown in Figure 3C, point 304b is no longer marked/tagged as an inflection point, and so will not be used to generate a breathing pattern.
  • An appropriate predetermined time should be chosen so as not to lose real breathing pattern data. For example, if a user is breathing rapidly or hyperventilating, then the user's breaths may be naturally close together.
  • the predetermined time may be 0.6 seconds, which would equate to 100 breaths in a minute, or 0.7 seconds, which would equate to 86 breaths per minute.
  • Such high breathing rates do not occur in humans, and therefore these predetermined times may be sufficient to remove peaks that are not representative of a real breathing pattern while still capturing rapid breathing/hyperventilation.
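The peak-separation analysis can be sketched as a single pass, assuming a timestamp is available for each sample: when two successive inflection points of the same type fall closer together than the predetermined time (0.6-0.7 s above), the later one is discarded, as happens to point 304b in Figure 3C. Names and data are illustrative.

```python
def enforce_min_separation(points, times, min_gap=0.6):
    """points: time-ordered (sample_index, 'max' or 'min') pairs;
    times: timestamp in seconds of each sample. If two successive
    inflection points of the same type are closer than min_gap seconds,
    drop the later one: there is not enough time between them for a
    full inhalation or exhalation."""
    last_seen, kept = {}, []
    for idx, kind in points:
        t = times[idx]
        if kind in last_seen and t - last_seen[kind] < min_gap:
            continue  # too close to the previous point of the same type
        last_seen[kind] = t
        kept.append((idx, kind))
    return kept

times = [0.0, 0.2, 0.4, 0.9, 1.5]
points = [(0, "min"), (1, "max"), (2, "min"), (3, "min"), (4, "max")]
kept = enforce_min_separation(points, times)
```

Here the trough at index 2 is only 0.4 s after the previous trough and is dropped, while the trough at index 3 (0.9 s later) survives.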
  • a breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation.
  • the remote server may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points.
  • Figure 3D shows an example generated breathing pattern 350 after this process has been performed.
  • two adjacent inflection points that were classified as local maxima 302a, 302b do not have a local minimum between them. Accordingly, as shown in Figure 3D, inflection point 302b is no longer marked/tagged as an inflection point.
  • the resulting data 350 is the generated breathing pattern which shows a series of peaks and troughs that are representative of breathing (i.e. noise has been removed).
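The final step, enforcing a strict peak/trough alternation, can be sketched as a pass that drops the second of any two consecutive same-type inflection points, as happens to point 302b in Figure 3D. This choice of which duplicate to remove is one reading of "removing one of the two consecutive inflection points".

```python
def enforce_alternation(points):
    """Keep only a strict alternation of peaks and troughs: when two
    consecutive inflection points share a classification (e.g. two local
    maxima with no minimum between them), drop the second one."""
    pattern = []
    for idx, kind in points:
        if pattern and pattern[-1][1] == kind:
            continue  # same type as the previous kept point: skip it
        pattern.append((idx, kind))
    return pattern

# Two adjacent maxima (indices 2 and 5) with no minimum between them
pattern = enforce_alternation([(0, "min"), (2, "max"), (5, "max"), (8, "min")])
```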
  • one or more breathing characteristics may be derived or determined.
  • a number of breathing characteristics may be determined, such as, for example:
  • Inhalation Speed - e.g. the amplitude of a peak minus the amplitude of a neighbouring/consecutive trough, divided by inhalation time
  • Inhalation Time - e.g. the time between a consecutive peak and a trough in the breathing pattern
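These two characteristics reduce to simple arithmetic on the amplitudes and timestamps of a neighbouring peak/trough pair; the helper names below are illustrative, not the patent's own:

```python
def inhalation_time(t_a, t_b):
    """Time between a consecutive peak and trough in the breathing pattern."""
    return abs(t_b - t_a)

def inhalation_speed(peak_amp, trough_amp, t_a, t_b):
    """Amplitude of a peak minus that of the neighbouring trough,
    divided by the inhalation time."""
    return (peak_amp - trough_amp) / inhalation_time(t_a, t_b)

# e.g. a trough of 20000 at t=1.0s followed by a peak of 60000 at t=3.0s
speed = inhalation_speed(60000, 20000, 1.0, 3.0)  # amplitude units per second
```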
  • Other characteristics relating to either the user activity or the sensor itself may be determined from the original sensor data and/or the breathing pattern, and/or other data may be collected from the apparatus 102 or user device 110, such as, for example:
  • Resistance training levels of a resistance training mask worn by the user (where the level may be obtained directly from the apparatus or via a user input on an app on the user device 110). For example, it may be determined that a user takes more breaths when using a training mask at one resistance level compared to when they use the training mask set at another resistance level.
  • Signal depth - which may be indicative of shallow breathing, deep breaths, etc.
  • One way to determine shallow or deep breaths based on signal depths is as follows: Since signal depth is sensor- and environment-dependent, after a period of sensor stabilisation (e.g. a minute), an average of the previous N breaths (e.g. 20 breaths) is taken. Each new breath is compared with this average. If the new breath depth is above or below the average by a certain percentage or amount, then this new breath may be classified as a deep or a shallow breath, respectively. This value is then added to the average, while the oldest breath is removed from the average, creating a moving window of N breaths. The next breath is then analysed in the same way, and the average modified accordingly. In the case where the depth is much higher or lower than the average value, then the depth value in this case would not be included in the average to ensure the average is not biased by such outliers.
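The moving-window depth classification just described can be sketched as follows. The window size (20 breaths), the tolerance band and the outlier cut-off are illustrative assumptions; the text above leaves the exact percentages open.

```python
from collections import deque

class BreathDepthClassifier:
    """Moving window of the last N breath depths; each new breath is
    compared against the window average and labelled accordingly."""

    def __init__(self, window=20, tolerance=0.25, outlier=1.0):
        self.depths = deque(maxlen=window)  # moving window of depths
        self.tolerance = tolerance          # fractional band around the average
        self.outlier = outlier              # beyond this, exclude from average

    def classify(self, depth):
        if not self.depths:
            self.depths.append(depth)
            return "normal"
        avg = sum(self.depths) / len(self.depths)
        deviation = (depth - avg) / avg
        if deviation > self.tolerance:
            label = "deep"
        elif deviation < -self.tolerance:
            label = "shallow"
        else:
            label = "normal"
        # Extreme outliers are not folded into the average, so a single
        # anomalous breath does not bias later classifications.
        if abs(deviation) <= self.outlier:
            self.depths.append(depth)
        return label
```

Because `deque(maxlen=window)` discards the oldest depth automatically, appending each accepted breath implements the sliding window without extra bookkeeping.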
  • Exertion score - e.g. how hard the user is exercising or breathing, based on the generated breathing pattern (relative to, e.g., historical breathing patterns/characteristics collected during past exercise sessions)
  • the Exertion Score may be used to measure the intensity of a user's exercise activity.
  • the Exertion Score may be a numerical score or value ranging from 0 to 10.
  • the table below provides example scores and their meaning:
  • the first example technique in Table 2 above for generating an exertion score can be performed in real-time, i.e. synchronously with the sensor data collection.
  • the first example technique determines the exertion score (ES) by multiplying the breaths per minute (BPM), as determined from the sensor data, with a rate of increase of the signal from the sensor when a user exhales on the sensor (OutSpeed). This is multiplied by a constant (Scaling) to generate an exertion score that is in a range between, for example, 0 to 100 or 0 to 10 (as in Table 1 above).
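A sketch of this first technique follows. The `scaling` constant here is an arbitrary placeholder chosen so that typical BPM × OutSpeed products land in the 0-10 range; the real constant depends on the sensor's signal units and is not given in this excerpt.

```python
def exertion_score(bpm, out_speed, scaling=0.01):
    """ES = BPM * OutSpeed * Scaling, clamped into the 0-10 reporting
    range of Table 1. `scaling` is an illustrative placeholder."""
    return max(0.0, min(10.0, bpm * out_speed * scaling))

# e.g. 30 breaths/min with an exhalation signal rise of 20 units/s
score = exertion_score(30, 20)
```

Because both inputs are available sample by sample, this score can be computed synchronously, as the text notes.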
  • the second example technique in Table 2 above for generating an exertion score can be performed both synchronously and asynchronously.
  • the second example is an extension of the first example technique in the table above.
  • the second technique determines the exertion score (ES) by applying to the breaths per minute (BPM) as determined from the sensor data, a function f(BPM) to generate an exertion score that is in a range between 0 to 10.
  • Figures 4A and 4B show, respectively, data used to determine a technique for generating an exertion score from sensor data in real-time (synchronously) or asynchronously.
  • Figure 4A shows breaths per minute against exertion score, as the f(BPM) scales the BPM between 0 and 10.
  • a breaths per minute value of 20 corresponds to an exertion score of 1 (little exertion), while a breaths per minute value of 50 corresponds to an exertion score of 8. From this, various models were trialled to determine which model best fit the data in the graph.
  • a Gaussian model was chosen, and the constants in Table 2 above (for technique 2) were thus obtained by fitting the Gaussian model to the data.
  • the function is termed f(BPM).
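The shape of f(BPM) can be illustrated as below. The constants here are NOT those of Table 2 (which is not reproduced in this excerpt); they were fitted so that f(20) ≈ 1 and f(50) ≈ 8, matching the Figure 4A data points described above. Note that a Gaussian only increases up to its centre (here around 64 BPM), so this illustrative fit would plateau and fall for still higher rates.

```python
import math

def f_bpm(bpm, a=10.0, mu=63.6, sigma=20.3):
    """Gaussian mapping from breaths per minute to a 0-10 exertion
    score: f(BPM) = a * exp(-(BPM - mu)^2 / (2 * sigma^2)).

    The constants are illustrative assumptions fitted to the two data
    points described for Figure 4A (20 BPM -> ~1, 50 BPM -> ~8), not
    the constants of Table 2.
    """
    return a * math.exp(-((bpm - mu) ** 2) / (2 * sigma ** 2))
```

Unlike technique 1, this mapping needs only a BPM value, so it can be applied either as BPM values arrive (synchronously) or to stored data afterwards (asynchronously).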
  • the third technique in Table 2 above generates an exertion score by normalising breaths per minute data obtained throughout a workout using the highest BPM value.
  • the normalised value is then input into the equation shown in the table above to calculate the exertion score. This process is described by Nicolo et al. in "Respiratory Frequency during Exercise: The Neglected Physiological Measure" (Frontiers in Physiology, December 2017, Vol. 8, Article 922).
  • the fourth technique in Table 2 above can be used to generate an exertion score from sensor data asynchronously.
  • Figure 4C shows data relating to this technique. Specifically, the top graph shows a distribution plot of how much time a user spends at each BPM. High BPMs of 70 and 80 appear on the distribution plot because a window of 10 seconds is being used: a person can easily breathe at what would be 70 BPM for 10 seconds, but maintaining this rate for 60 seconds or more would be much harder.
  • This means the scaling function, which is a function of the duration spent at each BPM, is also a function of the window size for which BPMs have been calculated.
  • the scaling coefficients may be calculated manually, or preferably, may be calculated using a machine learning model which has been trained on breathing data from many people who also provide their perceived exertion score.
  • the BPM distribution is then scaled, where higher BPMs are scaled up and lower BPMs are scaled down (as described above).
  • the distribution profile is replotted, as shown in Figure 4C (bottom figures).
  • the scaling function may be called f(BPM_duration, windowSize).
  • the area under the scaled distribution profile is then calculated as the exertion score. This area is determined by calculating sum(f(BPM_duration, windowSize)*BPM_duration). For intense workouts, the area may reach values of 90-100 (which is then scaled by a further 1/10 to lie in the range 0 to 10 as per Table 1 above), while for non-workouts the exertion score may fall below 0 due to the negative scaling.
  • Negative scaling means that a user's resting periods during a workout, for example at around 10 to 15 BPM for some people, are subtracted from the total exertion score, so that resting too long during exercise lowers the score. If the overall exertion score is negative it may be capped to zero; this may occur if an activity was recorded while taking a simple walk or sitting at a desk.
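The fourth technique, including the negative scaling of resting bins and the cap at zero, might be sketched as follows. The example distribution, the scaling coefficients, and their crossover point are all illustrative assumptions; as noted above, in practice the coefficients may come from a trained machine learning model.

```python
def exertion_from_distribution(durations, scale):
    """Technique 4: asynchronous exertion score from a BPM-duration
    distribution.

    `durations` maps BPM -> time spent at that rate (seconds);
    `scale` maps BPM -> a scaling coefficient standing in for
    f(BPM_duration, windowSize). Low-BPM (resting) bins carry negative
    coefficients, so long rests subtract from the score, and the final
    score is capped at zero from below.
    """
    area = sum(scale(bpm) * dur for bpm, dur in durations.items())
    return max(0.0, area / 10.0)  # further scaled by 1/10 into 0-10

def example_scale(bpm):
    # Hypothetical coefficients: negative at or below ~15 BPM
    # (resting), increasing linearly for higher rates.
    if bpm <= 15:
        return -0.01
    return 0.002 * (bpm - 15)

# Hypothetical workout: BPM bin -> seconds spent at that rate
workout = {12: 120, 30: 600, 45: 300, 60: 60}
print(exertion_from_distribution(workout, example_scale))
```

A session spent entirely at resting rates produces a negative area, which the cap turns into an exertion score of zero, matching the walking/desk-sitting case described above.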
  • FIG. 5 shows a flowchart of example steps to determine an exertion score.
  • the process may begin by receiving breathing characteristic data after an exercise session has been completed (step S500).
  • a distribution profile may be calculated using the breathing characteristic data (step S502).
  • An algorithm or technique, such as one described above, may be used to determine a scaling factor (step S504).
  • the scaling factor may be used to scale the distribution profile, as described above, and the scaled distribution profile may be used to determine an exertion score (step S506).
  • the exertion score may be transmitted to the user or to a clinician or personal trainer, for example.

Abstract

In general terms, embodiments of the present invention relate to a health monitoring system which comprises a sensor for detecting a user's breathing and a processor for smoothing the received sensor data to generate a breathing pattern.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1912815.6 2019-09-05
GB1912815.6A GB2586848A (en) 2019-09-05 2019-09-05 Systems and methods for analysing breathing

Publications (1)

Publication Number Publication Date
WO2021044150A1 2021-03-11




Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NICOLO ET AL.: "Respiratory Frequency during Exercise: The Neglected Physiological Measure", FRONTIERS IN PHYSIOLOGY, vol. 8, article 922, December 2017 (2017-12-01)


Also Published As

Publication number Publication date
GB2586848A (en) 2021-03-10
GB201912815D0 (en) 2019-10-23


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (ref document number 20768669, country EP, kind code A1)
NENP Non-entry into the national phase (ref country code DE)
122 Ep: pct application non-entry in european phase (ref document number 20768669, country EP, kind code A1)