GB2586848A - Systems and methods for analysing breathing - Google Patents
Systems and methods for analysing breathing
- Publication number
- GB2586848A (application GB1912815.6A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- breathing
- user
- monitoring system
- health monitoring
- classified
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0803—Recording apparatus specially adapted therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Physiology (AREA)
- Signal Processing (AREA)
- Pulmonology (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Psychiatry (AREA)
- Computer Networks & Wireless Communication (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
System 100 comprises apparatus 102 including sensor 104 monitoring a user's breathing, such as a thermistor, humidity or pressure sensor, or microphone. A remote processor 123, for example in a smart phone, may receive the sensed data and smooth it to generate a breathing pattern. A user interface, such as at a nurses' station in a hospital, may display related information. The apparatus may be an inhaler, e-cigarette or cannula, or be wearable such as a sports, oxygen or surgical mask. The smoothing may include classifying local maxima or minima and comparing the distance or time between adjacent points with a threshold, removing at least one point. Breathing characteristics such as inhalation speed or breathing rate may be determined. The system may enable a user to monitor their health or fitness and whether an exercise regime is effective, or may provide information on e.g. lung capacity or ongoing chronic conditions e.g. COPD.
Description
Systems and Methods for Analysing Breathing
The present techniques generally relate to systems, apparatus and methods for monitoring health, and in particular relate to monitoring health by sensing and analysing breathing.
Breathing is difficult to measure, but breathing rate has been described as one of the most sensitive and important indicators of the deterioration of patient health. However, generally speaking, in hospitals breathing rate is monitored by occasional visual assessment, e.g. by observing the rise and fall of a patient's chest for 30 seconds every 12 hours. This approach is time-consuming, qualitative and highly prone to human error, and in some medical cases a failure to observe breathing rate, and changes in breathing rate, has led to avoidable patient death.
Therefore, there is a desire to provide an improved system and method for monitoring health by sensing and analysing breathing.
In a first approach of the present techniques, there is provided a health monitoring system comprising: an apparatus comprising: a sensor for sensing breathing of a user using the apparatus, and a communication module for transmitting sensor data; and at least one remote processor for: receiving sensor data from the apparatus; and smoothing the received sensor data to generate a breathing pattern. The sensor data collected by the sensor may be noisy and may need to be processed (i.e. smoothed) in order to generate a breathing pattern that accurately reflects the user's breathing.
Once the breathing pattern has been generated, at least one breathing characteristic may be determined or derived from the breathing pattern.
The at least one remote processor may be located in any component in the system that is remote to the apparatus used/worn by the user. Multiple remote processors may be used to perform the smoothing of the sensor data, which may be located in the same or different components in the system. For example, the at least one remote processor may be in a user device and/or in a remote server.
In some cases, one or more of the steps to smooth the sensor data may be performed by the processor(s) in the user device, and one or more steps may be performed by the processor(s) of the remote server.
The at least one remote processor may determine an indication of the health of the user from the at least one breathing characteristic.
The health monitoring system may be used in a variety of contexts. For example, the health monitoring system may be used by a user to monitor their own health. A user may monitor their breathing during exercise, in which case the indication of the health of the user may comprise information on the user's fitness. The indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example. The fitness information could be used by a personal trainer to devise or modify an exercise regime for the user.
In another example, a user may monitor their breathing while resting or stationary to determine their health, lung capacity or lung health. In this case, the indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months). This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following an illness, a respiratory illness, disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes), or to monitor the health of a user with a chronic condition such as cystic fibrosis or chronic obstructive pulmonary disease (COPD).
In another example, a user admitted to a hospital may wear/use the apparatus so that doctors and nurses in the hospital may monitor the user's breathing more regularly, remotely and without human involvement. This advantageously increases the chances that changes in the user's breathing are identified and actioned early, and reduces the risk of human error.
The system may further comprise at least one user interface for displaying the indication of the health of the user. The user interface may be provided in any suitable manner or on any suitable device. For example, if the system is being used by a user to monitor themselves, the user interface may be the display screen of an electronic device used by the user, such as a computer or smartphone. If the system is being used by a hospital to monitor patient health, the user interface may be the display screen on hospital equipment, such as a computer at a nurses' station or a tablet holding an electronic patient record.
Thus, the at least one user interface may be on a user device, and the indication of the health of the user may comprise information on the user's fitness. Additionally or alternatively, the at least one user interface may be on a device in a hospital, and the indication of the health of the user may comprise a warning that the user's health is deteriorating. This may advantageously enable hospital staff to take action sooner than if breathing is monitored by infrequent observation.
As mentioned above, on a general ward in a hospital, breathing rate may be monitored through visual assessment every 12 hours, while in intensive care units (ICUs), specialist capnography devices may be used to monitor the concentration or volume of carbon dioxide exhaled by a patient. Electronic techniques for measuring breathing are limited to piezoelectric sensors that measure chest wall movement, impedance measurements that measure changes in chest conductivity during inhalation and exhalation, and microphones to detect the sound of the lungs expanding and contracting. However, these techniques suffer from a low signal-to-noise ratio and may require significant filtering to accurately determine breathing rate.
In the present techniques, the remote processor(s) may smooth the received sensor data to generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum or a local minimum. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining if a data point N is greater or lesser in value than a data point that is immediately before data point N (i.e. N-1) and a data point that is immediately after data point N (i.e. N+1). In the case where data point N is greater than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local maximum (or peak). In the case where data point N is lesser than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local minimum (or trough). Each inflection point that has been identified and classified may be saved in storage.
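By way of illustration only, the following Python sketch shows one way the peak detection step described above might be implemented. The function name, the representation of the sensor data as a list of (time, value) samples, and the example values are assumptions made for illustration and are not taken from the patent.

```python
# Illustrative sketch only; the data representation and names are assumed.

def detect_inflection_points(samples):
    """Classify each interior sample as a peak ('max') or trough ('min') by
    comparing it with the samples immediately before and after it."""
    points = []
    for n in range(1, len(samples) - 1):
        prev_value = samples[n - 1][1]
        value = samples[n][1]
        next_value = samples[n + 1][1]
        if value > prev_value and value > next_value:
            points.append((n, "max"))   # local maximum (peak)
        elif value < prev_value and value < next_value:
            points.append((n, "min"))   # local minimum (trough)
    return points

# Example: hypothetical digitised sensor readings as (time in seconds, value)
samples = [(0.0, 5000), (0.5, 19000), (1.0, 4000), (1.5, 18000), (2.0, 3000)]
print(detect_inflection_points(samples))  # [(1, 'max'), (2, 'min'), (3, 'max')]
```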
The at least one remote processor may determine whether each identified inflection point is indicative of a breathing pattern or of noise. If an inflection point is indicative of noise, it needs to be removed or ignored in order to smooth the sensor data and generate a breathing pattern that accurately reflects the user's breathing. For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called "peak prominence detection". Thus, the remote processor(s) may determine whether a distance between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. the distance between a peak and a trough), is above a threshold distance. If the distance is below the threshold distance, the remote processor(s) may remove both of the two adjacent inflection points. The threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath. For example, in cases where analogue sensor values (e.g. voltages) have been converted into digital values, the threshold distance may be 1000. More specifically, if a sensor's values range from 0V to 3.3V, the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000. In other words, the distance between a successive peak and trough must be more than 1000. This may be a reasonable threshold distance, since in a normal breathing pattern successive peaks and troughs are usually separated by around 10,000, and even if a user takes shallow breaths, the distance is more than 1000.
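A minimal sketch of the "peak prominence detection" pass described above, assuming the (index, 'max'/'min') point list produced by the earlier peak-detection sketch and a parallel list of digitised sensor values. The threshold of 1000 reuses the example figure given in the text; all names are illustrative assumptions, not the patent's implementation.

```python
def filter_low_prominence(points, values, threshold=1000):
    """Remove adjacent peak/trough pairs whose amplitude difference falls
    below the threshold, repeating until no such pair remains."""
    kept = list(points)
    changed = True
    while changed:
        changed = False
        for i in range(len(kept) - 1):
            (idx_a, kind_a), (idx_b, kind_b) = kept[i], kept[i + 1]
            if kind_a != kind_b and abs(values[idx_a] - values[idx_b]) < threshold:
                del kept[i:i + 2]   # drop both the peak and the trough
                changed = True
                break
    return kept
```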
The remote processor(s) may also consider whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern, as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote processor(s) may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time. When the time is less than the predetermined time, one of the two inflection points may be removed by the remote processor(s) so that it is not used to generate the breathing pattern. This process may be called "peak separation analysis" or "peak distance analysis". However, it is important to choose an appropriate predetermined time so as not to lose real breathing pattern data. For example, if a user is breathing rapidly or hyperventilating, then the user's breaths may be naturally close together. The predetermined time may be 0.6 seconds, which would equate to 100 breaths per minute, or 0.7 seconds, which would equate to 86 breaths per minute. Such high breathing rates never occur in humans, and therefore these values may be sufficient to remove peaks that are not representative of a real breathing pattern while still capturing rapid breathing/hyperventilation/tachypnoea.
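A sketch of the "peak separation analysis" described above, again using the assumed data structures from the previous sketches; the 0.6 second default mirrors the example value in the text. Where two successive points of the same kind arrive too close in time, the later one is dropped.

```python
def filter_close_same_kind(points, times, min_separation=0.6):
    """Drop a peak (or trough) that follows the previously kept peak (or
    trough) by less than min_separation seconds."""
    kept = []
    last_time = {}   # 'max'/'min' -> time of the last kept point of that kind
    for idx, kind in points:
        t = times[idx]
        if kind in last_time and (t - last_time[kind]) < min_separation:
            continue   # too soon after the previous peak/trough: treat as noise
        kept.append((idx, kind))
        last_time[kind] = t
    return kept
```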
A breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation. Thus, if two adjacent inflection points are both classified as a local maximum or as a local minimum, such that there are two peaks next to each other without a trough in-between, or two troughs without a peak in-between, then the sensor data does not represent an alternating sequence of inhalation and exhalation. Accordingly, the remote processor(s) may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points.
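A sketch of this final alternation check, keeping the higher of two consecutive peaks or the deeper of two consecutive troughs (as the detailed description later suggests); the choice of which point to keep and all names are assumptions for illustration.

```python
def enforce_alternation(points, values):
    """Ensure the sequence alternates peak/trough by collapsing runs of
    same-kind points to a single representative point."""
    kept = []
    for idx, kind in points:
        if kept and kept[-1][1] == kind:
            prev_idx = kept[-1][0]
            higher_peak = kind == "max" and values[idx] > values[prev_idx]
            deeper_trough = kind == "min" and values[idx] < values[prev_idx]
            if higher_peak or deeper_trough:
                kept[-1] = (idx, kind)   # replace the weaker duplicate
            continue
        kept.append((idx, kind))
    return kept
```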
Once a breathing pattern has been generated, the remote processor(s) may determine at least one breathing characteristic from the generated breathing pattern. There are a number of breathing characteristics that may be derived from the breathing pattern, and which may be used to provide feedback on the user's health or fitness. For example, the at least one breathing characteristic may be any one or more of: inhalation speed, exhalation speed, inhalation to exhalation ratio, number of breaths per minute (which could be used to detect hyperventilation, hypocapnia, hypoventilation, hypercapnia, etc.), average breathing rate when wearing a resistive sports mask (which may depend on the restriction level of the resistive sports mask), exertion score, and depth or volume of inhalation or exhalation (which may be indicative of lung capacity or fitness).
When the breathing characteristic is inhalation speed, the remote processor(s) may determine the inhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum; and dividing a distance between the inflection point classified as a local maximum and the subsequent inflection point classified as a local minimum by the measured time. In the example data shown in Figures 3A to 3D, the sensor measures conductivity as a function of time. The conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out. Thus, the decrease in conductivity over time between a peak and a trough may be indicative of a decrease in humidity over time as the user takes a breath (i.e. the inhalation speed).
When the breathing characteristic is exhalation speed, the remote processor(s) may determine the exhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum; and dividing a distance between the inflection point classified as a local minimum and the subsequent inflection point classified as a local maximum by the measured time. In the example data shown in Figures 3A to 3D, the sensor measures conductivity as a function of time. The conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out. Thus, the increase in conductivity over time between a trough and a peak may be indicative of an increase in humidity over time as the user exhales (i.e. the exhalation speed).
When the breathing characteristic is a ratio of inhalation to exhalation, the remote processor(s) may determine the ratio from the breathing pattern by: dividing the inhalation speed by the exhalation speed.
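By way of example, the inhalation speed, exhalation speed and their ratio described in the preceding paragraphs could be computed from an alternating peak/trough sequence as sketched below. The mapping of peak-to-trough segments to inhalation follows the humidity/conductivity example given above; for other sensor types the mapping may be reversed, and all names and structures are illustrative assumptions.

```python
def breathing_speeds(points, times, values):
    """Return (mean inhalation speed, mean exhalation speed, their ratio)
    from an alternating list of (index, 'max'/'min') points, or None if
    either kind of segment is missing."""
    inhalation, exhalation = [], []
    for (idx_a, kind_a), (idx_b, kind_b) in zip(points, points[1:]):
        dt = times[idx_b] - times[idx_a]
        dv = abs(values[idx_a] - values[idx_b])
        if dt <= 0:
            continue
        if kind_a == "max" and kind_b == "min":      # peak -> trough: inhalation
            inhalation.append(dv / dt)
        elif kind_a == "min" and kind_b == "max":    # trough -> peak: exhalation
            exhalation.append(dv / dt)
    if not inhalation or not exhalation:
        return None
    mean_in = sum(inhalation) / len(inhalation)
    mean_ex = sum(exhalation) / len(exhalation)
    return mean_in, mean_ex, mean_in / mean_ex
```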
When the breathing characteristic is a breathing rate, the remote processor(s) may determine the breathing rate from the breathing pattern by: determining the number of inflection points classified as a local maximum in a minute. Determining the number of inflection points classified as a local maximum in a minute may comprise determining, when the breathing pattern is longer than a minute, an average number of inflection points classified as a local maximum in a minute. Alternatively, determining the number of inflection points classified as a local maximum in a minute may comprise extrapolating, when the breathing pattern is less than a minute, the breathing rate based on the number of inflection points in a duration of the breathing pattern.
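A sketch of the breathing-rate calculation described above. Counting peaks and scaling by the recording duration covers both the averaging case (recordings longer than a minute) and the extrapolation case (shorter recordings); names and the data representation are assumptions.

```python
def breathing_rate(points, times):
    """Breaths per minute, estimated from the number of peaks over the
    duration of the breathing pattern."""
    peaks = [idx for idx, kind in points if kind == "max"]
    duration_s = times[-1] - times[0] if times else 0.0
    if duration_s <= 0 or not peaks:
        return 0.0
    # Scaling to 60 s both averages (duration > 1 min) and extrapolates
    # (duration < 1 min) the peak count.
    return len(peaks) * 60.0 / duration_s
```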
When the breathing characteristic is inhalation depth, the remote processor(s) may determine the inhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum.
When the breathing characteristic is exhalation depth, the remote processor(s) may determine the exhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum.
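The inhalation and exhalation depths described in the two preceding paragraphs could be estimated as sketched below, by averaging the peak-to-trough and trough-to-peak amplitude differences; again the names and structures are illustrative assumptions.

```python
def breathing_depths(points, values):
    """Return (mean inhalation depth, mean exhalation depth) from an
    alternating list of (index, 'max'/'min') points."""
    inhale, exhale = [], []
    for (idx_a, kind_a), (idx_b, kind_b) in zip(points, points[1:]):
        dv = abs(values[idx_a] - values[idx_b])
        if kind_a == "max" and kind_b == "min":
            inhale.append(dv)    # peak -> trough
        elif kind_a == "min" and kind_b == "max":
            exhale.append(dv)    # trough -> peak
    mean_inhale = sum(inhale) / len(inhale) if inhale else 0.0
    mean_exhale = sum(exhale) / len(exhale) if exhale else 0.0
    return mean_inhale, mean_exhale
```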
The at least one user interface may display information in addition to the breathing characteristic or indication of the health or fitness of the user. For example, the user interface may display one or more of: a total number of hours the user has worn the apparatus, an exertion score, an indication of the user's lung function, and information on whether the sensor needs to be replaced.
The remote processor(s) may be arranged to: compare sensor data received from the apparatus over a predetermined time period; and determine whether the accuracy of the sensor has changed over the predetermined time period. Thus, the remote server may be able to identify any changes in the sensitivity or accuracy of the sensor over time, by considering whether, for example, the maximum and minimum values sensed by the sensor have changed over time. The remote server may be able to send a message to a user device to indicate to the user (or to a hospital staff member or administrator) that the sensor needs to be replaced.
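The text above does not specify how the change in sensor accuracy is assessed; one simple possibility, sketched here purely as an assumption, is to compare the dynamic range (maximum minus minimum reading) seen in an older recording window against a recent one and flag the sensor when that range has shrunk noticeably.

```python
def sensor_needs_replacement(old_window, new_window, tolerance=0.1):
    """Hypothetical drift check: flag the sensor if its observed dynamic
    range has shrunk by more than `tolerance` (a fraction) between windows."""
    old_range = max(old_window) - min(old_window)
    new_range = max(new_window) - min(new_window)
    if old_range == 0:
        return False
    return (old_range - new_range) / old_range > tolerance
```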
The apparatus may further comprise an accelerometer to sense movement of the user while the user is wearing the apparatus. The accelerometer data may be transmitted to the remote server along with the sensor data. The accelerometer data may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic. For example, the accelerometer data may be mapped to or matched to the generated breathing pattern, which may enable the user's health or fitness while sedentary, walking and/or exercising to be analysed. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance. Thus, the remote processor(s) may use data from the accelerometer to generate the breathing pattern and determine the at least one breathing characteristic.
The remote server may use additional input data to generate the breathing pattern and determine the at least one breathing characteristic. For example, the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, and oxygen level. The additional input data may be obtained by the remote server from external sources or third party sources. For example, the weather data may be obtained from a national weather service provider (such as the Met Office in the UK), while altitude data may be obtained from a map provider.
The remote processor(s) may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately.
The geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate or taking deeper breaths. However, by knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. However, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
The sensor of the apparatus may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor/detector, and a sensor comprising a porous material. It would be understood that this is an example, non-exhaustive list of possible sensors that are suitable for sensing breathing. It will also be understood that an apparatus may comprise more than one sensor, and that the sensors may be the same or different.
The apparatus may be any one of: a wearable apparatus, a resistive sports mask, an oxygen deprivation mask, an apparatus worn over a user's mouth and/or nose, a medical breath monitoring apparatus, a face mask, a surgical mask, an oxygen mask, an inhaler, an asthma inhaler, an e-cigarette, a heat moisture exchanger, and a nasal cannula. It would be understood that this is an example, non-exhaustive list of possible types of apparatus that could be used to sense breathing and monitor user health.
In a second approach of the present techniques, there is a method for health monitoring, comprising: sensing, using an apparatus, breathing of a user wearing the apparatus; generating a breathing pattern from the sensed data; and determining from the breathing pattern at least one breathing characteristic.
In a related approach of the present techniques, there is provided a (non-transitory) computer readable medium carrying processor control code which when implemented in a system causes the system to carry out any of the methods, processes and techniques described herein.
Preferred features are set out below and apply equally to each approach of the present techniques.
As will be appreciated by one skilled in the art, the present techniques may be embodied as a system, method or computer program product. Accordingly, present techniques may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
Furthermore, the present techniques may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present techniques may be written in any combination of one or more programming languages, including object oriented programming languages and conventional procedural programming languages. Code components may be embodied as procedures, methods or the like, and may comprise sub-components which may take the form of instructions or sequences of instructions at any of the levels of abstraction, from the direct machine instructions of a native instruction set to high-level compiled or interpreted language constructs.
Embodiments of the present techniques also provide a non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out any of the methods described herein.
The techniques further provide processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP). The techniques also provide a carrier carrying processor control code to, when running, implement any of the above methods, in particular on a non-transitory data carrier. The code may be provided on a carrier such as a disk, a microprocessor, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier. Code (and/or data) to implement embodiments of the techniques described herein may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (RTM) or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, such code and/or data may be distributed between a plurality of coupled components in communication with one another. The techniques may comprise a controller which includes a microprocessor, working memory and program memory coupled to one or more of the components of the system.
It will also be clear to one of skill in the art that all or part of a logical method according to embodiments of the present techniques may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the above-described methods, and that such logic elements may comprise components such as logic gates in, for example a programmable logic array or application-specific integrated circuit. Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media.
In an embodiment, the present techniques may be implemented using multiple processors or control circuits. The present techniques may be adapted to run on, or integrated into, the operating system of an apparatus.
In an embodiment, the present techniques may be realised in the form of a data carrier having functional data thereon, said functional data comprising functional computer data structures to, when loaded into a computer system or network and operated upon thereby, enable said computer system to perform all the steps of the above-described method.
Implementations of the present techniques will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a schematic diagram of a health monitoring system;
Figure 2 shows a flowchart of example steps performed by the health monitoring system of Figure 1;
Figure 3A shows example sensor data sensed by the health monitoring system of Figure 1 after peak detection has been performed;
Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed;
Figure 3C shows the sensor data of Figure 3B after peak separation analysis has been performed; and
Figure 3D shows an example generated breathing pattern.
Broadly speaking, embodiments of the present techniques provide a health monitoring system which uses user breathing data to determine a user's or patient's health.
Figure 1 shows a schematic diagram of a health monitoring system 100.
The system 100 comprises an apparatus 102 and a remote processor(s) 120. The apparatus 102 may be a wireless apparatus or a wired apparatus. In other words, the apparatus 102 may be capable of wirelessly transmitting data to another device, or may need a wired connection to transmit data to another device. The apparatus 102 may comprise at least one sensor 104 for sensing breathing of a user wearing the apparatus 102. The at least one sensor 104 may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor, and a sensor comprising a porous material. An example of a sensor comprising a porous material can be found in International Patent Publication No. WO2016065180. It will be understood that this is an example, non-exhaustive list of possible sensors that are suitable for sensing breathing. It will also be understood that an apparatus may comprise more than one sensor, and that the sensors may be the same or different.
The apparatus 102 may comprise a communication module 106 for transmitting sensor data. The data collected by the sensor 104 may be transmitted to an external device or server for storage and analysis. This may be advantageous because the apparatus 102 may not have the processing power or capacity to analyse the data, and/or the storage capacity to store large quantities of data. The data collected by the sensor 104 may be transmitted periodically to an external device/server, such as every second, or every few seconds, or at any suitable frequency such as, but not limited to, 12.5Hz or 20Hz. Alternatively, data collected by the sensor 104 may be transmitted at longer intervals or at irregular times in certain circumstances. For example, if the apparatus 102 is not within range to be able to communicate with an external device (e.g. is not within the range for Bluetooth (RTM) or WiFi communication), the communication module 106 may not transmit any sensor data until the external device is determined to be within range. The apparatus 102 may therefore have storage or memory 132 to temporarily store sensor data collected by the sensor 104 when real-time communication to an external device is not possible. The storage 132 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory. The storage 132 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data, programs, or instructions, for example.
The data collected by the sensor 104 may be communicated/transmitted to the remote server 120 directly, or via an intermediate device. Communicating the sensor data to an intermediate device may be preferable due to the communication capability of the communication module 106. For example, if the communication module 106 is only capable of short range communication, then the sensor data may need to be transmitted to an intermediate device which is capable of transmitting the sensor data to the remote server. In some cases, the communication module 106 may be able to communicate directly with the remote server. In some cases, the intermediate device may process the sensor data into a format or into processed data that the remote server can handle.
The communication module 106 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6LoWPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow. The communication module 106 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc. The communication module 106 may use a wired communication technique to transfer sensor data to an intermediate/external device, such as via metal cables (e.g. a USB cable) or fibre optic cables. The communication module 106 may use more than one communication technique to communicate with other components in the system 100.
The apparatus 102 may comprise a processor or processing circuitry 108. The processor 108 may control various processing operations performed by the apparatus 102, such as communicating with other components in system 100. In some cases, the processor 108 of the apparatus 102 may simply control the operation of the sensor 104, communication module 106 and storage 132. In other cases, the processor 108 may have some further processing capability. For example, the processor 108 may comprise processing logic to process data (e.g. the sensor data collected by sensor 104), and generate output data/signals/messages in response to the processing. The processor 108 may be able to compress the sensor data, for example, to reduce the size of the data that is being transmitted to another device. The processor 108 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
The apparatus 102 may optionally comprise an accelerometer 132 to sense movement of the user while the user is wearing the apparatus 102. The accelerometer data may be transmitted to an external device along with the sensor data.
Apparatus 102 may optionally comprise an interface 138 for providing feedback on a user's breathing. For example, the interface 138 may be one or more LEDs or other lights which may turn on and off according to the generated breathing pattern. This may provide a visual indicator to a user of the apparatus or a third party (e.g. a doctor or personal trainer) of the generated breathing pattern.
As mentioned above, the system 100 may comprise a remote server 120 for performing one or more of the steps involved in smoothing the sensor data received from the apparatus 102. Thus, the apparatus 102 may transmit sensor data to the remote server 120. The remote server 120 may then generate a breathing pattern from the sensor data, and determine from the breathing pattern at least one breathing characteristic. The remote server 120 may comprise at least one processor 123 and storage 122. Storage 122 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory. The storage 122 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data (such as the sensor data received from the apparatus 102), programs, or instructions, for example.
The system 100 may comprise a user device 110. The user device 110 may be any type of electronic device, such as, for example, a smartphone, a mobile computing device, a laptop, tablet or desktop computer, or a mobile or portable electronic device. The user device 110 may be a dedicated user device that is specifically for use with the apparatus 102. Alternatively, the user device 110 may be a non-dedicated user device, such as a smartphone. In either case, the user device 110 may comprise a software application ('app') 112 which is associated with the system 100. The app 112 may be launched or run when the user puts on the apparatus 102. For example, when the user is about to begin exercising, the user may put on the apparatus 102 and run the app 112. The app 112 may comprise a 'record' or 'start' function, which the user may press/engage when they want to start measuring their breathing using the apparatus 102. The app 112 may communicate with the apparatus 102 to instruct the sensor 104 to begin sensing and/or to instruct the communication module 106 to begin transmitting sensor data. Additionally or alternatively, when the user presses 'record' or 'start' on the app 112, the app 112 may prepare to receive sensor data from the apparatus 102. The app 112 may display the sensor data as it is received from the apparatus 102. Additionally or alternatively, the app 112 may display the generated breathing pattern produced by the remote server 120.
The user device 110 may comprise a user interface 114 to display, for example, the app 112, sensor data, generated breathing pattern, and/or any other information. The user interface 114 may be the display screen of a smartphone for example.
The user device 110 may comprise a processor 116 to control various processing operations performed by the user device, such as communicating with other components in system 100. The processor 116 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
The user device 110 may comprise a communication module 118. The communication module 118 may receive the sensor data from the communication module 106 of the apparatus 102. The communication module 118 may be able to communicate with the remote server 120, i.e. to transmit the received sensor data to the remote server 120 for processing/analysis. The communication module 118 may be able to receive data from the remote server 120. For example, the communication module 118 may receive a generated breathing pattern in real-time, near real-time or after the sensing performed by sensor 104 has ended. The generated breathing pattern may be displayed on the user interface 114 (e.g. via app 112). That is, in some cases, sensor data may be transmitted from the apparatus 102 in real-time (e.g. every second as it is being sensed or at a frequency of 12.5Hz) to the user device 110. The user device 110 may transmit the sensor data to the remote server, and receive a generated breathing pattern back from the remote server 120, which the user device 110 may display. The user device 110 may also receive, for example, the at least one breathing characteristic from the remote server 120 as the characteristic is determined, and may also display the at least one breathing characteristic. It may be possible for the user of user device 110 to see the raw sensed data on the user device and see how, in real-time, the remote server is generating the breathing pattern from the raw data.
Thus, the communication module 118 may have at least the same communication capability as the communication module 106 of the apparatus 102, and the remote server 120. The communication module 118 may use the same or different communication techniques or protocols to communicate with the communication module 106 and remote server 120. The communication module 118 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6LoWPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow. The communication module 118 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc. The communication module 118 may use a wired communication technique to receive sensor data from the apparatus 102, such as via metal cables (e.g. a USB cable) or fibre optic cables. The communication module 118 may use more than one communication technique to communicate with other components in the system 100.
Figure 1 shows system 100 as having a single remote server 120. It will be understood that the system 100 may have multiple servers 120. One or more of the servers 120 may be used to collect, process and store data collected from multiple apparatuses 102. One or more of the servers 120 may be private servers or dedicated servers to ensure the sensor data is stored securely. For example, if apparatuses 102 are used in a hospital, it may be preferable for the sensor data to be collected, processed and stored by a dedicated server within the hospital, to ensure patient privacy and data confidentiality. In this case, the system 100 may comprise a router 128 for receiving sensor data from each apparatus 102 within the hospital and transmitting this to the dedicated server 120.
If the system 100 is being used by a hospital to monitor patient health, the system 100 may comprise a user interface 126 (e.g. a display screen) on hospital equipment 124, such as a computer at a nurses' station or a tablet holding an electronic patient record. This enables the generated breathing pattern to be displayed or recorded on a hospital device 124 rather than on a user device or on another device which may leave the hospital. This may again ensure that the patient data is kept secure within the hospital itself.
In some cases, raw sensor data collected by sensor 104 may be transmitted to the user device 110, and the user device 110 may transmit the sensor data to the remote server 120 for processing. Algorithms, code, routines or similar for smoothing the raw sensor data to generate a breathing pattern may be stored in the remote server 120 (e.g. in storage 122) and run by processor 123. The remote server 120 may also use the generated breathing pattern to determine one or more breathing characteristics, and the algorithms or techniques to determine the breathing characteristics may also be stored on the remote server 120. The results of the analysis (e.g. the breathing pattern and/or the breathing characteristics) may be transmitted by the remote server 120 back to the user device 110 for display via a user interface 114 (e.g. via an app on a display screen).
The remote server 120 may use additional input data 130 to generate the breathing pattern and determine the at least one breathing characteristic. For example, the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, and oxygen level. The additional input data 130 may be received or pulled in from public sources or websites, such as openweathermap.org.
The remote server 120 may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately. The geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate. However, by knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. However, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
The accelerometer data collected by accelerometer 132 in the apparatus 102 may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic. For example, the accelerometer data may be mapped to or matched to the generated breathing pattern by the remote server 120, which may enable analysis of the user's health or fitness while sedentary, walking and/or exercising to be determined. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance. Thus, the remote server 120 may use data from the accelerometer 132 to generate the breathing pattern and determine the at least one breathing characteristic.
Figure 2 shows a flowchart of example steps performed by the at least one remote processor of the health monitoring system 100. The method performed by the at least one remote processor may comprise receiving sensor data from an apparatus 102, the sensor data being the sensed breathing of a user wearing the apparatus 102 (step S100). At step S102, the remote processor(s) may smooth the sensor data to generate a breathing pattern.
Optionally, at step S104, the remote processor may determine at least one breathing characteristic from the breathing pattern.
Optionally, the remote processor may use the at least one breathing characteristic to determine an indication of user health (step S106). The indication of the health of the user may comprise information on the user's fitness. The indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example. The fitness information could be used by a personal trainer to devise or modify an exercise regime for the user. The indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months). This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following a respiratory illness, disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes). The indication of the health of the user may comprise information on whether the user's breathing has changed suddenly or unexpectedly (e.g. increase or decrease in breathing rate, or an increase or decrease in inhalation/exhalation depth or volume - e.g. deeper or shallower breaths). This may be useful in a hospital, as it may enable changes in the user's breathing to be identified and actioned early.
Thus, the remote processor may transmit data on the user's fitness (step S108). Alternatively, the remote processor may transmit a message to a hospital device 124 warning of the deteriorating health or condition of the user (step S110) if the user's breathing has changed suddenly.
Figure 3A shows example sensor data 300 sensed by the health monitoring system of Figure 1 after peak detection has been performed. The sensor data may be conductivity changes over time, but this may depend on the type of sensor 104 in the apparatus 102. The remote server may generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum 302 or a local minimum 304. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining if a data point N is greater or lesser in value than a data point that is immediately before data point N (i.e. N-1) and a data point that is immediately after data point N (i.e. N+1). In the case where data point N is greater than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local maximum (or peak). In the case where data point N is lesser than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local minimum (or trough). Each inflection point 302, 304 that has been identified and classified may be saved in storage 122.
The remote server 120 may determine whether each identified inflection point 302, 304 is indicative of a breathing pattern or of noise. If an inflection point is indicative of noise, it needs to be removed or ignored in order to generate a breathing pattern that accurately reflects the user's breathing. For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called "peak prominence detection". Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed. Thus, the remote server 120 may determine whether a distance 312 between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. the distance between a peak and a trough), is above a threshold distance. If the distance 312 is below the threshold distance, the remote server may remove both of the two adjacent inflection points. The threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath. For example, in cases where analogue sensor values (e.g. voltages) have been converted into digital values, the threshold distance may be 1000.
More specifically, if a sensor's values range from 0V to 3.3V, the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000. In other words, the distance between a successive peak and trough must be more than 1000. This may be a reasonable threshold distance, since in a normal breathing pattern successive peaks and troughs are usually separated by around 10,000, and even if a user takes shallow breaths, the distance is more than 1000. Thus, after performing peak prominence detection on sensor data 300, the peaks and troughs in region 314 of the sensor data have been determined to not be representative of breathing. As shown in Figure 3B, the peaks and troughs in region 314 are no longer marked/tagged as inflection points, and so will not be used to generate a breathing pattern.
The remote server may also consider whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern, as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote server may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time. When the time is less than the predetermined time, one of the two inflection points may be removed by the remote server so that it is not used to generate the breathing pattern. This process may be called "peak separation analysis" or "peak distance analysis". Figure 3C shows the sensor data of Figure 3B after peak separation analysis has been performed. Thus, after performing the peak separation analysis on the data in Figure 3B, the two adjacent troughs or local minima 304a, 304b are considered to be too close together and not representative of a real breathing pattern. Consequently, as shown in Figure 3C, point 304b is no longer marked/tagged as an inflection point, and so will not be used to generate a breathing pattern.
As mentioned above, it is important to choose an appropriate predetermined time so as not to lose real breathing pattern data. For example, if a user is breathing rapidly or hyperventilating, then the user's breaths may be naturally close together. The predetermined time may be 0.6 seconds, which would equate to 100 breaths per minute, or 0.7 seconds, which would equate to 86 breaths per minute. Such high breathing rates never occur in humans, and therefore these values may be sufficient to remove peaks that are not representative of a real breathing pattern while still capturing rapid breathing/hyperventilation.
A breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation. Thus, if two adjacent inflection points are both classified as a local maximum or as a local minimum, such that there are two peaks next to each other without a trough in-between, or two troughs without a peak in-between, then the sensor data does not represent an alternating sequence of inhalation and exhalation. Accordingly, the remote server may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points. For example, if the two consecutive inflection points are both classified as local maxima (peaks), the higher peak may be kept and the lower peak removed, whereas if the two consecutive inflection points are both classified as local minima (troughs), the larger trough may be kept, and the smaller trough removed. Figure 3D shows an example generated breathing pattern 350 after this process has been performed. In Figure 3C, two adjacent inflection points that were classified as local maxima 302a, 302b do not have a local minimum between them. Accordingly, as shown in Figure 3D, inflection point 302b is no longer marked/tagged as an inflection point. The resulting data 350 is the generated breathing pattern which shows a series of peaks and troughs that are representative of breathing (i.e. noise has been removed).
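Taken together, the four passes illustrated by Figures 3A to 3D could be chained as sketched below. This composition assumes the illustrative helper functions from the earlier sketches (detect_inflection_points, filter_low_prominence, filter_close_same_kind and enforce_alternation) are in scope, and the default thresholds simply reuse the example values given in the text; it is one possible ordering, not the patent's definitive implementation.

```python
def smooth_to_breathing_pattern(times, values,
                                prominence_threshold=1000,
                                min_separation=0.6):
    """Chain the smoothing passes: peak detection (Figure 3A), peak
    prominence detection (Figure 3B), peak separation analysis (Figure 3C)
    and alternation enforcement (Figure 3D)."""
    samples = list(zip(times, values))
    points = detect_inflection_points(samples)
    points = filter_low_prominence(points, values, prominence_threshold)
    points = filter_close_same_kind(points, times, min_separation)
    points = enforce_alternation(points, values)
    return points   # alternating peak/trough sequence, e.g. pattern 350
```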
As mentioned above, once a breathing pattern has been generated by smoothing the original sensor data, one or more breathing characteristics may be derived or determined. A number of breathing characteristics may be determined, such as, for example:
* Inhalation Rate
* Inhalation Speed (e.g. the amplitude of a peak minus the amplitude of a neighbouring/consecutive trough, divided by inhalation time)
* Inhalation Time (e.g. the time between a consecutive peak and a trough in the breathing pattern)
* Exhalation Rate
* Exhalation Speed
* Exhalation Time
* Breaths per minute
* Highest Breathing Rate
* Lowest Breathing Rate
Other characteristics relating to either the user activity or the sensor itself may be determined from the original sensor data and/or the breathing pattern, and/or other data may be collected from the apparatus 102 or user device 110, such as, for example:
* Average breathing rate with respect to:
  * User activity level (via, for example, accelerometer 132)
  * Resistance training levels of a resistance training mask worn by the user (where the level may be obtained directly from the apparatus or via a user input on an app on the user device 110). For example, it may be determined that a user takes more breaths when using a training mask at one resistance level compared to when they use the training mask set at another resistance level.
* Signal depth -which may be indicative of shallow breathing, deep breaths, etc. * Exertion score -e.g. how hard is the user exercising or breathing based on the generated breathing pattern (and relative to e.g. historical breathing patterns/characteristics collected during past exercise sessions) * hours trained -e.g. hours of exercise while wearing a resistance training mask, based on the sensor data sensing breathing * sensor lifetime, which may be determined by: * total number of breaths taken/recorded while the apparatus 102 is used * external factors * inactive time periods * inhalation/exhalation intervals * breathing power, which may be measured by the change in inhalation velocity over time 17) gradient of breathing pattern Those skilled in the art will appreciate that while the foregoing has described what is considered to be the best mode and where appropriate other modes of performing present techniques, the present techniques should not be limited to the specific configurations and methods disclosed in this description of the preferred embodiment. Those skilled in the art will recognise that present techniques have a broad range of applications, and that the embodiments may take a wide range of modifications without departing from any inventive concept as defined in the appended claims.
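As foreshadowed above, the following is a minimal, purely illustrative sketch of how a few of the listed characteristics might be computed from a generated breathing pattern. It assumes the same alternating list of (time, value, kind) tuples as in the earlier sketches and follows the convention used in claims 6 and 9 below (inhalation taken as the segment from a local maximum to the subsequent local minimum, breathing rate as the number of local maxima scaled to one minute); the function names are assumptions made for this example.

```python
def inhalation_metrics(pattern):
    """Per-breath inhalation time and inhalation speed, where an inhalation
    is taken as the segment from a peak to the subsequent trough."""
    metrics = []
    for (t0, v0, k0), (t1, v1, k1) in zip(pattern, pattern[1:]):
        if k0 == "max" and k1 == "min":
            inhalation_time = t1 - t0
            # Peak amplitude minus trough amplitude, divided by inhalation time.
            inhalation_speed = (v0 - v1) / inhalation_time
            metrics.append({"inhalation_time": inhalation_time,
                            "inhalation_speed": inhalation_speed})
    return metrics


def breaths_per_minute(pattern):
    """Breathing rate estimated by counting peaks and scaling the count to one
    minute (extrapolating when the pattern is shorter than a minute)."""
    peaks = sum(1 for _, _, kind in pattern if kind == "max")
    duration = pattern[-1][0] - pattern[0][0]
    return peaks * 60.0 / duration if duration > 0 else 0.0


pattern = [(0.0, 1.2, "max"), (1.4, 0.3, "min"), (3.1, 1.1, "max"), (4.6, 0.35, "min")]
print(inhalation_metrics(pattern))
print(breaths_per_minute(pattern))  # 2 peaks over 4.6 s ≈ 26 breaths per minute
```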
Claims (25)
- CLAIMS
- 1. A health monitoring system comprising: an apparatus comprising: a sensor for sensing breathing of a user using the apparatus, and a communication module for transmitting sensor data; and at least one remote processor for: receiving sensor data from the apparatus; and smoothing the received sensor data to generate a breathing pattern.
- 2. The health monitoring system as claimed in claim 1 wherein the at least one remote processor smooths the received sensor data by: identifying a plurality of inflection points in the sensor data; classifying each inflection point as a local maximum or a local minimum; determining whether a distance between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum, is above a threshold distance; and removing both of the two adjacent inflection points where the distance is below the threshold distance.
- 3. The health monitoring system as claimed in claim 2 wherein the at least one remote processor smooths the received sensor data by: determining whether a time between two successive inflection points each classified as a local maximum, or between two successive inflection points each classified as a local minimum, is less than a predetermined time; and removing, when the time is less than the predetermined time, one of the two inflection points.
- 4. The health monitoring system as claimed in claim 2 or 3 wherein the at least one remote processor smooths the received sensor data by: identifying consecutive inflection points that are both classified as a local maximum or as a local minimum; and removing one of the two consecutive inflection points.
- 5. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor is further configured to determine, from the breathing pattern, at least one breathing characteristic.
- 6. The health monitoring system as claimed in claim 5 wherein the breathing characteristic is inhalation speed, and the at least one remote processor determines the inhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum; and dividing a distance between the inflection point classified as a local maximum and the subsequent inflection point classified as a local minimum by the measured time.
- 7. The health monitoring system as claimed in claim 5 or 6 wherein the breathing characteristic is exhalation speed, and the at least one remote processor determines the exhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum; and dividing a distance between the inflection point classified as a local minimum and the subsequent inflection point classified as a local maximum by the measured time.
- 8. The health monitoring system as claimed in claim 7 wherein the breathing characteristic is a ratio of inhalation to exhalation, and the at least one remote processor determines the ratio from the breathing pattern by: dividing the inhalation speed by the exhalation speed.
- 9. The health monitoring system as claimed in any one of claims 5 to 8 wherein the breathing characteristic is a breathing rate, and the at least one remote processor determines the breathing rate from the breathing pattern by: determining the number of inflection points classified as a local maximum in a minute.
- 10. The health monitoring system as claimed in claim 9 wherein determining the number of inflection points classified as a local maximum in a minute comprises: determining, when the breathing pattern is longer than a minute, an average number of inflection points classified as a local maximum in a minute.
- 11. The health monitoring system as claimed in claim 9 wherein determining the number of inflection points classified as a local maximum in a minute comprises: extrapolating, when the breathing pattern is less than a minute, the breathing rate based on the number of inflection points in a duration of the breathing pattern.
- 12. The health monitoring system as claimed in any one of claims 5 to 11 wherein the breathing characteristic is inhalation depth, and the at least one remote processor determines the inhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum.
- 13. The health monitoring system as claimed in any one of claims 5 to 12 wherein the breathing characteristic is exhalation depth, and the at least one remote processor determines the exhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum.
- 14. The health monitoring system as claimed in any preceding claim further comprising at least one user interface for displaying information on one or more of: a total number of hours the user has worn the apparatus, an exertion score, an indication of the user's lung function, and information on whether the sensor needs to be replaced.
- 15. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor is arranged to: compare sensor data received from the apparatus over a predetermined time period; and determine whether the accuracy of the sensor has changed over the predetermined time period.
- 16. The health monitoring system as claimed in any preceding claim wherein the apparatus further comprises: an accelerometer to sense movement of the user while the user is using the apparatus.
- 17. The health monitoring system as claimed in claim 16 wherein the at least one remote processor uses data from the accelerometer to smooth the sensor data to generate the breathing pattern.
- 18. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor uses additional input data to smooth the sensor data to generate the breathing pattern, where the additional input data comprises one or more of: geographical location of the user, altitude data, weather data, humidity data, and oxygen level.
- 19. The health monitoring system as claimed in claim 18 wherein the at least one remote processor uses the humidity data as a baseline humidity measure when smoothing the sensor data to generate the breathing pattern.
- 20. The health monitoring system as claimed in any preceding claim wherein the sensor is any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor or detector, and a sensor comprising a porous material.
- 21. The health monitoring system as claimed in any preceding claim wherein the apparatus is any one of: a wearable apparatus, a resistive sports mask, an oxygen deprivation mask, an apparatus worn over a user's mouth and/or nose, a medical breath monitoring apparatus, a face mask, a surgical mask, an oxygen mask, an inhaler, an asthma inhaler, an e-cigarette, a heat moisture exchanger, and a nasal cannula.
- 22. The health monitoring system as claimed in any one of claims 5 to 21 wherein the at least one remote processor: determines an indication of the health of the user from the at least one breathing characteristic; and wherein the system further comprises: at least one user interface for displaying the indication of the health of the user.
- 23. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor performs the smoothing of the sensor data in real-time or in near real-time.
- 24. A method for health monitoring, comprising: receiving sensor data from an apparatus comprising a sensor for sensing breathing of a user using the apparatus; and smoothing the received sensor data to generate a breathing pattern.
- 25. A computer readable medium carrying processor control code which when implemented in a system causes the system to carry out the method of claim 24.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1912815.6A GB2586848A (en) | 2019-09-05 | 2019-09-05 | Systems and methods for analysing breathing |
PCT/GB2020/052112 WO2021044150A1 (en) | 2019-09-05 | 2020-09-04 | Systems and methods for analysing breathing |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1912815.6A GB2586848A (en) | 2019-09-05 | 2019-09-05 | Systems and methods for analysing breathing |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201912815D0 GB201912815D0 (en) | 2019-10-23 |
GB2586848A true GB2586848A (en) | 2021-03-10 |
Family
ID=68241188
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1912815.6A Pending GB2586848A (en) | 2019-09-05 | 2019-09-05 | Systems and methods for analysing breathing |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2586848A (en) |
WO (1) | WO2021044150A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112120701A (en) * | 2020-09-17 | 2020-12-25 | 江苏集萃有机光电技术研究所有限公司 | Breathing monitoring mask and breathing monitoring method |
GB2600164A (en) | 2020-10-26 | 2022-04-27 | Spyras Ltd | Apparatus for sensing and analysing breathing |
EP4449746A1 (en) | 2021-12-16 | 2024-10-23 | 3M Innovative Properties Company | System and computer-implemented method for providing responder information |
EP4447801A1 (en) * | 2021-12-16 | 2024-10-23 | Breezee, Inc. | Device and methods for monitoring and training breathing |
CN118588224A (en) * | 2024-08-06 | 2024-09-03 | 浙江微红健康科技有限公司 | Health data monitoring system based on big data, data acquisition device and equipment |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0956820A1 (en) * | 1996-10-04 | 1999-11-17 | Karmel Medical Acoustic Technologies Ltd. | Apnea determination |
WO2012007719A1 (en) * | 2010-07-14 | 2012-01-19 | Imperial Innovations Limited | Feature characterization for breathing monitor |
US20130079656A1 (en) * | 2011-09-23 | 2013-03-28 | Nellcor Puritan Bennett Ireland | Systems and methods for determining respiration information from a photoplethysmograph |
WO2014128090A1 (en) * | 2013-02-20 | 2014-08-28 | Pmd Device Solutions Limited | A method and device for respiratory monitoring |
GB2550833A (en) * | 2016-02-26 | 2017-12-06 | Pneumacare Ltd | Breath identification and matching |
JP2019141597A (en) * | 2019-03-05 | 2019-08-29 | パイオニア株式会社 | Signal processing device and method, computer program, and recording medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1767785B (en) * | 2003-01-30 | 2015-08-26 | 康普麦迪克斯有限公司 | For the algorithm of automatic positive air pressure titration |
GB2488316A (en) * | 2011-02-22 | 2012-08-29 | Toumaz Uk Ltd | Method for determining respiration rate from uncorrupted signal segments |
US10712337B2 (en) | 2014-10-22 | 2020-07-14 | President And Fellows Of Harvard College | Detecting gases and respiration by the conductivity of water within a porous substrate sensor |
US11246530B2 (en) * | 2017-05-08 | 2022-02-15 | Intel Corporation | Respiratory biological sensing |
- 2019-09-05: GB GB1912815.6A patent/GB2586848A/en active Pending
- 2020-09-04: WO PCT/GB2020/052112 patent/WO2021044150A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
GB201912815D0 (en) | 2019-10-23 |
WO2021044150A1 (en) | 2021-03-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2586848A (en) | Systems and methods for analysing breathing | |
US9706946B2 (en) | Spirometer system and methods of data analysis | |
CN107529991A (en) | For the heart of detection object and/or the equipment, system and method for respiratory disorder | |
US20140088373A1 (en) | System and method for determining sleep stage | |
KR102115643B1 (en) | self - diagnostic method of chronic obstructive pulmonary disease based on IoT | |
CN102687152A (en) | COPD exacerbation prediction system and method | |
CA2847412C (en) | System and methods for estimating respiratory airflow | |
CN106659427A (en) | Non-invasive monitoring of pulmonary conditions | |
US10966632B2 (en) | Method and device for determining the health of a subject | |
US20140155774A1 (en) | Non-invasively determining respiration rate using pressure sensors | |
EP2651294B1 (en) | System and method of identifying breaths based solely on capnographic information | |
CN104720808A (en) | Human sleep respiration detection method and device | |
US20230000388A1 (en) | Oxygen mask respirometer | |
JP6315576B2 (en) | Sleep breathing sound analysis apparatus and method | |
JP2020510947A (en) | Method and apparatus for predicting health by analyzing physical behavior patterns | |
CN106793975A (en) | Method and apparatus for detecting patient's cardiopulmonary situation deterioration in respiratory assistance apparatus | |
CN111919242A (en) | System and method for processing multiple signals | |
CN205458684U (en) | Wearable breathes monitoring system based on graphite alkene | |
WO2014201371A1 (en) | System for optimal physical exercise and training | |
CN107205672B (en) | Apparatus and method for evaluating respiratory data of a monitored subject | |
KR20180017392A (en) | System and method for providing sleeping state monitoring service | |
JP7296671B2 (en) | Biological monitoring system and its program | |
Abinayaa et al. | An intelligent monitoring device for asthmatics using Arduino | |
CN108471951A (en) | The implantable device and method of COPD for monitoring patient | |
TW202110397A (en) | Device, sysyem, and method for user apnea events detection |