EP3445242A1 - Dispositif de surveillance de mouvement corporel - Google Patents

Dispositif de surveillance de mouvement corporel

Info

Publication number
EP3445242A1
Authority
EP
European Patent Office
Prior art keywords
subject
thorax
breathing
pattern
respiratory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP17785579.8A
Other languages
German (de)
English (en)
Other versions
EP3445242A4 (fr)
Inventor
Ditza Auerbach
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BreatheVision Ltd
Original Assignee
BreatheVision Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/135,797 (external priority: US9788762B2)
Application filed by BreatheVision Ltd filed Critical BreatheVision Ltd
Publication of EP3445242A1 (fr)
Publication of EP3445242A4 (fr)

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • A61B5/1135Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing by monitoring thoracic expansion
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02444Details of sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816Measuring devices for examining respiratory frequency
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/091Measuring volume of inspired or expired gases, e.g. to determine lung capacity
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1101Detecting tremor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • A61B5/1117Fall detection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4812Detecting sleep stages or cycles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • A61B5/4818Sleep apnoea
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7253Details of waveform analysis characterised by using transforms
    • A61B5/7257Details of waveform analysis characterised by using transforms using Fourier transforms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7282Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937Visible markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937Visible markers
    • A61B2090/3945Active visible markers, e.g. light emitting diodes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/04Babies, e.g. for SIDS detection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0204Operational features of power management
    • A61B2560/0214Operational features of power management of power generation or supply
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0223Operational features of calibration, e.g. protocols for calibrating sensors
    • A61B2560/0238Means for recording calibration data
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches

Definitions

  • the present invention relates to monitoring apparatus. More particularly, the invention relates to sensor-based monitoring apparatus for monitoring a variety of activities, such as human breathing.
  • Respiration rate is a vital sign that is monitored either intermittently or continuously in a variety of situations, including but not limited to during and after medical intervention procedures.
  • Continuous monitoring apparatus can be placed in contact with the body (abdominal belts or air flow measuring devices with a mask or invasive flow devices placed inside the trachea) or non-contact (e.g. pressure sensors under mattress, or a Doppler sensor).
  • Continuous monitoring of respiration beyond the respiratory rate is achieved nowadays using contact and invasive sensors.
  • the present invention relates to a system for non-invasive monitoring of the respiration properties through sensing chest wall movements.
  • Various measures are deduced such as respiratory rate, respiratory effort, tidal volumes and respiratory patterns. These properties are obtained under a wide range of ambient conditions such as in the presence of subject's non-breathing movements, subject pose changes, various ambient lighting conditions, physical obstructions, etc.
  • reliable early warnings can be issued based on such respiratory properties and based on analysis of the sensor signals, such as image sensor and inertial sensor signals.
  • "subject", as used herein, is meant to indicate an individual whose respiratory activity is being tracked, whether a patient in a health facility or a person being tested for any reason or purpose.
  • the system is applicable to various settings such as monitoring subjects who are undergoing sedative or pain killing treatment that can depress respiration, monitoring deterioration in the critically ill, monitoring infants to protect against SIDS and diagnostic tools for sleep testing such as for sleep apnea.
  • the monitor of the invention can be used to track other movements in addition to respiratory movements, e.g., leg movements and eye movements, as well as for quantifying awakenings and sedation level. These can be important for sleep staging analysis.
  • Another use of the invention is for the diagnosis of conditions using intermittent monitoring or comparison of measurements taken at different times.
  • One example of this is comparison of breathing movement patterns of the chest and abdomen before and after surgery in order to diagnose diaphragm paralysis.
  • change in respiration patterns at rest over time can be tracked with the system of the invention.
  • the settings in which the system may be used are versatile, such as hospitals, clinics, homes, battlefields and even outdoor environments. For some of these conditions, a low respiratory rate is a late indicator of respiratory distress, and other measures, such as the depth of breath and pulmonary motion patterns, are earlier indicators of deterioration. There are also situations in which the respiratory rate is not a sufficient differentiator of deterioration, as in the case of an airway obstruction.
  • the subject does not stand still and a variety of disturbances may occur such as changing subject pose, chest wall motion whose source is not associated with breathing and possible movement of a marker attached to the subject relative to the chest wall.
  • the present invention enables respiratory monitoring in the presence of these disturbances.
  • the present invention is directed to a system for monitoring the respiratory activity of a subject, which comprises:
  • the one or more computing devices is operable to: i. generate a first breathing pattern from the first signals, ii. divide each respiratory cycle experienced by the subject and defined by the first pattern into a plurality of portions, each of the portions delimited by two different time points and calculate, for each of the plurality of portions of a given respiratory cycle of the first pattern, a slope representing a thorax velocity;
  • iii. derive, from the given respiratory cycle of the first pattern, a pulmonary air flow rate of the subject during predetermined portions of the respiratory cycle;
  • v. determine respiratory characteristics of the subject for subsequent respiratory cycles experienced by the subject, based on a calculated thorax velocity and the calibration.
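  • The calibration described in steps i-v above can be sketched as follows. This is a minimal illustration, not the patented implementation: all names (`calibrate_velocity_to_flow`, `thorax_disp`, `n_portions`) are hypothetical, and a simple linear relation between thorax velocity and air flow rate is assumed, since the patent does not specify the calibration form.

```python
import numpy as np

def calibrate_velocity_to_flow(thorax_disp, flow, t, n_portions=4):
    """Illustrative calibration of thorax velocity against pulmonary air flow.

    thorax_disp : thorax displacement samples over one respiratory cycle
    flow        : simultaneous reference flow-rate samples (e.g. spirometer)
    t           : sample times

    Divides the cycle into n_portions time slices, computes the slope of
    displacement vs. time (thorax velocity) and the mean flow in each slice,
    then fits a linear model flow ~ a * velocity + b.
    """
    edges = np.linspace(t[0], t[-1], n_portions + 1)
    vels, flows = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (t >= lo) & (t <= hi)
        if m.sum() < 2:
            continue
        # slope of displacement over this portion approximates thorax velocity
        vels.append(np.polyfit(t[m], thorax_disp[m], 1)[0])
        flows.append(flow[m].mean())
    a, b = np.polyfit(vels, flows, 1)
    return a, b  # flow_estimate = a * velocity + b
```

Once `a` and `b` are learned for a subject, subsequent respiratory cycles can be characterized from thorax velocity alone, as step v describes.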
  • the respiratory characteristics of the subject may be updated, based on the detection of changes in poses and/or the location of one or more markers with respect to the subject's body.
  • the system may further comprise one or more additional movement sensors applied to the body of the subject, for generating third signals that are indicative of movement of the abdomen of the subject, wherein the receiver is configured to additionally receive the third generated signals during breathing motion of the subject and the one or more computing devices is also operable to generate a third breathing pattern from the third signals and to divide each respiratory cycle experienced by the subject and defined by the third pattern into a plurality of portions, such that a portion of the third pattern is delimited by the same two time points as a corresponding portion of the first pattern.
  • the one or more computing devices may be also operable to calculate, for each of the plurality of portions of a given respiratory cycle of the third pattern, a slope representing a body velocity and to determine respiratory characteristics of the subject for subsequent respiratory cycles experienced by the subject based on a calculated body velocity and thorax velocities.
  • the system may further comprise an alarming device in data communication with the one or more computing devices, wherein the one or more computing devices is also operable to command the alarming device to generate an alert if the markers' movement with respect to the subject's body exceeds a predetermined threshold.
  • the one or more computing devices may be also operable to readjust the calibration of the thorax velocity with the pulmonary air flow rate if changes in the 3-D position are detected during the course of a breathing motion measuring operation.
  • the system may further comprise one or more straps for attaching the one or more movement sensors to the subject's body, to facilitate performance of a breathing motion measuring operation while the subject is disposed on his or her side.
  • each of the one or more movement sensors and the one or more additional movement sensors may be an accelerometer, a gyroscope, or a combination thereof.
  • Each of the one or more movement sensors and the one or more additional movement sensors may be disposable and hold power supply elements and operating circuitry.
  • Each of the one or more movement sensors may be provided with an adhesive and is suitable to be positioned directly on the subject's thorax, and each of the one or more additional movement sensors is provided with an adhesive and is suitable to be positioned directly on the subject's abdomen.
  • Each of the one or more movement sensors may be positionable on a fabric located in close proximity to the subject's thorax, and is further provided with fastening means adapted to maintain it in a desired positional relationship to a chosen location on the subject's thorax, and wherein each of the one or more additional movement sensors is positionable on a fabric located in close proximity to the subject's abdomen, and is further provided with fastening means adapted to maintain it in a desired positional relationship to a chosen location on the subject's abdomen.
  • the one or more computing devices may be provided with communication apparatus suitable to deliver raw or analyzed data to a remote device.
  • the determined respiratory characteristics may include tidal volume calculation.
  • the present invention is also directed to a system for monitoring the respiratory activity of a subject, which comprises:
  • one or more signal generating elements being movement sensors, applied to the thorax of a subject, for generating signals that are indicative of movement of the thorax of the subject;
  • the one or more computing devices is operable to: i. calculate, in response to the received generated signals, a magnitude of a maximum displacement in 3D space of the one or more signal generating elements during a cycle of the breathing motion;
  • ii. calculate the magnitude of a current displacement in 3D space of the one or more signal generating elements during the breathing motion with respect to a reference tidal volume associated with the maximum displacement in 3D space;
  • iii. compare current and maximum displacements; and iv. calculate the tidal volume based on a calibration between the current and maximum displacements.
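  • Steps i-iv above can be illustrated with a short sketch. This assumes a simple proportional model in which tidal volume scales linearly with the peak-to-trough 3D displacement magnitude; the function and variable names (`tidal_volume_estimate`, `max_disp`, `v_ref`) are hypothetical.

```python
import numpy as np

def tidal_volume_estimate(disp_xyz_cycle, max_disp, v_ref):
    """Estimate tidal volume from the 3D displacement of a thorax sensor
    during one breathing cycle.

    disp_xyz_cycle : (N, 3) array of sensor positions over the cycle
    max_disp       : magnitude of the maximum displacement recorded during
                     calibration, associated with the reference volume
    v_ref          : reference tidal volume (e.g. from a spirometer)
    """
    # current peak-to-trough displacement magnitude in 3D space
    current = np.linalg.norm(disp_xyz_cycle.max(axis=0) - disp_xyz_cycle.min(axis=0))
    # linear calibration between current and maximum displacements
    return v_ref * current / max_disp
```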
  • Each of the one or more kinematic signal generating elements may be an accelerometer, a gyroscope, or a combination thereof.
  • each of the signal generating elements is disposable and holds power supply elements and operating circuitry; each of the signal generating elements is provided with an adhesive and is suitable to be positioned directly on the subject's thorax; each of the signal generating elements is positionable on a fabric located in close proximity to the subject's thorax, and is further provided with fastening means adapted to maintain it in a desired positional relationship to a chosen location on the subject's thorax; or each of the signal generating elements is connectable to a base attached directly to the subject's thorax, wherein a fabric is located between the base and the signal generating element.
  • the signal generating elements may include a first group of one or more kinematic signal generating elements selected from an accelerometer, a gyroscope, or a combination thereof, and a second group of signal generating elements including one or more light sources or one or more light source assemblies, each of which comprises two or more spaced light sources suitable to generate light, which are mounted on a rigid plate and fed by a power source, and the receiver is a stationary camera for tracking movement of, and imaging, the two or more light sources during breathing motion of the subject, the one or more computing devices operable to calculate the magnitude of a maximum displacement and of a current displacement in conjunction with both the first and second group of signal generating elements.
  • the one or more computing devices may be also operable to output an alert when a current tidal volume for a monitored respiration cycle deviates from the reference tidal volume by more than a predetermined threshold that is indicative of an anomalous breathing pattern.
  • the system may be provided with communication apparatus suitable to deliver raw or analyzed data to a remote device.
  • the present invention is also directed to a sleep assessment system, which comprises: a) one or more movement sensors adhesively applied to an eyelid of a subject, for generating signals that are indicative of movement of the eyelid, wherein each of the movement sensors comprises operating circuitry and a wireless transmitter for transmitting the generated signals;
  • a computing device in data communication with the receiver, which is operable to analyze the generated signals, calculate a movement pattern of the eyelid during sleep, and to associate the calculated movement pattern with a known sleep stage.
  • Fig. 1 is a flow chart of a monitoring stage, according to one embodiment of the invention.
  • Fig. 2 is a flow chart of a detection stage, according to one embodiment of the invention.
  • Fig. 3 is a flow chart of a tracking stage, according to one embodiment of the invention.
  • Fig. 4 is a flow chart of an analysis stage, according to one embodiment of the invention.
  • Fig. 5 is a flow chart of a volume calibration stage, according to one embodiment of the invention.
  • Fig. 6 shows a marker according to one embodiment of the invention in a separated position relative to a cloth, as may be part of the subject's robe;
  • Fig. 7 shows the marker of Fig. 6 in operating position;
  • Fig. 8 is a top view of the marker of Fig. 7;
  • Fig. 9 is a cross-section taken along the A-A line of Fig. 8;
  • Fig. 10 is an exploded view of the disposable part of a marker according to one embodiment of the invention;
  • Fig. 11 is a breathing waveform showing a single marker's displacement over time;
  • Fig. 12 is a schematic illustration of a system for monitoring the breathing patterns of a subject using inertial sensors;
  • Fig. 13 is a method for determining the respiratory activity of a subject according to one embodiment of the invention.
  • Fig. 14 is a scatter plot depicting spirometer flow readings versus the speed of a single marker placed on a subject.
  • the following description relates to a system for monitoring the respiratory activity of a subject in a static position such as in bed, but it will be appreciated that it may also be used to monitor the respiratory activity of a moving subject, particularly one that is undergoing physical activity, such as riding a bicycle.
  • markers on the subject comprising signal generating elements, power supply elements and circuitry for operating the signal generating elements, either directly contacting the subject or on the subject's clothing or covers (i.e., on the sheets and/or blankets).
  • markers may include light emitting elements or inertial measuring sensors such as an accelerometer or a gyroscope.
  • the markers may also include a microprocessor for performing some signal processing of the sensors.
  • a receiver for receiving the generated signals, for example an imaging device, which can be a CMOS video camera, a 3D camera, a thermal imager, a light field camera or a depth camera.
  • the imaging device may have an associated illumination source such as IR LEDs in some embodiments.
  • the receiver may also be configured with a wireless or wired electronic circuit to which a single or several inertial measurement sensors on the markers transmit measurements.
  • a computing device that is connected either wirelessly or wired to the receiver.
  • the computing device may be remote and the connection to it may include an internet connection.
  • part or all of the signal or image processing may be performed on a device such as a microprocessor or a computer board wired to the receiver.
  • a Bluetooth connection (or a connection of functionally similar technology) may connect to a gateway device that uploads the data to a cloud on the internet, or delivers it to a remote location, for further processing and storage.
  • the subject monitoring process is carried out according to one embodiment of the invention when a subject 5 in supine position is lying on a bed or on any other substantially horizontal support surface 7, as illustrated in Fig. 12.
  • a rigid marker 51 provided with two inertial or kinematic sensors 56 and 57 (or a single inertial sensor, or any other suitable number thereof), such as a three-axis accelerometer and a three-axis gyroscope, or a combination thereof, is used to calculate the respiratory activity of subject 5.
  • Marker 51 is applied to the thorax 13 of subject 5 by engaging means 16, such as an adhesive connection applied directly to a bodily portion or an interfacing element such as a fabric in engagement with the thorax, to define a desired positional relationship to a chosen location on the subject's thorax.
  • Each inertial sensor may be equipped with its own power supply element 22 and operating circuitry 24, or alternatively marker 51 has common power supply elements 22 and operating circuitry 24 for all inertial sensors provided therewith.
  • thorax 13 is displaced cyclically in 3D space.
  • Marker 51 is also displaced by a distance of dl in 3D space in response to the thorax displacement.
  • Inertial sensors 56 and 57 sense the force applied by thorax 13 during a breathing motion along a local inertial frame defined by their rectilinear structure, and in turn output a signal S that is indicative of the detected force or of the acceleration related to the detected force.
  • Output signal S is generally a wireless signal, but may also be transmitted by a wired connection.
  • Receiver 33, which is located in the proximity of marker 51 when the generated signal is a wireless signal, but which may also be remotely separated from the marker, receives the generated signals S and transmits them to a computing device 37, which processes the generated signals S to calculate a sensor acceleration.
  • the receiver 33 may be wired and located in close proximity to the computing device. Both of them may be situated on or near the patient's bed.
  • the motion of the rigid marker 51 in a single exhalation or inhalation movement may be approximated as a 3D translation. However, it was found that there is a measurable rotation of the marker 51 of the order of 1 degree, depending on the subject and his pose.
  • the acceleration due to the motion of the rigid marker 51 is sufficiently small that its inclination to gravity can be continuously estimated from a 3-axis accelerometer. This can be achieved by calculating the Euler angles of marker 51 continuously and applying Kalman filtering to reduce noise. During breathing, the Euler angles were found to show a cyclic behaviour at the respiratory rate, thus providing an independent measurement of the respiratory rate.
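  • The inclination estimate described above can be sketched as follows. This is an illustration only: it assumes breathing accelerations are small relative to gravity, so each accelerometer sample approximates the gravity direction in the sensor frame, and it uses an exponential smoother as a lightweight stand-in for the Kalman filtering mentioned in the text. All names (`inclination_from_accel`, `alpha`) are hypothetical.

```python
import numpy as np

def inclination_from_accel(acc, alpha=0.1):
    """Estimate marker inclination to gravity from 3-axis accelerometer samples.

    acc   : (N, 3) array of accelerometer readings in m/s^2
    alpha : smoothing factor of the exponential filter (noise suppression)
    Returns smoothed pitch and roll angles in radians.
    """
    # tilt angles from the gravity direction measured in the sensor frame
    pitch = np.arctan2(-acc[:, 0], np.hypot(acc[:, 1], acc[:, 2]))
    roll = np.arctan2(acc[:, 1], acc[:, 2])
    # exponential smoothing (simple substitute for a Kalman filter)
    for ang in (pitch, roll):
        for i in range(1, len(ang)):
            ang[i] = alpha * ang[i] + (1 - alpha) * ang[i - 1]
    return pitch, roll
```

During breathing, the smoothed angles would show the cyclic behaviour at the respiratory rate described above.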
  • a gyroscope 56 can be used in conjunction with the accelerometer in order to determine the acceleration of the marker in a global frame of reference using well-known navigation filtering techniques (see, for example, Quaternion-Based Extended Kalman Filter for Determining Orientation by Inertial and Magnetic Sensors, A.M. Sabatini, IEEE Trans. Biomed. Eng., 53 (2006)).
  • the global acceleration measurements can be integrated to determine the flow rates during each respiratory cycle.
  • Fig. 13 illustrates a method for calculating a subject's respiratory rate and/or current tidal volume using the signals generated by the inertial sensors.
  • each signal is filtered in step 64 to determine the continuous rotational component of the breathing motion and subsequently the respiratory rate.
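  • One plausible reading of the filtering step above is a spectral estimator that picks the dominant peak of a breathing-related signal within a physiologically plausible band. The sketch below is illustrative, not the patented method; the band limits (0.1-1.0 Hz, i.e. 6-60 breaths/min) and the name `respiratory_rate_fft` are assumptions.

```python
import numpy as np

def respiratory_rate_fft(signal, fs, fmin=0.1, fmax=1.0):
    """Estimate respiratory rate (breaths per minute) as the dominant
    spectral peak of a breathing-related signal.

    signal : samples of a breathing-related quantity (e.g. an Euler angle)
    fs     : sampling frequency in Hz
    """
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                      # remove DC offset
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    mag = np.abs(np.fft.rfft(sig))
    band = (freqs >= fmin) & (freqs <= fmax)    # restrict to respiratory band
    return 60.0 * freqs[band][np.argmax(mag[band])]
```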
  • filtering is carried out to determine the acceleration due to breathing motion in a fixed global frame of reference which can in turn be integrated.
  • the integration is carried out between peaks and troughs of single breaths by enforcing regularization constraints such as zero velocity at peaks and troughs of the respiratory signal.
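  • The regularized integration described above can be sketched as follows: velocity is assumed zero at the trough and peak bounding each half-breath, so any linear drift in the integrated velocity is removed before integrating a second time. This is a minimal illustration under those stated assumptions; the function name and arguments are hypothetical.

```python
import numpy as np

def integrate_with_zero_velocity(acc, t0, t1, fs):
    """Integrate breathing acceleration between a trough (index t0) and a
    peak (index t1) of the respiratory signal, enforcing zero velocity at
    both endpoints as a regularization constraint.

    acc : 1-D acceleration samples in a fixed global frame, gravity removed
    Returns the net displacement over the half-cycle.
    """
    dt = 1.0 / fs
    seg = acc[t0:t1 + 1]
    vel = np.cumsum(seg) * dt                   # first integration -> velocity
    # enforce v(t0) = v(t1) = 0 by removing the residual linear drift
    vel -= np.linspace(0.0, vel[-1], len(vel))
    return np.sum(vel) * dt                     # second integration -> displacement
```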
  • non-breathing motion may be identified according to the presence of significant rotations in the signal related data: a possible criterion is that the cumulative angle of rotation over a predetermined period of time, e.g., the previous 10 seconds, increases compared to baseline values.
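  • The rotation-based criterion above can be sketched with gyroscope data. The 10-second window follows the example in the text, while the baseline threshold value and all names are illustrative assumptions.

```python
import numpy as np

def non_breathing_motion(gyro, fs, window_s=10.0, baseline_deg=2.0):
    """Flag non-breathing motion when the cumulative rotation angle over the
    last `window_s` seconds exceeds a baseline value.

    gyro : (N, 3) angular-rate samples in deg/s
    fs   : sampling frequency in Hz
    Returns True if significant rotation (likely non-breathing motion) is present.
    """
    n = int(window_s * fs)
    recent = gyro[-n:]
    # cumulative rotation angle = sum of per-sample rotation magnitudes * dt
    cum_angle = np.sum(np.linalg.norm(recent, axis=1)) / fs
    return cum_angle > baseline_deg
```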
  • the learned calibration model is applied to the integrated measurements over a cycle in order to determine the tidal volume.
  • Another embodiment of the monitoring process is carried out by imaging.
  • One or more video cameras are used to record a scene of the subject.
  • One example is a CMOS camera with night-vision capabilities, i.e. one in which a filter exists to filter out the visible light and sense the near-IR optical wavelengths (NIR: 0.75-1.4 μm).
  • the ambient light source may be on the camera or in a chosen spatial arrangement in the subject's vicinity.
  • Another example for the camera sensor is a thermal camera which is sensitive to mid-wavelength (3–8 μm) or long-wavelength (8–14 μm) infrared radiation. It too can have a light source such as a halogen lamp source, which is used in Thermal Quasi-Reflectography.
  • the sensor may also be a 3D depth camera such as the RealSense camera by Intel Corporation.
  • Markers are applied either directly to the subject's body or integrated into a covering of the subject such as his clothing, his blanket, an elastic strap, or a bandage, for example.
  • the markers can be for example patches made from retro-reflective material, geometric patterns on the blanket or nightgown, or low voltage LED lights embedded or attached to clothing or patches.
  • Each LED light can either be fixed or can flash with its own known temporal pattern.
  • their light emitting wavelength and illumination may be specific for each LED.
  • they can be embedded in retro-reflective cones or other shaped contraptions made of reflective materials.
  • the illumination angle may be narrow by using a lens.
  • One specific marker, which is an object of the present invention and which can be used in a system such as this illustrative system, will be described in greater detail later in this description.
  • markers may be incorporated together such as into an adhesive linear strip, or a 3D shape such as a dome shape consisting of scattered markers on its curved surface.
  • Another possibility is to attach the marker assembly to elastic straps or a vest that the subject can wear and that reflect the thoracic movements.
  • the marker unit consists of two distinct physical parts that can be firmly connected to each other but can also be easily released from each other.
  • One part may be positioned close to the body and may therefore be for single use (cannot be disinfected) while the other part (can be disinfected and therefore reusable) attaches to the disposable part but has no direct attachment to the subject.
  • the attachment between the two parts can be direct through fasteners, e.g., similar to ECG fasteners.
  • Another alternative is that the two parts be connected through clothing or a blanket.
  • the disposable part is a patch that adheres to the body, while the reusable part consists of an active marker that is positioned on clothing and attaches to the disposable part through one or more fasteners.
  • the power unit may either be incorporated within the disposable patch or within the reusable marker.
  • the active markers can emit steady signals or coded signals. For example if the markers emit light, the intensity can be modulated according to a frequency that is predetermined and distinct for each marker location. In this way, each marker can be easily identified during operation. Furthermore, light patterns which can save on battery life can be employed for long-term monitoring.
  • Controller and Mount: The camera can be hung overlooking the subject on the wall, ceiling, a stand, a part of the bed or any other fixture in the vicinity.
  • a controller may be provided, which can be used to pan, rotate and zoom the camera's field of view freely in 3D either manually or automatically based on the video content.
  • it can be possible to adjust other camera features automatically based on the video content. Examples of such features are: focal distance, illumination level, white balance and optical filters. Providing such controlling elements is well known in the art and is not therefore described herein, for the sake of brevity.
  • the camera visual sensor may also be equipped with a depth sensor providing the distance of each monitored pixel from the camera.
  • the following relates to the detection and tracking of the markers using 2D video images (Figs. 2 and 3). Later on, a description is provided of how the depth information can be obtained from standard 2D images or from a 3D camera or stereo camera. For the first video frame, or any frame in which the markers from the relatively recent frames were not all tracked, an attempt to detect the markers is carried out. Detection on the intensity image (or sequence of images) can proceed in the following steps:
  • Pixels with high intensity are marked. This is done using a fixed threshold or one that adapts to the local noise according to the local intensity distribution. For example, all pixels whose intensity is higher than 5 standard deviations from the mean of the neighborhood intensity distribution.
  • the marked pixels are clustered to connected components using chain clustering for example. Two pixels belong to the same cluster if there is a path of marked pixels that connect them, each within distance d of each other.
  • the various clusters are the candidate markers. They are now filtered according to a priori known features of the markers such as their size, eccentricity, bounding box size, temporal frequency and gray levels. For circular markers, principal component values of the candidate clusters are useful features. Position of the candidate clusters relative to the bed or body silhouette can also be used to filter out spurious clusters.
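The chain-clustering step described above can be sketched as a connected-components search. This is an illustrative, unoptimized version under the stated rule: two marked pixels belong to the same cluster if a path of marked pixels connects them, each hop within distance d; thresholding and feature filtering are assumed to run before and after this step.

```python
# Illustrative chain clustering of thresholded ("marked") pixels.
def chain_cluster(pixels, d):
    """pixels: list of (x, y) marked-pixel coordinates.
    Returns a list of clusters, each a list of (x, y) points."""
    d2 = d * d
    unvisited = set(range(len(pixels)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        stack, cluster = [seed], [pixels[seed]]
        while stack:
            i = stack.pop()
            xi, yi = pixels[i]
            # Pixels within distance d of the current pixel join the chain.
            near = [j for j in unvisited
                    if (pixels[j][0] - xi) ** 2 + (pixels[j][1] - yi) ** 2 <= d2]
            for j in near:
                unvisited.remove(j)
                stack.append(j)
                cluster.append(pixels[j])
        clusters.append(cluster)
    return clusters
```

Each returned cluster is then a candidate marker to be filtered by size, eccentricity and the other features listed above.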
  • the above analysis is carried out for a series of frames where the extent and identity of the frames in the series depend on the flashing frequencies.
  • the final candidate markers for step 3 above consist of the unified set of markers obtained from the frame series.
  • the clustering is carried out differently. This can occur for example in the presence of a blanket which may scatter the light reflected or emanating from the markers, dependent on the folds in the blanket and the relative configuration of the blanket relative to the markers. In these cases, chain clustering with a single threshold on the intensity image as described above may not be sufficient to identify a single marker consistently.
  • a marker may be composed of several disjoint clusters which are the "seeds" of its more extended light pattern.
  • the seeds may be expanded by region growing techniques to unite the seeds to a single signature. This signature is saved as a set of pixels with their gray levels.
  • the presence of a specific temporal behavior for a particular marker is used to associate disjoint pixels to a common signature.
  • Once the clusters in a frame are detected, they are tracked over the following frames (as described below). If the tracking was not successful for all the detected markers, the detection is reinitialized on the new current frame. Successful tracking means that the positions of all the detected markers were determined in each frame, that the frame-to-frame displacement of each marker does not surpass a predefined threshold, and that this displacement is consistent across the marker pixels and the individual emitters.
  • the threshold is set using the maximal expected speed of the breathing motion. It can be later adjusted per patient during the baseline setting per patient described below.
  • a valid cluster is one whose motion is consistent, in terms of frequency of motion, with the major detected markers (those on the trunk of the body). This can be analyzed by calculating the largest component of the Fourier Transform of the cluster displacements over, say, 10 seconds and removing clusters whose frequency is far from the median frequency of the chest and abdominal markers.
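The dominant-frequency test above can be sketched with a plain DFT. This is a minimal sketch, not the document's exact filter: it returns the frequency of the largest non-DC Fourier component of a displacement signal, which can then be compared against the median frequency of the trunk markers.

```python
import cmath

# Sketch: dominant frequency of a cluster's displacement signal via a
# brute-force DFT (illustrative; a real system would use an FFT library).
def dominant_frequency(x, fs):
    """x: displacement samples; fs: sampling rate in Hz.
    Returns the frequency (Hz) of the largest non-DC DFT component."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]          # remove DC before the search
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2 + 1):
        c = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        if abs(c) > best_mag:
            best_k, best_mag = k, abs(c)
    return best_k * fs / n
```

A cluster whose dominant frequency lies far from the chest/abdominal median would then be discarded as spurious.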
  • the Lucas-Kanade method can be utilized (B. D. Lucas and T. Kanade, An iterative image registration technique with an application to stereo vision, Proceedings of Imaging Understanding Workshop, pp. 121–130, 1981).
  • the tracking is carried out relative to a reference frame in order to avoid accumulating frame-to-frame errors.
  • the reference frame is updated from time to time due to the fact that the scene is changing over time.
  • One possibility is to update the reference frame when one or more of a specific set of criteria are met. Examples of these criteria are:
  • Non-breathing movements are identified.
  • Illumination conditions of the scene have changed significantly (total intensity change of marker cluster relative to same marker in reference frame),
  • Every preset time period (e.g., 30 seconds)
  • the tracking can be used as a filter as follows: the single frame detection method may be set to produce an abundance of objects some of which are not actual markers on the subject's body. Once they are tracked, criteria for filtering them can be used based on characteristics of their movement. For example, markers that remain static for a long time even when subject makes some position change can be filtered. Furthermore, if one is concerned with tracking only breathing motion, non-static markers with a different frequency of motion (or uncorrelated motion) than the identified breathing markers are filtered.
  • Each individual marker time series (2-dimensional x and y) is processed on-line using noise reduction filters, such as those using a measurement window around each point to process the data using a mathematical operation such as averaging, weighted averaging, or the median operation.
  • Bandpass filters damping high frequency noise can also be applied.
  • the a priori knowledge of the rigid geometric arrangement between the individual markers enables applying Kalman filtering to obtain their position at higher accuracy.
  • Another possibility of isolating breathing motion from other motions, such as body movement, heartbeat or body tremors, is to first learn the principal image direction of the breathing vector for each marker when the subject is at rest. All motion in the perpendicular direction is regarded as "noise" and removed. Other physiological information can be extracted from this removed "noisy component", such as the frequency of cardiac oscillations (heart rate) and tremor frequencies. Heart beats can be identified using band pass filters associated with frequencies learned based on subject data or from an additional heart rate sensor (for example, the pulse-oximeter).
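The learn-then-project idea above can be sketched with a closed-form 2x2 PCA. This is an assumed minimal implementation, not the document's exact filter: the principal axis is learned from rest-period marker positions, and any displacement is then split into a breathing component (along the axis) and a perpendicular "noise" component from which cardiac or tremor frequencies could be mined.

```python
# Sketch: principal breathing axis from 2D marker positions (closed-form
# eigenvector of the 2x2 covariance matrix), plus a projection helper.
def principal_axis(points):
    """points: list of (x, y). Returns a unit vector along the first
    principal axis of the point cloud."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] and its eigenvector.
    lam = 0.5 * (sxx + syy) + 0.5 * ((sxx - syy) ** 2 + 4 * sxy * sxy) ** 0.5
    if abs(sxy) > 1e-12:
        vx, vy = sxy, lam - sxx
    else:
        vx, vy = ((1.0, 0.0) if sxx >= syy else (0.0, 1.0))
    norm = (vx * vx + vy * vy) ** 0.5
    return vx / norm, vy / norm

def breathing_and_noise(disp, axis):
    """Split a displacement (dx, dy) into the component along the learned
    breathing axis and the perpendicular ("noise") component."""
    dx, dy = disp
    ax, ay = axis
    return dx * ax + dy * ay, -dx * ay + dy * ax
```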
  • additional noise may enter the video.
  • Such noise can be tracked by following the movement of a stationary object such as a wall or floor or a marker on a wall.
  • Another way the movement can be tracked is by using a 3D motion sensor attached to the camera and recording camera vibrations and movements. These movements can be used to align camera frames. Camera vibrations can be compensated for by removing the vibrational frequency from the markers' path history in Fourier space.
  • the available assigned positions may be for example, the following ones: Supine (belly-up), Prone (belly-down), On right side, On left side, Out of bed.
  • the pose can be determined according to the inclination angle deduced from the one or more accelerometers. For example an abdominal marker will have an angle essentially parallel to gravity for the supine position, an angle of approximately 90 degrees for the lateral poses but with a difference in sign between left and right lateral.
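The inclination-based pose assignment above can be sketched as a small classifier. The sensor-frame convention and the 45-degree decision thresholds below are assumptions for illustration, not values taken from the text.

```python
import math

# Sketch: assign a pose label from a single abdominal accelerometer's
# gravity reading. Frame convention (assumed): z-axis points out of the
# abdomen, x-axis toward the subject's left.
def classify_pose(gx, gy, gz):
    """gx, gy, gz: measured gravity direction in the sensor frame."""
    roll = math.degrees(math.atan2(gx, gz))
    if abs(roll) < 45:
        return "supine"
    if abs(roll) > 135:
        return "prone"
    return "left lateral" if roll > 0 else "right lateral"
```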
  • the 3D position of each of the markers in the camera frame can be deduced, given the image coordinates of at least 3 emitters that are connected rigidly (see below).
  • the set of subject positions may be enlarged to incorporate intermediate positions, for example a left lateral pose with various degrees of bending of the spine. These poses can be differentiated for example by calculating the vector distance between an abdominal and chest emitter from reconstruction of the 3D locations in the camera frame. Chest and abdominal accelerometers can be used to determine their relative inclination to gravity, the addition of chest and abdominal gyroscopes to each marker assembly can provide the relative orientation of specific 1-d lines within each of the marker assemblies.
  • the marker movements consist of respiratory movements, cardiac movements and other gross movements unrelated to the cardiorespiratory system. It is important for some applications such as sleep studies to identify periods of such movements, extract them, and continue measuring breathing parameters in their presence when possible. Such movements may include leg movements, twisting and turning in bed, bed- exit and others.
  • the respiratory analysis may be carried out differently during periods of body movement and therefore these periods need to be identified.
  • Once the markers are tracked, one can identify frames in which there was significant non-breathing motion as follows: for each marker at each frame, the image positions of the marker in all the recent frames (say, 10 seconds) are analyzed through principal component analysis. When the body motion is pure breathing, the magnitude of the first principal component of each marker is significantly larger than the magnitude of the second principal component. The direction and absolute size of the first principal component vector depend on the marker location, relative camera position and subject's physiology. However, what is found for normal breathing motion is that locally it is along a fixed 3D axis throughout the entire cycle. By choosing a camera position and marker positions whose lines of sight are not parallel to these 3D physiological breathing directions, the local directionality translates to the 2D image.
  • breathing motion is characterized by locally possessing a principal component that is significantly larger than the second one in the 2D image.
  • When the magnitude of the second principal component becomes significant relative to the first principal component for a few frames (for 0.5 seconds, for example) of any chest/abdominal marker, it is an indication that some kind of significant non-breathing motion has occurred.
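The component-ratio test just described can be sketched as follows. This is a minimal sketch; the 0.3 ratio threshold is an illustrative assumption, not a value from the text.

```python
# Sketch: flag non-breathing motion when the second principal component of
# a marker's recent 2D positions grows relative to the first.
def is_nonbreathing(points, ratio_threshold=0.3):
    """points: recent (x, y) marker positions (e.g., last 10 seconds).
    Returns True when the second PC magnitude exceeds ratio_threshold
    times the first PC magnitude."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = max(tr * tr - 4 * det, 0.0) ** 0.5
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    # PC magnitudes are the standard deviations along each principal axis.
    return (lam2 ** 0.5) > ratio_threshold * (lam1 ** 0.5)
```

During pure breathing the positions are nearly collinear and the test stays False; twisting or turning spreads the cloud in two directions and trips it.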
  • tracking is reset and detection is carried out again.
  • the above Principal Component Analysis should be carried out in 3D (see below how the 3D vector is reconstructed).
  • Additional indicators of non-breathing motion can be extracted: for example, by counting the number of pixels that have changed their gray level significantly (greater than a fixed threshold) from frame to frame of the video sequence.
  • the number of pixels which obey this criterion during breathing can be determined from a baseline period, say, 30 seconds, when the subject being monitored is not moving significantly (aside from breathing motion).
  • the threshold on the number of pixels that change significantly for identifying non- breathing motion is set.
  • the inertial sensors can also be utilized to identify non-breathing motion epochs according to their signal features such as the magnitude of inclination changes and their derivatives.
  • Features from the various sensor signals may be combined to learn a classifier that identifies epochs (say, 10 second intervals) of non-breathing motion; the classifier is trained on a database of subjects a priori and run during monitoring to determine epochs in which breathing parameters cannot be extracted due to significant non-breathing motion.
  • 3D Motion Tracking
  • the 3D displacements of the markers in the body frame of reference need to be determined in order to calculate various parameters such as breathing volumes.
  • the additional depth coordinate of each pixel is provided.
  • these coordinates are in the camera frame and not relative to a body frame of reference.
  • the following description illustrates how the 3D motion can be obtained using only the 2D image coordinates and a marker that consists of 2 or more LEDs.
  • One possible marker configuration includes 2 LEDs fixed to a rigid plate and separated from each other by a known distance, say 4 cm.
  • the structure of the marker can be that of Fig. 6 with electronic circuitry and battery source that powers 2 LEDs.
  • Each of the 2 LEDs is detected and tracked as described above.
  • In order to calculate their 3D displacement and vector direction, consider two video frames in time: the end of an exhalation (beginning of inhalation) and the following end of inhalation. These frames can be found by calculating 1D curves of the marker displacement along the instantaneous principal component direction and subsequently analyzing these marker displacement curves for minima and maxima using standard 1D methods.
  • the shape of the quadrilateral formed by the four 3D positions of the 2 LEDs (two at the beginning of inhalation and two at the end) is well-approximated by a parallelogram. This is due to the physiological characteristic that the breathing vectors are more or less fixed over small body surface regions (say, 5x5 cm) of the abdomen or chest. One condition is not to locate the markers over points of discontinuity such as the ribcage-abdomen boundary.
  • the base of the parallelogram has fixed length since it lies on a rigid marker, but the length of its sides is unknown. First we describe how the length is determined assuming the parallelogram angle was known.
  • the displacements for intermediate frames can be found from the principal component displacement graphs by simple linear scaling. Specifically, for each monotonic segment of the 1D graph of Fig. 11, the displacements are scaled from left to right by the factor Y3D(t1,t2)/(y(t2)−y(t1)).
  • the function y(t) is the original 1D principal component projection at frame t,
  • y'(t) is the 3D displacement graph at frame t.
  • a monotonic segment starts at frame t1 and finishes at frame t2.
  • Y3D(t1,t2) is the length of the 3D vector corresponding to h, found for the breathing maneuver from frame t1 to frame t2.
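The rescaling of a monotonic segment can be sketched as below. This is a minimal illustration under the stated assumption that the 3D breath length Y3D(t1,t2) has already been reconstructed for the segment.

```python
# Sketch: map a monotonic 1D principal-component projection y(t) over a
# segment [t1, t2] onto a 3D displacement curve by linear scaling.
def rescale_segment(y, y3d_length):
    """y: monotonic 1D projection samples from frame t1 to frame t2.
    y3d_length: reconstructed 3D displacement length for the segment.
    Returns the scaled displacement samples, starting at 0."""
    span = y[-1] - y[0]                 # y(t2) - y(t1)
    factor = y3d_length / span          # Y3D(t1,t2) / (y(t2) - y(t1))
    return [(v - y[0]) * factor for v in y]
```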
  • markers with more than 2 LEDs can be used as follows: 1. In a first frame, say the beginning of inhale, the 3D camera position is determined using standard methods based on the knowledge of 3 or more points on the rigid body (the marker).
  • the positions of the LEDs in a subsequent video frame are assumed to all be translated from their original positions at the beginning of inhale by a fixed 3D vector [a,b,c] that represents the breathing vector.
  • a,b and c can be determined by least squares.
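The least-squares step above can be sketched under a simplifying assumption: when the 3D LED positions are available at both times and the model is a pure common translation, the least-squares solution for [a, b, c] reduces to the mean displacement of the LEDs.

```python
# Sketch: least-squares common translation [a, b, c] of 3+ rigidly
# connected LEDs between two frames (pure-translation model).
def breathing_vector(p0, p1):
    """p0, p1: lists of (x, y, z) LED positions at the beginning of inhale
    and at a subsequent frame. Returns the common translation (a, b, c)."""
    n = len(p0)
    return tuple(sum(q[k] - p[k] for p, q in zip(p0, p1)) / n
                 for k in range(3))
```

In the full method the later positions are observed only in the 2D image, so the least squares is solved through the camera projection rather than on 3D points directly.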
  • Another configuration in which 3D motions can be reconstructed is by using several distinct markers.
  • several single-marker LEDs can be distributed at 2 or more distinct locations on the chest and the distances between them measured using measuring tape or some other measurement means at a particular time in the respiratory cycle (say, end of exhale). From these measurements, the camera position can be determined. Using the directions of breathing motion at each of these locations (determined using methods such as those described herein), the breathing displacements can be estimated analogously to the way described for the multi-LED markers, using image frames at two different times (t1 & t2) in a single respiratory cycle, say beginning of inhale and end of inhale.
  • the "virtual rigid body” that is being tracked is the 3D slab formed by the vertices of the marker locations at tl (define one face of slab) and the marker locations at t2 (define second face of slab).
  • the virtual slab reduces to a quadrilateral with two long almost parallel sides of known length.
  • the video analysis and inertial sensor measurements of breathing motion can be combined to provide improved estimates of the respiratory motion.
  • the video analysis of the marker motion essentially provides the respiration displacements of each breath in 3D space (length and direction).
  • the accelerometer analysis provides a complementary independent view, in that it provides information regarding the curvature of the essentially 1-D breathing translational movements. Combining the signals to calculate respiratory parameters enables more accurate estimation of respiratory parameters by combining independent aspects of the respiratory motion.
  • the markers and the accelerometer must be on the same rigid base or very close to each other, so that they actually measure the same body movement.
  • the two sensors can be used to fill in measurements when one of the sensors cannot make a measurement, when there is no line of sight between the imaging sensor and the LEDs, or when there is a partial obstruction of some of the LEDs.
  • This can be achieved by using the previously described Kalman filter with measurement covariances, which are dynamically changed according to the quality of the data measurements. For example, when a LED becomes completely obstructed, the covariance of its measurements is set to the maximal value for the filter iterations.
  • Accepted clinical measurements such as respiratory rate, tidal volume, minute ventilation and apnea indicators can be derived from the sensing system.
  • calibration is needed between the measured signals and the respiration property.
  • These calibration functions can be learned a priori based on subject attributes such as body shape, silhouette, weight, age, sex, BMI, chest circumference, and body position and data from a validated device for measuring air flow such as a spirometer.
  • the calibration can also be updated using data from a phantom or a simulator. Note that calibration functions used should be taken with the subject pose and marker location taken into account.
  • Respiratory Rate: The respiratory rate is extracted by first extracting the dominant frequency from each available signal separately and then fusing these estimates together, say by averaging. For a video sensor, the signals analyzed are the 1-D displacements of the markers along the principal component axis in the last, say, 30 seconds; for an accelerometer/gyroscope the signals are the inclination angles to gravity. Periods of identified non-breathing motion are not included in the analysis window.
  • One method to extract the dominant frequency from the marker's cleaned (filtered, cleaned of noise and after non-breathing motion removal) signal is through analysis of time series windows which include several breaths. For instance, this can be achieved by calculating the Fourier transform over a moving window of, e.g., ~20 sec every 1 sec.
  • a Hamming window (Enochson, Loren D.; Otnes, Robert K. (1968). Programming and Analysis for Digital Time Series Data. U.S. Dept. of Defense, Shock and Vibration Info. Center, p. 142) is used to reduce artifacts.
  • the instantaneous frequency of each signal is taken to be the frequency with the largest Fourier component amplitude.
  • Another way of calculating the individual respiration rates is in the time domain, as the time difference between consecutive significant peaks or troughs.
  • the estimate of the actual respiratory rate is taken as a weighted average of the dominant frequencies found from each valid marker signal.
  • the weighting can be taken proportional to the signal to noise ratio of each of the comprising signals.
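The SNR-weighted fusion of per-signal rate estimates can be sketched as follows; the inputs (per-signal dominant frequencies and SNRs) are assumed to have been computed upstream as described above.

```python
# Sketch: fuse per-marker dominant frequencies into one respiratory-rate
# estimate, weighting each signal by its signal-to-noise ratio.
def fuse_respiratory_rate(freqs_hz, snrs):
    """freqs_hz: dominant frequency of each valid signal (Hz).
    snrs: matching SNR weights. Returns breaths per minute."""
    total = sum(snrs)
    rate_hz = sum(f * w for f, w in zip(freqs_hz, snrs)) / total
    return rate_hz * 60.0
```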
  • Apnea can appear with chest wall movements present, as in an obstruction, and there a more careful analysis is needed.
  • the phase difference between abdominal and thoracic markers is followed to determine whether a significant change has occurred over time. The phase delay may be obtained between two short time segments (10 sec, for example) of the marker movements. For example, this can be achieved by taking the Fourier Transform of each segment separately and obtaining the phase difference between the relevant complex Fourier components of the two markers. The relevant component is the one that corresponds to the extracted respiratory rate.
  • Another method to find the phase difference between two markers is to average the time lag between their peaks and express it as a fraction of the cycle time; 50%, for example, corresponds to a 180° phase lag.
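The time-domain variant just described can be sketched as below: the lag between matched peaks is averaged and expressed as a fraction of the cycle time, with 50% of a cycle corresponding to 180 degrees.

```python
# Sketch: phase lag between abdominal and thoracic markers from matched
# peak times, expressed in degrees of the breathing cycle.
def phase_lag_degrees(peaks_a, peaks_b, cycle_time):
    """peaks_a, peaks_b: matched peak times (same length) for the two
    markers; cycle_time: breathing cycle length in the same units."""
    lags = [b - a for a, b in zip(peaks_a, peaks_b)]
    mean_lag = sum(lags) / len(lags)
    return 360.0 * (mean_lag % cycle_time) / cycle_time
```

A lag drifting toward 180 degrees (paradoxical thoraco-abdominal motion) is the kind of change the obstruction analysis above looks for.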
  • Periods of disordered breathing including apnea events can be found by building a classifier on signal windows (say 10 seconds) from all available sensors (inertial and image sensors). Relevant features are based on the amplitudes, cycle lengths and phase lags of the signals within a window.
  • This classifier is learned on collected data of subjects with labeled apnea events. Such events are identified by a physician with the aid of a capnograph that continuously measures the concentration of CO2 in exhaled air (Lifesense, Nonin Inc.). This classifier can be combined with the classifier for non-breathing motion epochs described above.
  • Breathing Variability: Breathing variability can be calculated as the entropy of the lengths of the breathing cycles. It is often an indicator of respiratory distress.
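The entropy measure can be sketched as Shannon entropy over binned cycle lengths. The 0.25-second bin width below is an illustrative assumption, not a value from the text.

```python
import math
from collections import Counter

# Sketch: breathing variability as Shannon entropy (bits) of the
# distribution of binned breath-cycle lengths.
def cycle_entropy(cycle_lengths, bin_width=0.25):
    """cycle_lengths: breath durations in seconds."""
    bins = Counter(round(c / bin_width) for c in cycle_lengths)
    n = len(cycle_lengths)
    return -sum((k / n) * math.log2(k / n) for k in bins.values())
```

Perfectly regular breathing yields zero entropy; a widening spread of cycle lengths raises it.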
  • Tidal Volume: One embodiment for calculating tidal volume (the amount of air displaced or inhaled per breath) proceeds through calibration of the subject with a spirometer for a few breaths. A model is learned for the dependence of the spirometer flow versus a marker's "flow", which is essentially the derivative of its 3D displacement function during a breath, smoothed if needed. Flow data from the inhale phase of several breaths of a subject who has breathed through a spirometer is shown in Fig. 14. The flow readings of the spirometer are shown versus the speed of a single marker's displacement. A model, for example a one-dimensional linear one, can be fit to this data and used in runtime to estimate the flow during each breath.
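The one-dimensional linear embodiment can be sketched in two steps: fit flow against marker speed at calibration, then integrate the predicted flow over one breath at runtime. This is a minimal illustration of the idea, not the patented implementation.

```python
# Sketch: calibrate a 1D linear model of spirometer flow vs. marker speed,
# then estimate tidal volume by integrating the predicted flow.
def fit_linear(speeds, flows):
    """Closed-form least squares. Returns (slope, intercept)."""
    n = len(speeds)
    mx, my = sum(speeds) / n, sum(flows) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(speeds, flows)) /
             sum((x - mx) ** 2 for x in speeds))
    return slope, my - slope * mx

def tidal_volume(marker_speeds, dt, slope, intercept):
    """Integrate the model-predicted flow over one breath (rectangle rule)."""
    return sum((slope * s + intercept) * dt for s in marker_speeds)
```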
  • a more detailed model is also possible.
  • a model with an additional parameter representing the scaled time within the breath may be used.
  • separate one-dimensional models can be learned for each phase of the breath such as: beginning of inhale, midst of inhale and end of inhale.
  • Prior to fitting the spirometer mouth flow to the marker signals, they need to be aligned, say by aligning the time of the maximal displacement to the end of inhale. Standard integration of the flow during one breath provides an estimate of the tidal volume based on inhale. A similar model can be learned to estimate the exhale phase.
  • the inertial sensors can also be used to build a model based on their velocity profile which can be obtained from integration of their acceleration profile in the global frame (with gravity removed from the acceleration profile as described above).
  • the tidal volume is estimated by a weighted average of the individual model estimates, with weighting provided by their overall reliability measured by their goodness of fit at calibration.
  • the learned linear models are sensitive to several parameters such as the anatomical position of the markers and the specific subject pose.
  • the subject pose can change during the monitoring (during sleep, for example).
  • Non-rigid changes of the subject's thorax pose can be identified by tracking in real-time the relative 3D pose of one marker relative to the other marker using well-known 3D reconstruction methods, such as described above. In cases where a significant change in relative pose of the 2 markers is identified, the calibration model used is modified.
  • the calibration model to be used in runtime is modified according to the model learned at calibration for the specific subject's pose. Specifically, for the linear model it was found that small changes in pose (say, from supine) can be modeled by adding a linear term in the difference in "bending" angle from the original pose.
  • This "bending" angle can be taken as the angle between the planes of the chest and abdominal markers.
  • the coefficient of this linear term is learned by varying the subject position during the calibration procedure. This coefficient can be obtained either from the image sensor measurements with 3D reconstructions or from the kinematic sensors' inclination estimation as described above. In the case of the presence of a single marker, absolute pose changes can be identified and calibrated for a priori (using a similar calibration procedure).
  • anatomical position changes in the marker location require a modification of the linear coefficient of the model which is learned at calibration.
  • Changes in marker position relative to anatomy typically arise when the marker is not adhered to the subject's body, but rather is attached to a strap or vest worn by the subject.
  • Changes in the anatomical marker position are detected by tracking the 3D position of the marker relative to anatomical landmarks on the subject.
  • a fiducial marker can be placed on an exposed part of the body, such as on the neck; this marker may be reflective and tracked only intermittently when movement of the main markers is suspected. Changes in the relative 3D location between the fiducial and main markers can be used as an indication of marker movement.
  • an alert may be issued to call the staff to adjust the location.
  • the model's linear coefficient can be adjusted according to what was found during calibration.
  • anatomical lines such as the midsternum anatomical line can be tracked in the image using deformable registration methods on the body silhouette, which can be found as outlined below.
  • the relative 3D movement between the rigid chest and abdominal markers can be decomposed into 2 components - the first resulting from a body "bending" component and the second consisting of other marker motions.
  • the resulting magnitude of the relative 3D movement between the markers can be used as an indicator of the magnitude of the markers movements relative to the subject's body (which cannot be explained solely by "bending”).
  • the relative 3D movement between markers can then be decomposed into these 2 components to determine the amount of marker movement relative to the subject's body.
  • the distances to the body silhouette or facial features can be evaluated, to determine which one is more consistent with previous measurements during monitoring.
  • the tidal volume can be used as opposed to the actual flow measurements. In this case, the flow measurements are reconstructed by multiplying the relevant marker speed by the ratio of tidal volume to maximum displacement for the breath in question.
  • the calibration of each subject with a spirometer on a few breaths can be replaced by calibration from a previously acquired database of subjects as described below.
  • Another method for calculating the tidal volume is through modeling the abdominal and thoracic cavity shape.
  • the shape can be assumed to be a double cylinder. Both cylinder walls expand during inhalation and contract during exhalation. The heights of the two cylinders change with breathing due to the movement of the diaphragm.
  • the thoracic cylinder and abdominal cylinder have different mean radii which can be approximated from the image by using edge detection on the image to find the body edges, at one or more body positions.
  • the known physical marker size can be used to convert these measurements to physical units.
  • the marker displacements and the angle of the camera are used to calculate tidal volumes through the model.
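The double-cylinder model above can be sketched as follows. This is a simplified illustration that considers only the radial expansion of each cylinder; the text also notes that the cylinder heights change with diaphragm movement, which is omitted here.

```python
import math

# Sketch: double-cylinder thorax/abdomen model; tidal volume from the
# radial expansion of each cylinder (heights held fixed for simplicity).
def cylinder_volume_change(radius, height, dr):
    """Volume gained when a cylinder of given radius/height expands by dr."""
    return math.pi * height * ((radius + dr) ** 2 - radius ** 2)

def double_cylinder_tidal_volume(r_thorax, h_thorax, dr_thorax,
                                 r_abd, h_abd, dr_abd):
    """Sum of the thoracic and abdominal cylinder volume changes, where the
    dr values come from the image-measured marker displacements."""
    return (cylinder_volume_change(r_thorax, h_thorax, dr_thorax) +
            cylinder_volume_change(r_abd, h_abd, dr_abd))
```

The radii would be approximated from the body edges in the image and converted to physical units via the known marker size, as described above.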
  • Yet another modeling technique can utilize the 3D modeling of the thorax as described herein above.
  • the extracted marker motion can be superimposed on the 3D model to extract the volume change during exhalation.
  • Eye Movements: Eyes, when open, can be detected from the video imaging without the necessity of markers.
  • small kinematic markers, including strain sensors placed on or above the eyelids or by the corner of the eye, such as by means of a sticker and wireless transmission, can be used. These small kinematic markers may be used for sleep staging, for quantifying eyelid movement to associate a specific eyelid movement pattern with a corresponding sleep stage, or for quantifying the continuity of sleep (eye openings). It will be appreciated that these kinematic markers may be used for sleep assessment during anesthetic procedures.
  • Posture Stability: The number of position changes during sleep can be tracked easily due to the markers that move in and out of the field of view. The actual body position at each time of sleep can be found, and alarms can be sent to a caregiver if a person needs to have his position changed (subjects with bedsores, infants who are not to be positioned in prone positions, etc.).
  • Bed Exits: The number of bed exits and bed entries can be detected and alarms set off in real time if, for example, the person does not return to bed or falls near the bed.
  • Heart Rate: The heart rate can be obtained from markers located near the chest. This is best done by finding peaks in the Fourier transform of the displacement signal of each marker in the range of, e.g., 40-140 bpm for adults.
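The band-limited spectral-peak search can be sketched as follows (the sampling rate and the synthetic displacement signal are illustrative assumptions, not values from the patent):

```python
import numpy as np

def heart_rate_bpm(displacement, fs, lo=40.0, hi=140.0):
    """Estimate heart rate as the dominant spectral peak of a marker
    displacement signal within a plausible beats-per-minute band."""
    x = displacement - np.mean(displacement)            # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs_bpm = np.fft.rfftfreq(len(x), d=1.0 / fs) * 60.0  # Hz -> bpm
    band = (freqs_bpm >= lo) & (freqs_bpm <= hi)
    return freqs_bpm[band][np.argmax(spectrum[band])]

# synthetic cardiac ripple at 72 bpm, sampled at 30 frames/s for 20 s
fs = 30.0
t = np.arange(0, 20, 1 / fs)
signal = 0.1 * np.sin(2 * np.pi * (72.0 / 60.0) * t)
print(heart_rate_bpm(signal, fs))  # ≈ 72
```

With a 20-second window the spectral resolution is 3 bpm, which is adequate for the 40-140 bpm band.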
  • Tremors/seizures: Continuous trackable motion in all the markers, as well as motion in a large fraction of pixels found within the body silhouette, indicates a tremor.
  • Eye micro-tremor: The characteristics of eye micro-movements are related to the level of sedation.
  • the described sensor can be used in conjunction with a marker to track these movements as follows:
  • a marker can be adhered to the eyelid or around the eye, such as a strain sensor or a thin retro-reflective adhesive sticker, or else IR-reflective polish can be applied to the eyelids or eyelashes.
  • An alternative is to attach the marker to the side of the face near the eye and have a thin flexible object connecting it to the eyelid. It is possible to track movements by detecting and tracking the boundaries between eyelashes and eyelids in the image. Since the movements are of high frequency (~80 Hz) and of small amplitude (up to several microns),
  • the choice of camera sensor should be made accordingly, and the camera should also be placed closer, to narrow the field of view.
  • One possibility is to attach the camera to an eyeglass frame rather than to the bed.
  • the eye-tremor and breathing monitors can be configured to use the same sensor or distinct sensors,
  • Head Movements: Often there are head movements arising from an obstruction, manifesting itself as snoring and increased respiratory effort.
  • Respiration is monitored in order for an early warning alert to be issued if a problem is about to occur.
  • the system produces an online early warning alert based on personalized per subject data and based on recent continuous measurements.
  • the ubiquitous problem with alert systems is that they are often unreliable and therefore lead to alarm fatigue, in which alarms are often ignored.
  • a system which provides a more reliable early warning, is more immune to artifacts and classifies the overall respiratory information sensed.
  • a quality measure is provided, which quantifies the reliability of the current measurement.
  • the early warning system is based on a multi-dimensional analysis of the respiratory waveforms and personalized to the current attributes of the subject being monitored. This should be contrasted with typical monitoring systems that track a single feature, for example the respiratory rate. In such systems, an alarm will typically be set off whenever the derived respiration rate falls below or exceeds fixed preset minimal and maximal thresholds.
  • the approach introduced here is different in that it is adaptive to the subject, yet reliable in that it is based on a large amount of data.
  • features of the subject are used (age, sex, thoracic size, etc.) and then various features are extracted from the recent measurements.
  • These can be the outputs from the video imaging system described above or from other devices or from a combination of several devices.
  • the raw signals may include those from an end-tidal CO2 capnograph, a pulse oximeter and the video system described above with a single marker.
  • the measurements can be impedance measurement made through leads attached to a subject.
  • the identity of the features, mathematical quantities derived from the measurements time series, are determined in the training stage described below.
  • a classifier is trained in this feature space during the training stage which is performed either in advance or online.
  • the classifier can be a two-class one that differentiates between normal subject behavior and abnormal behavior.
  • a score is assigned continuously based upon the measurement data, with 1 indicating typical baseline behavior while a lower score, e.g., 0.5, represents the edge of normal behavior below which outlier behavior is observed.
  • a quality score is calculated denoting how reliable the basic measurements are in cases where this is possible.
  • the features are calculated from the various signals on various timescales and typically depend on the physiological source of the signal.
  • the respiratory rate calculated from a 20 second time series signal of several of the video markers described above can be one of the features.
  • Other features can quantify the trend of physiological quantities, such as the derivative (trend) of the respiratory rate over consecutive overlapping 20 second intervals.
  • average amplitudes of the respiratory signal peaks and troughs over 20 second intervals and their derivative over time can be used.
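The windowed features above (respiratory rate per 20-second interval and its trend over overlapping intervals) can be sketched as follows; the simple local-maxima breath detector, the hop length and the synthetic signal are illustrative assumptions:

```python
import numpy as np

def count_peaks(x):
    # number of strict local maxima (a crude breath detector)
    return int(np.sum((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])))

def respiratory_features(sig, fs, win_s=20, hop_s=10):
    """Respiratory rate (breaths/min) per overlapping window, plus its
    trend as the first difference of the rate series."""
    win, hop = int(win_s * fs), int(hop_s * fs)
    rates = []
    for start in range(0, len(sig) - win + 1, hop):
        seg = sig[start:start + win]
        rates.append(count_peaks(seg) * 60.0 / win_s)
    rates = np.asarray(rates)
    return rates, np.diff(rates)

fs = 10.0
t = np.arange(0, 60, 1 / fs)
breathing = np.sin(2 * np.pi * 0.25 * t)   # steady 15 breaths/min
rates, trend = respiratory_features(breathing, fs)
print(rates, trend)  # constant 15.0 rates, zero trend
```

In practice the rate estimator would be the marker-based one described earlier; the windowing and differencing are the point of the sketch.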
  • the initial set of features can be reduced further in the training stage to reduce over-fitting of the classifier using standard methods.
  • a classifier is trained using training data, which consists of training vectors that consist of the set of features and the label of one of the classes to be classified.
  • the label may be "normal", "coughing", "sensor disconnected", "risky", and so forth. If sufficient data exists for more than one class of behavior, a multi-class classifier may be trained. Minority classes with few data points can be grouped into a larger class labeled "artifacts", for example. In the case where labeled data does not exist or predominantly belongs to a single class, a one-class classifier can be trained. Frameworks for learning such classifiers exist, for example the SVM one-class classifier (B. Schölkopf et al., Estimating the support of a high-dimensional distribution, Neural Computation, 13(7), 2001).
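As an illustration of the cited one-class approach, here is a sketch using scikit-learn's `OneClassSVM` (the two-dimensional feature vectors, their distributions and the `nu` value are all hypothetical):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# hypothetical "normal" training vectors: [resp. rate (bpm), mean amplitude]
normal = rng.normal(loc=[14.0, 1.0], scale=[1.5, 0.1], size=(500, 2))

# nu bounds the fraction of training points treated as outliers
clf = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(normal)

typical = np.array([[14.0, 1.0]])   # near the training centre
outlier = np.array([[4.0, 0.2]])    # bradypnoea-like, shallow breathing
print(clf.predict(typical), clf.predict(outlier))
```

A point near the training centre should be scored +1 (inside the learned support) and the far-away point -1, mirroring the "normal" vs. outlier scoring described above.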
  • Standard feature selection methods can be used to reduce the dimensionality of the classifier problem, such as PCA, and methods that rank the features according to their discrimination ability.
  • the foregoing applies in general to training a classifier based on a fixed training set that is collected a priori to monitoring. However, if the data is collected a priori, it may not be relevant to the baseline characteristics of the subject in question, thus leading to an inaccurate classifier.
  • An alternative would be to train the classifier based on the initial period of monitoring the subject, e.g., the first few minutes when the medical staff is in the vicinity. However, this greatly limits the amount of training data that can be used and thus can severely limit the generalizability of the classifier due to over-fitting. Again the classifier results may not be reliable.
  • Our proposed methods involve forming training sets which are relevant to the subject in question and enable the incorporation of a large amount of data.
  • Augmenting training sets online: The data from a monitored subject can be added to the database as time segments of, e.g., 10 minutes with all their features.
  • the periods of respiratory depression can be labeled according to preset conditions such as respiratory rate < 8 or minute volume < 3 liters. Using such a criterion, all the data can be labeled according to how long they precede a respiratory event. This data can be used in future training sets of the classifier, since it can be labeled as "normal" or "risky" depending on how long before a respiratory event it was extracted.
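This retrospective labeling can be sketched as follows (times are in minutes; the "risky" look-back window length is a hypothetical parameter, not a value from the patent):

```python
def label_segments(segment_times, event_times, risky_window_min=30):
    """Label each data segment 'risky' if a respiratory-depression event
    occurs within risky_window_min minutes after it, else 'normal'."""
    labels = []
    for t in segment_times:
        upcoming = [e for e in event_times if 0 <= e - t <= risky_window_min]
        labels.append("risky" if upcoming else "normal")
    return labels

# segments recorded at t = 0, 20, 50 min; one event at t = 45 min
print(label_segments([0, 20, 50], [45]))  # ['normal', 'risky', 'normal']
```

Segments recorded after the event, or long before it, remain "normal", matching the description of labeling by time-to-event.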
  • Subject training set selection "on the fly” One method is to train a classifier based on data from a subset of the total available subjects. This subset is chosen so that it is "closest" to the subject in question.
  • One way this can be done is by carrying out a clustering procedure (e.g., mean-shift clustering) in the feature space composed of the data from all the subjects, including the initial data of the subject in question.
  • the subjects, whose data falls mainly in the same cluster as those of the subject are those used as the training set.
  • the clustering can be carried out many times with different parameter values; the number of times each subject falls in the same cluster as the current subject is recorded.
  • Subjects that fall, in a majority of the experiments, in the same cluster as the subject in question are used for the classifier's training set.
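The repeated-clustering vote can be sketched as follows, with scikit-learn's `MeanShift` standing in for the mean-shift clustering mentioned above (the feature values, bandwidths and database sizes are hypothetical):

```python
import numpy as np
from sklearn.cluster import MeanShift

def cocluster_votes(subject_feats, db_feats, bandwidths):
    """Cluster the subject together with the database subjects once per
    bandwidth; count how often each database subject shares the
    subject's cluster."""
    X = np.vstack([subject_feats[None, :], db_feats])
    votes = np.zeros(len(db_feats), dtype=int)
    for bw in bandwidths:
        labels = MeanShift(bandwidth=bw).fit_predict(X)
        votes += labels[1:] == labels[0]   # row 0 is the current subject
    return votes

rng = np.random.default_rng(1)
subject = np.array([14.0, 1.0])                   # e.g. [rate, amplitude]
near = rng.normal(subject, 0.2, size=(5, 2))      # similar subjects
far = rng.normal([30.0, 4.0], 0.2, size=(5, 2))   # dissimilar subjects
bandwidths = [1.0, 2.0, 4.0]
votes = cocluster_votes(subject, np.vstack([near, far]), bandwidths)
# keep subjects that co-cluster with ours in a majority of the runs
training_set = np.flatnonzero(votes > len(bandwidths) / 2)
print(training_set)  # indices of the five similar subjects
```

The majority-vote threshold implements "fall, in a majority of the experiments, in the same cluster".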
  • Training set selection "a priori": The data from all subjects can be clustered a priori as in method 1 to obtain subsets of subjects with similar physiological signal features. More than one clustering experiment can be used to produce additional subsets that may intersect previous clusters. Each such cluster is a candidate training set, and the one chosen is the one closest to the current subject. This subset can be chosen according to the distance of the cluster's center of mass from the mean value of the current subject's features. In addition, differences in the principal component directions in feature space can be incorporated into the distance measure between the two distributions.
  • Another method of finding the distance between the current subject and the subsets is to determine the overlap volume between each cluster distribution and the current subject's distribution. Each such distribution can be approximated by an ellipsoid using the covariance matrix and the center of mass in feature space. Additional criteria that can be taken into account in calculating the closeness are subject attributes such as age, medical status, sex and baseline vital signs. 4. Transforming the training set: Another option for utilizing the original training set for the new subject is to scale the individual subjects to normalized coordinates. For example, a feature value f can be transformed as follows:
  • the outputs of the training stage are mathematical formulae whose inputs are feature vectors and whose output is a class probability - the probability that the feature vector belongs to a specific class of the classifier.
  • regions of the feature space that are associated with non- normal respiration patterns are assigned informative clinical descriptions; such as "Apnea”, "Shallow Breathing", etc. These regions and abnormalities are assigned a priori based on pre-assigned rules (based on expert rules).
  • an initial class is assigned continuously based on the class with the highest score of the learned functions.
  • the normalization factor is updated during runtime on the previous data of the subject in question.
  • a quality measure is associated continuously in time. For the video sensor, whenever non-respiratory body motion is identified, the quality score is reduced from 1. The amount that it is reduced depends on the prediction error of the motion, the larger the error the more it is reduced (the minimal quality measure is 0). Also if periods of coughing or talking are identified using other sensors (for example acoustic sensors), the quality measure is also reduced. Any indication of artifacts in one of the sensor signals should be used to reduce the quality measure of the other sensor signals.
  • the suspected anomaly can be output based on the feature values and predetermined rules. Examples of such rules are:
  • a class output that is neither "normal” nor “unknown” can trigger an alarm or warning to be issued.
  • a delay time e.g. 10 seconds
  • Alarms can be audible with sound associated with severity and type of breathing pattern. They can be communicated to a variety of output devices either by a wired or wireless connection, such as to a bedside monitor, a handheld monitor, a mobile unit or a central nursing station.
  • the camera may need to be calibrated once, when first connected, using known grid-pattern targets (without a subject), for accurate reconstruction of 3D marker and subject coordinates. In some embodiments the camera is used at two positions to mimic stereo acquisition. Here too the camera should be calibrated with respect to a fixed known physical baseline (for instance, a rod placed on the bed). These calibrations can be achieved with many available tools (see for example: http://www.vision.caltech.edu/bouguetj/calib_doc/htmls/example5.html).
  • the normal baseline tidal volume for an individual depends on many factors such as: age, sex, obesity, pose, disease conditions, time evolved since anesthesia, etc.
  • One possibility for calibration of baseline tidal volume of a subject is to carry out a calibration procedure of the above video device versus a spirometer which measures continuous pulmonary flow rate of air inhaled and exhaled.
  • additional calibration measurements are needed to be carried out on the subject. These tests can be done prior to surgery or to an invasive or imaging procedure, for patients who are to be monitored during or post procedure/surgery.
  • An alternative to the calibration per subject is to learn/build a model on several subjects or subject simulators a priori in a training session in which several breathing patterns and poses are recorded both with the video device and a spirometer.
  • other physiological data may be collected on these training subjects using movement sensors such as an image sensor, kinematic sensors, specific covering (for example a snug sheet or undergarments only) and possibly additional markers.
  • Silhouette of the trunk area which moves during breathing: This can be achieved in a number of ways, but most simply by looking at differences between video frames taken at opposite phases of a respiratory cycle while the scene is illuminated. Pixels which change significantly more than they do between consecutive frames participate in respiration. These pixels can be post-processed into a smooth region by region-growing techniques.
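The opposite-phase differencing can be sketched as follows (the threshold factor, frame sizes and synthetic intensities are illustrative assumptions):

```python
import numpy as np

def breathing_silhouette(inhale_frame, exhale_frame, consec_diff, k=3.0):
    """Mark pixels whose change between opposite-phase frames greatly
    exceeds the typical consecutive-frame change (the noise floor)."""
    phase_diff = np.abs(inhale_frame.astype(float) - exhale_frame.astype(float))
    noise = np.abs(consec_diff).mean() + 1e-9   # avoid zero threshold
    return phase_diff > k * noise

# synthetic 8x8 scene: a 3x3 trunk patch brightens by 40 grey levels
exhale = np.full((8, 8), 100.0)
inhale = exhale.copy()
inhale[2:5, 2:5] += 40.0
consec = np.random.default_rng(0).normal(0.0, 1.0, (8, 8))  # sensor noise
mask = breathing_silhouette(inhale, exhale, consec)
print(mask.sum())  # 9 pixels participate in respiration
```

The resulting mask would then be smoothed by region growing, as described above.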
  • the spirometer flow rates are fit to a function of the marker motion signals.
  • the number of data points entering the fit depends on the sensor and spirometer sampling rates, which are of the order of tens of samples per second, so that a 5-minute trial, say, involves several thousand data points. This is repeated for additional subject poses (lying on the side, for example).
  • the volume function can be the result of a non-linear regression scheme, e.g. SVM regression (B. Schölkopf and A. J. Smola, Learning with Kernels, MIT Press, 2002).
  • the model learned can be a linear function of the displacements; the phases can be taken into account by using signed displacements which are determined by setting a reference zero displacement for each marker.
  • One way the zero is assigned is by computing a moving average of the raw displacements over the last few respiratory cycles (say 5 breaths for example).
  • the signed displacement is obtained by subtraction of the reference zero displacement from the measured one at each time point.
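The zero-reference subtraction can be sketched as follows; for simplicity a centered moving average is used here, whereas a strictly causal version would average only the last few breaths, and the sampling rate, respiratory rate and cycle count are illustrative assumptions:

```python
import numpy as np

def signed_displacement(raw, fs, resp_rate_hz=0.25, cycles=5):
    """Signed displacement: raw marker displacement minus a moving-average
    'zero' reference spanning roughly the last few respiratory cycles."""
    win = int(cycles / resp_rate_hz * fs)       # samples in ~5 breaths
    baseline = np.convolve(raw, np.ones(win) / win, mode="same")
    return raw - baseline

fs = 10.0
t = np.arange(0, 60, 1 / fs)
raw = 2.0 + 0.5 * np.sin(2 * np.pi * 0.25 * t)  # static offset + breathing
signed = signed_displacement(raw, fs)
```

Away from the edges the baseline converges to the 2.0 offset, so `signed` carries only the breathing excursion with its sign, as required for phase-aware volume models.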
  • the relative weights between markers can be based on their relative "reliability", which is provided by the variance in equi-volume measurements or the variance of displacements during stable tidal breathing.
  • the phases can be taken into account by learning different models for different scenarios of phase relations between the markers. For example one function for all markers in-phase, another one for chest and abdominal markers out of phase, and so forth.
  • a short automatic calibration session e.g., 30 seconds
  • a predefined protocol such as "subject lies on back”
  • various relevant measurements are made for an estimate of the subject's silhouette and 3D shape as in the training set.
  • other features related to trunk size are measured: These could be the distances between markers for example which can be extracted using the device's image frames or distances between body landmarks identified through image analysis or manually.
  • physiological features are recorded for new and database subjects such as age, sex, height and weight.
  • the "nearest-neighbor" database subjects are extracted from the database.
  • the distance between two subjects is measured, for instance, by measuring the distance between their trunk images (within the silhouette) as follows:
  • the two images are registered using body landmarks and markers locations placed in corresponding anatomical positions.
  • the registration is non-rigid but can be confined, for instance, to Affine transformations.
  • the distance between 2 images is determined by their scale factors (for example, by the difference of the geometric mean of the scale factors from 1).
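This scale-factor distance can be sketched as follows, taking the scale factors of the affine registration from the singular values of its 2x2 linear part (the example matrices are hypothetical):

```python
import numpy as np

def affine_scale_distance(A):
    """Distance between two registered trunk images, from the affine
    transform A mapping one onto the other: how far the geometric mean
    of its singular-value scale factors lies from 1."""
    s = np.linalg.svd(np.asarray(A)[:2, :2], compute_uv=False)
    return abs(np.sqrt(s[0] * s[1]) - 1.0)

# identical trunks -> identity affine -> distance 0
assert affine_scale_distance(np.eye(3)) == 0.0
# one trunk uniformly 1.2x larger than the other
dist = affine_scale_distance(np.diag([1.2, 1.2, 1.0]))
print(dist)  # ≈ 0.2
```

Using singular values makes the distance insensitive to the rotation and translation parts of the registration.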
  • the marker movements and breathing volumes change as a function of subject pose.
  • the pose of a subject is deduced in real time as described herein.
  • the relevant database estimation for the volume is used.
  • the relevant "nearest neighbors" from the database are selected during the initial auto-calibration session of the test subject.
  • Fig. 1 is a flow chart of the monitoring step, which starts with the input of the first few frames 101, after which, in step 102, the markers are detected according to the procedure of Fig. 2. The detection step is followed by the input of the next frame 103, after which, in step 104, the markers are tracked over time as detailed in Fig. 3. From the frames the subject's pose is extracted in step 105, and the respiration properties are calculated in step 106, as further detailed in Fig. 4. At the end of the calculation process the system checks, at 107, whether tracking has been lost; in the affirmative case the process is started anew at step 101, while in the negative case the next frame is input in step 103.
  • Fig. 2 details the detection steps.
  • the system already has at least one frame and the detection starts by getting the new frame, 201.
  • a comparison step 202 is performed to determine whether the frame is stable compared to the previous one. In the negative case another frame is obtained and the process starts again at step 201, while in the affirmative case the frame is set as a reference frame and in step 203 all pixels greater than a threshold value are located.
  • the pixels are then clustered into "clouds" in step 204 and the clouds are filtered according to their features in step 205.
  • In step 206 the markers (clouds) are tracked from the reference frame to the current frame, and the success of this tracking is verified in step 207. If the tracking was unsuccessful, the process starts again at step 201, while if it was successful and t seconds have elapsed since the reference frame (verified in step 208), tracking of markers whose motion is consistent (similar frequency) is continued in step 209, and the analysis of marker movement begins in step 210. If the time that has passed is less than t seconds, the next frame is obtained in step 211 and the markers are checked again in step 206.
  • Fig. 3 is a flowchart of the tracking procedure, which starts by setting the first frame as the reference frame at 301.
  • step 303 checks whether t seconds have passed since the reference frame, in which case the last frame is set to be the reference frame in step 304.
  • step 305 checks whether the brightness has changed relative to the reference frame, in which case, again, the frame is set to be the reference frame in step 304.
  • the 2-D motion vectors are calculated between the current and the reference frame in step 306. This also happens after the last frame has been set to be the reference frame in step 304.
  • the system checks in step 307 whether the tracking is successful, using criteria such as making sure that the extent of the marker movement is not too large, that the pixel movements within the marker are consistent, and that the mean-square error of the numerical fit is not too large; in the affirmative case it goes back to step 302 and gets the next frame. If tracking was unsuccessful, the detection process of Fig. 2 is performed again in step 308.
  • Fig. 4 is a flowchart of the analysis process.
  • Step 402 calculates the projection of the marker location onto the current principal-component direction from the last t seconds. If a new peak is identified in the principal-component function over time, in step 403, the 3D position of the marker at the time of the "New Peak" frame is estimated in step 404; otherwise, step 401 is repeated.
  • the 3D position of the marker at all frames between the two most recent peaks is estimated by scaling the principal component graph in step 405 and the RR, phase delay and the volumes for all frames between the most recent two peaks is calculated in step 406, after which the process restarts at step 401.
  • Fig. 5 is a flowchart of a volume calibration procedure that may be carried out according to some embodiments of the invention.
  • the calibration process begins in step 501, where mechanical calibration is performed. Then, in step 502, the subject is recorded from each camera in the supine position under various illumination conditions. In step 503 the markers' positions on the subject are tracked to obtain 3-D displacement vectors, and in step 504 the silhouette and the 3D shape of the subject's trunk are extracted. In step 505 the "nearest neighbor" database subject is found by registering the silhouette and the 3D shape via a non-rigid transformation. In step 506 the scaling factor between the subject and his nearest neighbor is determined. The calibration function is found in step 507 by using the nearest neighbor's estimator function with the current subject's scaled displacements.
  • Fig. 6 illustrates the main parts of a marker according to one embodiment of the invention, consisting of LED assembly 601, which contains an LED together with the circuitry needed to operate it. Operating LEDs is conventional in the art and well known to the skilled person, and therefore said circuitry is not shown in detail, for the sake of brevity.
  • Assembly 601, in this embodiment of the invention is meant for multiple uses and operates together with disposable part 602, which in some embodiments of the invention is applied directly to the subject's skin, e.g. using an adhesive surface provided at its bottom.
  • disposable part 602 is located above a fabric 603, which may be, for instance, a subject's pajama or a blanket, and is kept in position relative to said fabric by a lower base 604, which is maintained in position on the subject's body by using an adhesive material located on its bottom 605.
  • Lower base 604 can be coupled to the bottom 606 of disposable part 602 in a variety of ways, e.g. by magnetic coupling between the two, or by providing mechanical coupling, e.g., through pins that perforate and pass through fabric 603 (not shown).
  • disposable part 602 may contain batteries needed to operate the LED, as well as, if desired, additional circuitry.
  • Reusable LED assembly 601 is connected to disposable part 602 through seats 607 and 607', which engage buttons 608 and 608' located in disposable part 602.
  • An additional positioning pin 609 can be provided in LED assembly 601, to engage hole 610 in disposable part 602.
  • Electrical contact can be provided from the power supply located in disposable part 602, through buttons 608 and 608', although of course many alternative ways exist of conveying power from the batteries to the LED assembly 601.
  • Fig. 7 is an operationally connected view of the various elements of Fig. 6, the same numerals being used to indicate the same parts.
  • Fig. 8 is a top view of the assembled marker device of Fig. 7, and Fig. 9 is a cross-section of the same device assembly, taken along the AA line of Fig. 8.
  • a compartment 611 is seen, which houses the LED, as well as electronics used to operate it.
  • the top portion 612 of LED assembly 601 is made of material of a transparency sufficient to allow the required amount of light generated by the LED to be viewed from the outside.
  • Fig. 10 shows the disposable part 602 of Figs. 6 - 9 in exploded view.
  • This disposable assembly is made of a plurality of elements and layers, kept together by gluing or by mechanical connection, for instance when upper buttons 608 and 608' are connected with lower buttons 613 and 613'.
  • a battery 614 is provided as the power supply to the LED assembly, which can be, for instance, a standard 2030 disc battery.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pulmonology (AREA)
  • Dentistry (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Cardiology (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A system for monitoring the respiratory activity of a subject, comprising one or more displacement sensors, applied to the thorax of a subject, for producing first signals indicative of the displacement of the subject's thorax; a receiver for receiving the first signals produced during a breathing movement of the subject; and one or more computing devices in data communication with the receiver, for analyzing the breathing movement. The computing device serves to produce a first breathing pattern from the first signals; to divide each respiratory cycle undergone by the subject and defined by the first pattern into a plurality of portions, each portion being delimited by two different points in time, and to calculate, for each of the plurality of portions of a given respiratory cycle of the first pattern, a slope representing a thorax velocity; to derive, from the given respiratory cycle of the first pattern, a pulmonary airflow rate of the subject during predefined portions of the respiratory cycle; to compare corresponding portions of the first pattern with average flow rates during different phases of the respiratory cycle, in order to calibrate the subject's thorax velocities against pulmonary airflow rates; and to determine respiratory characteristics of the subject for subsequent respiratory cycles undergone by the subject, on the basis of a calculated thorax velocity and of the calibration.
EP17785579.8A 2016-04-22 2017-04-21 Dispositif de surveillance de mouvement corporel Pending EP3445242A4 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/135,797 US9788762B2 (en) 2013-10-24 2016-04-22 Motion monitor
PCT/IL2017/050466 WO2017183039A1 (fr) 2013-10-24 2017-04-21 Dispositif de surveillance de mouvement corporel

Publications (2)

Publication Number Publication Date
EP3445242A1 true EP3445242A1 (fr) 2019-02-27
EP3445242A4 EP3445242A4 (fr) 2019-12-04

Family

ID=65036514

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17785579.8A Pending EP3445242A4 (fr) 2016-04-22 2017-04-21 Dispositif de surveillance de mouvement corporel

Country Status (1)

Country Link
EP (1) EP3445242A4 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6997882B1 (en) * 2001-12-21 2006-02-14 Barron Associates, Inc. 6-DOF subject-monitoring device and method
GB201009379D0 (en) * 2010-06-04 2010-07-21 Univ Edinburgh Method, apparatus, computer program and system for measuring oscillatory motion
US20160029949A1 (en) * 2013-03-25 2016-02-04 Technion Research & Development Foundation Ltd. Apnea and hypoventilation analyzer
EP2974648A1 (fr) * 2013-12-18 2016-01-20 Analog Devices Global Système et procédé pour mesurer la respiration au moyen d'accéléromètres

Also Published As

Publication number Publication date
EP3445242A4 (fr) 2019-12-04

Similar Documents

Publication Publication Date Title
US11612338B2 (en) Body motion monitor
US10506952B2 (en) Motion monitor
US20200237261A1 (en) Apparatus and method for the detection of the body position while sleeping
Liu et al. Recent development of respiratory rate measurement technologies
CN111565638B (zh) 用于基于视频的非接触式潮气容积监测的系统和方法
US20200260996A1 (en) Method and apparatus for processing a cyclic physiological signal
US10219739B2 (en) Breathing pattern identification for respiratory function assessment
Li et al. Noncontact vision-based cardiopulmonary monitoring in different sleeping positions
Liu et al. Breathsens: A continuous on-bed respiratory monitoring system with torso localization using an unobtrusive pressure sensing array
CN102869305A (zh) 呼吸运动检测装置
US20200260998A1 (en) Monitoring system
CN106413533A (zh) 用于检测对象的呼吸暂停的设备、系统和方法
Zhang et al. Monitoring cardio-respiratory and posture movements during sleep: What can be achieved by a single motion sensor
Chatterjee et al. Real-time respiration rate measurement from thoracoabdominal movement with a consumer grade camera
CN115334959A (zh) 用于呼吸暂停-低通气指数计算的睡眠状态检测
JP2021506543A (ja) 肺疾患患者のスクリーニングツール
Matar et al. Kalman filtering for posture-adaptive in-bed breathing rate monitoring using bed-sheet pressure sensors
Nesar et al. Improving touchless respiratory monitoring via lidar orientation and thermal imaging
Chatterjee et al. Real-time visual respiration rate estimation with dynamic scene adaptation
Loblaw et al. Remote respiratory sensing with an infrared camera using the Kinect (TM) infrared projector
EP3445242A1 (fr) Dispositif de surveillance de mouvement corporel
JP2022501103A (ja) 骨格モデルを提供するための装置、システム及び方法
Wu et al. An intelligent in-shoe system for real-time gait monitoring and analysis
Jakkaew An Approach to Non-contact Respiration
Slastnikov et al. Review of Instrumental Methods for Detecting the Respiratory Signal

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20181119

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20191105

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 5/087 20060101AFI20191029BHEP

Ipc: A61B 5/091 20060101ALI20191029BHEP

Ipc: A61B 5/08 20060101ALI20191029BHEP

Ipc: A61B 5/113 20060101ALI20191029BHEP

Ipc: A61B 3/113 20060101ALI20191029BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20230213