US20150094606A1 - Breathing pattern identification for respiratory function assessment - Google Patents


Info

Publication number
US20150094606A1
US20150094606A1 (application US 14/044,043; published as US 2015/0094606 A1)
Authority
US
United States
Prior art keywords
subject
breathing
depth
camera
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/044,043
Inventor
Lalit Keshav MESTHA
Eribaweimon SHILLA
Edgar A. Bernal
Graham S. Pennington
Himanshu J. MADHU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xerox Corp
Original Assignee
Xerox Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xerox Corp filed Critical Xerox Corp
Priority to US14/044,043 priority Critical patent/US20150094606A1/en
Assigned to XEROX CORPORATION reassignment XEROX CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHILLA, ERIBAWEIMON, MADHU, HIMANSHU J., PENNINGTON, GRAHAM S., BERNAL, EDGAR A., MESTHA, LALIT KESHAV
Priority to US14/223,402 priority patent/US10201293B2/en
Priority to DE102014219495.4A priority patent/DE102014219495A1/en
Priority to US14/553,659 priority patent/US10219739B2/en
Publication of US20150094606A1 publication Critical patent/US20150094606A1/en
Priority to DE102015222498.8A priority patent/DE102015222498A1/en
Abandoned legal-status Critical Current


Classifications

    • A: HUMAN NECESSITIES › A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE › A61B: DIAGNOSIS; SURGERY; IDENTIFICATION › A61B 5/00: Measuring for diagnostic purposes; identification of persons
    • A61B 5/4818: Sleep apnoea (under A61B 5/48 Other medical applications › A61B 5/4806 Sleep evaluation)
    • A61B 5/0075: Diagnosis using light, by spectroscopy, i.e. measuring spectra (e.g. Raman spectroscopy, infrared absorption spectroscopy)
    • A61B 5/0077: Devices for viewing the surface of the body (e.g. camera, magnifying lens)
    • A61B 5/1073: Measuring volume (e.g. of limbs)
    • A61B 5/1128: Measuring movement of the body or parts thereof using image analysis
    • A61B 5/113: Measuring movement of the body or parts thereof occurring during breathing
    • A61B 5/6898: Sensors mounted on portable consumer electronic devices (e.g. music players, telephones, tablet computers)
    • A61B 5/7203: Signal processing for noise prevention, reduction or removal
    • A61B 5/7246: Waveform analysis using correlation (e.g. template matching or determination of similarity)
    • A61B 5/7264: Classification of physiological signals or data (e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems)
    • A61B 5/7282: Event detection (e.g. detecting unique waveforms indicative of a medical condition)
    • A61B 5/742: Notification to user using visual displays
    • A61B 2576/00: Medical imaging apparatus involving image processing or analysis
    • A61B 5/0013: Remote monitoring via telemetry of medical image data
    • A61B 5/0022: Monitoring a patient using a global network (e.g. telephone networks, internet)
    • A61B 5/746: Alarms related to a physiological condition (e.g. setting alarm thresholds or avoiding false alarms)
    • A61B 5/7465: Interactive communication between patient and care services (e.g. by telephone network)
    • A61B 5/747: Interactive communication with care services in case of emergency (i.e. alerting emergency services)

Definitions

  • The present invention is directed to systems and methods for identifying a patient's breathing pattern for respiratory function assessment.
  • A time-varying sequence of depth maps is received of a target region of a subject of interest over a period of inspiration and expiration.
  • The depth maps are processed to obtain a breathing signal for the subject which comprises a temporal sequence of instantaneous volumes across time intervals during inspiratory and expiratory breathing.
  • One or more segments of the breathing signal are then compared against reference breathing signals, each associated with a known pattern of breathing. As a result of the comparison, a breathing pattern for the subject is identified.
  • The identified breathing pattern is used to assess the subject's respiratory function.
  • FIG. 1 shows an anterior (front) view and a posterior (back) view of a subject of interest intended to be monitored for respiratory function assessment in accordance with the teachings hereof;
  • FIG. 2 shows the subject of FIG. 1 having a plurality of reflective marks arrayed in a uniform grid on their anterior thoracic region and on their posterior thoracic region;
  • FIG. 3 shows the subject of FIG. 1 wearing a shirt with a uniform pattern of reflective dots arrayed in a uniform grid with a one-inch dot pitch along the horizontal and vertical directions;
  • FIG. 4 illustrates one embodiment of an example image-based depth sensing device acquiring video images of the target region of the subject of FIG. 3 being monitored for respiratory function assessment;
  • FIG. 5 is a flow diagram which illustrates one example embodiment of the present method for identifying a breathing pattern of a subject for respiratory function assessment in a remote sensing environment;
  • FIG. 6 is a continuation of the flow diagram of FIG. 5 with flow continuing with respect to nodes A or B;
  • FIG. 7 is a functional block diagram of an example networked system for implementing various aspects of the present method described with respect to the flow diagrams of FIGS. 5 and 6;
  • FIG. 8 shows an example breathing pattern associated with normal breathing (eupnea) as observed under resting conditions;
  • FIG. 9 shows an example Bradypnea breathing pattern characterized by an unusually slow rate of breathing;
  • FIG. 10 shows an example Tachypnea breathing pattern characterized by an unusually fast respiratory rate;
  • FIG. 11 shows an example Hypopnea breathing pattern characterized by abnormally shallow and slow respiration;
  • FIG. 12 shows an example Hyperpnea breathing pattern characterized by exaggerated deep, rapid, or labored respiration;
  • FIG. 13 shows an example Thoracoabdominal breathing pattern that involves trunk musculature to “suck” air into the lungs for pulmonary ventilation;
  • FIG. 14 shows an example Kussmaul breathing pattern characterized by rapid, deep breathing due to a stimulation of the respiratory center of the brain triggered by a drop in pH;
  • FIG. 15 shows an example Cheyne-Stokes respiration pattern which is characterized by a crescendo-decrescendo pattern of breathing followed by a period of central apnea;
  • FIG. 16 shows an example Biot's respiration pattern which is characterized by abrupt and irregularly alternating periods of apnea with periods of breathing that are consistent in rate and depth;
  • FIG. 17 shows an example Ataxic breathing pattern, a completely irregular breathing pattern with continually variable rate and depth of breathing;
  • FIG. 18 shows an example Apneustic breathing pattern characterized by a prolonged inspiratory phase followed by expiratory apnea;
  • FIG. 19 shows an example Agonal breathing pattern, an abnormally shallow breathing pattern often related to cardiac arrest;
  • FIG. 20 shows a normal respiration pattern measured via the use of a depth sensing device with the depth maps being processed in accordance with the teachings hereof;
  • FIG. 21 shows a test subject's Cheyne-Stokes breathing pattern measured using the techniques disclosed herein;
  • FIG. 22 shows a test subject's Biot's pattern measured using the techniques disclosed herein;
  • FIG. 23 shows a test subject's Apneustic pattern measured using the present methods;
  • FIG. 24 shows a test subject's Agonal breathing pattern measured using the present methods.
  • What is disclosed is a system and method for identifying a patient's breathing pattern for respiratory function assessment without contact and with a depth-capable imaging system.
  • A “subject of interest” refers to a person being monitored for respiratory function assessment. It should be appreciated that the use of the terms “human”, “person”, or “patient” herein is not to be viewed as limiting the scope of the appended claims solely to human subjects.
  • A “target region” refers to an area or region of the subject where respiratory function can be assessed.
  • The target region may be a subject's anterior thoracic region, a region of the subject's dorsal body, and/or a side view containing the subject's thoracic region.
  • A target region can be any view of a region of the subject's body which can facilitate respiratory function assessment.
  • FIG. 1 shows an anterior (frontal) view which outlines a target region 102 comprising the subject's anterior thoracic region.
  • Target region 103 is of the subject's posterior thoracic region.
  • Respiration is the process of inhaling air into the lungs and exhaling it out of the lungs, followed by a post-expiratory pause.
  • Inhalation is an active process: the contraction of a relatively large muscle (the diaphragm) induces a negative pressure in the chest cavity, which changes the pressure in the lungs by forcibly expanding the region of the lungs where gas exchange takes place (i.e., the alveolar cells).
  • Exhalation is a passive process in which air is expelled from the lungs by the natural elastic recoil of the stretched alveolar cells.
  • The lining of the alveolar cells has a surface-active phospholipoprotein complex which causes the lining of the lungs to naturally contract back to a neutral state once the external force stretching the cells is released.
  • A post-expiratory pause occurs when the pressure between the lungs and the atmosphere equalizes.
  • “Inspiration” occurs when the subject forces the expansion of the thoracic cavity to bring air into their lungs.
  • A maximally forced inspiratory breath occurs when the subject cannot bring any more air into their lungs.
  • “Expiration” occurs when the subject forces the contraction of the thoracic cavity to expel air out of their lungs.
  • A maximally forced expiratory breath occurs when the subject cannot expel any more air from their lungs.
  • “Depth map sequence” is a reconstructed temporal sequence of 3D surface maps of a target region of a subject.
  • A depth map may be constructed based on the amount of deformation in a known pattern comprising, for instance, structured patterns of light projected onto the target region, or textural characteristics present on the target region itself, such as skin blemishes, scars, and markings, which are detectable by a video camera's detector array.
  • FIG. 2 shows a subject of interest 201 having a plurality of reflective marks arrayed in a uniform pattern 202 on an anterior thoracic region.
  • Subject 203 is shown having a plurality of emissive marks such as LEDs arrayed in a uniform pattern 204 on their posterior thoracic region.
  • The pattern may alternatively be an array of reflective or emissive marks imprinted on or otherwise fixed to an item of clothing worn by the subject, which emit or reflect a wavelength range detectable by sensors in a video camera's detector array.
  • Reflective marks may be dots of reflective tape, reflective buttons, reflective fabric, or the like.
  • Emissive marks may be LED illuminators sewn or fixed to the shirt.
  • Subject 300 is shown wearing shirt 301 with a uniform pattern of reflective dots arrayed in a uniform grid with a one-inch dot pitch along the horizontal and vertical directions.
  • The pattern may be a uniform grid, a non-uniform grid, a textured pattern, or a pseudo-random pattern, so long as the pattern's spatial characteristics are known a priori.
  • Higher-resolution patterns are preferable for reconstruction of higher resolution depth maps.
  • Depth maps may be obtained from video images captured using an image-based depth sensing device comprising any of: a red green blue depth (RGBD) camera, an infrared depth camera, a passive stereo camera, an array of cameras, an active stereo camera, or a 2D monocular video camera.
  • Depth maps may also be obtained from data acquired by non-image-based depth sensing devices such as a LADAR device, a LiDAR device, a photo wave device, or a time-of-flight measurement device as a depth measuring system. Depth maps can be obtained from data obtained by any of a wide variety of depth-capable sensing devices or 3D reconstruction techniques.
  • Receiving depth maps is intended to be widely construed and includes downloading, uploading, estimating, measuring, obtaining, or otherwise retrieving them from a memory, hard drive, CD-ROM, or DVD.
  • The depth maps are measured with a depth-capable sensing device. It should be appreciated that depth maps can be obtained by using a camera to capture images of the subject while illuminated by a projected pattern of structured light, the camera being sensitive to the wavelength range of the structured light. The depth maps are then generated by comparing the spatial characteristics of the reflections introduced by movement of the subject's chest cage to the known spatial characteristics of the projected patterns, in conjunction with the known distance between the light projector and the camera, and using the characterized distortions at different locations to calculate the depth map for each image in the video.
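The projector-camera triangulation just described can be sketched in a few lines. This is an illustrative simplification, not the patent's implementation; the function name and the example focal length and baseline values are assumptions.

```python
# Hypothetical sketch: depth recovered from the lateral shift (disparity) of a
# projected structured-light feature, given a calibrated projector-camera pair.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate depth (in meters) from the observed pattern shift in pixels,
    the camera focal length in pixels, and the projector-camera baseline."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: a dot shifted 40 px, with a 600 px focal length and a 7.5 cm baseline.
z = depth_from_disparity(40.0, 600.0, 0.075)   # 1.125 m
```

Repeating this per pattern element, per frame, yields the time-varying sequence of depth maps the method consumes.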
  • Depth maps can be generated using distortions in patterned clothing worn by the subject as taught in the above-incorporated reference by Bernal et al.
  • The embodiments herein are discussed with respect to the patterned-clothing embodiment.
  • A “reference breathing signal” refers to a volume signal that is associated with a known pattern of breathing. By comparing one or more segments of the subject's breathing signal against reference breathing signals associated with known breathing patterns, a pattern can be identified for the subject's breathing.
  • The reference breathing signal can be retrieved from, for example, a memory or a storage device such as a hard drive or removable media, or received from a remote device over a wired or wireless network.
  • The reference breathing signals may be volume signals generated using the depth-capable sensor in a simulated environment by a respiratory expert. They can also be generated using the depth-capable sensor on patients with identified respiratory diseases.
  • A “subject's breathing signal” refers to a temporal sequence of instantaneous volumes across time intervals during a period of inspiratory and expiratory breathing.
  • Instantaneous volumes are obtained by processing the depth maps.
  • The depth map comprises a 3D hull defined by a set of 3D coordinates, namely horizontal, vertical, and depth coordinates (x, y, and z, respectively). Points in the hull can be used to form a triangular tessellation of the target area. By definition of a tessellation, the triangles fill the whole surface and do not overlap.
  • The coordinates of an anchor point at a given depth are computed.
  • The anchor point can be located on a reference surface, for example, the surface on which the subject lies.
  • The anchor point in conjunction with the depth map defines a 3D hull which has a volume.
  • Alternatively, the coordinates of points on an anchor surface corresponding to the set of depths of a reference surface can be computed.
  • The anchor surface in conjunction with the depth map also defines a 3D hull which has a volume.
  • A volume can be computed for each 3D hull obtained from each depth map.
  • A concatenation of all sequential volumes forms a temporal sequence of instantaneous volumes across time intervals during inspiration and expiration.
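The volume computation above can be sketched with a simplified discretization: instead of an explicit triangular tessellation, each depth-map sample is treated as a small rectangular column between the anchor surface and the measured chest surface. The function names, units, and sample data are illustrative assumptions, not the patent's exact procedure.

```python
# Simplified sketch: approximate the 3D hull volume between an anchor plane
# and the measured surface by summing per-pixel columns.

def instantaneous_volume(depth_map, anchor_depth, pixel_area):
    """depth_map is a 2D list of depths (same units as anchor_depth);
    pixel_area is the surface area one sample covers."""
    volume = 0.0
    for row in depth_map:
        for z in row:
            # Only count material between the surface and the anchor plane.
            volume += max(anchor_depth - z, 0.0) * pixel_area
    return volume

# A breathing signal is the concatenation of per-frame volumes:
frames = [
    [[0.9, 0.9], [0.9, 0.9]],      # chest closer to camera -> larger hull
    [[0.95, 0.95], [0.95, 0.95]],  # chest farther away -> smaller hull
]
signal = [instantaneous_volume(f, anchor_depth=1.0, pixel_area=1e-4) for f in frames]
```

A tessellation-based implementation would integrate over triangles instead of rectangular columns, but the per-frame-volume-then-concatenate structure is the same.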
  • The signal can be de-trended to remove low-frequency variations and smoothed using a Fast Fourier Transform (FFT) or a filter.
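As a sketch of the de-trending and smoothing step, the following removes a least-squares linear trend and applies a simple moving-average filter (a stand-in for the FFT-based smoothing mentioned above); it is illustrative only.

```python
# Minimal de-trend + smooth sketch for a breathing (volume) signal.

def detrend(signal):
    """Remove a linear trend fitted by least squares."""
    n = len(signal)
    xs = range(n)
    mean_x = (n - 1) / 2.0
    mean_y = sum(signal) / n
    denom = sum((x - mean_x) ** 2 for x in xs)
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, signal)) / denom
    return [y - (mean_y + slope * (x - mean_x)) for x, y in zip(xs, signal)]

def moving_average(signal, width=3):
    """Smooth with a centered moving average (window shrinks at the edges)."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out
```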
  • Volumetric data can be calibrated so as to convert device-dependent volume data into device-independent data, for example in L, mL, or cm³.
  • A mapping or function that performs such a conversion is deemed a calibration function.
  • These functions can be estimated, for example, by performing regression or fitting of volumetric data measured via the procedure described above to volumetric data obtained with spirometers. It should be appreciated that, in environments where the patient is free to move around while being monitored for respiratory function, it may be necessary to build perspective-dependent calibration functions specific to the device from which the depth maps are being derived. Data capture from different points of view can be performed, and processing from each point of view will lead to perspective-dependent volume signals from which multiple calibration tables can be constructed. Calibration for perspectives intermediate to those tested can be accomplished via interpolation.
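The regression described above can be sketched as a least-squares line fit of device volume readings to paired spirometer readings; the sample numbers are made up for illustration and a real calibration may be nonlinear or perspective-dependent as noted.

```python
# Hedged sketch: estimate a linear calibration function spiro = a * device + b.

def fit_linear_calibration(device_vols, spiro_vols):
    """Least-squares fit of spirometer volumes against device volumes;
    returns the slope a and intercept b of the calibration function."""
    n = len(device_vols)
    mx = sum(device_vols) / n
    my = sum(spiro_vols) / n
    sxx = sum((x - mx) ** 2 for x in device_vols)
    sxy = sum((x - mx) * (y - my) for x, y in zip(device_vols, spiro_vols))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Made-up paired readings: device units vs. spirometer litres.
a, b = fit_linear_calibration([10, 20, 30], [0.5, 1.0, 1.5])
litres = a * 25 + b   # calibrate a new device reading
```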
  • A “segment of a breathing signal” refers to some or all of the subject's breathing signal.
  • A segment can be, for instance, one or more dominant cycles of the subject's breathing signal, or one or more fractions of one dominant cycle of the subject's breathing signal.
  • The dominant cycle may be selected in many ways, for example by extracting any one breathing cycle from the chosen segment, by averaging all the breathing cycles in a signal, or by extracting the cycle with the smallest or largest period, among others.
  • A signal segment may comprise a phase-shifted portion of the subject's breathing signal. Methods for obtaining a segment of a signal are well established in the signal processing arts.
  • A segment of the subject's breathing signal is used herein for comparison purposes such that a breathing pattern for the subject can be identified.
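One way to extract a single-cycle segment, as described above, is to detect end-of-inspiration peaks and take the samples between two adjacent peaks. This is a hedged sketch under that assumption; the peak criterion and function names are illustrative, not the patent's prescribed segmentation.

```python
# Illustrative cycle segmentation via local-maximum (peak) detection.

def find_peaks(signal):
    """Indices of strict local maxima (end-of-inspiration peaks)."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]]

def dominant_cycle(signal):
    """Return the samples between the first pair of adjacent peaks, i.e. one
    full breathing cycle (one of the selection strategies the text lists)."""
    peaks = find_peaks(signal)
    if len(peaks) < 2:
        return signal          # too few cycles: fall back to the whole signal
    return signal[peaks[0]:peaks[1] + 1]
```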
  • Identifying a breathing pattern for the subject comprises visually inspecting the breathing pattern, comparing it against one or more known reference patterns, and selecting the reference pattern that is the closest visual match.
  • A “breathing pattern” refers to a movement of the target region due to the flow of air over a period of inspiration and expiration.
  • The breathing pattern may be any of: Eupnea, Bradypnea, Tachypnea, Hypopnea, Apnea, Kussmaul, Cheyne-Stokes, Biot's, Ataxic, Apneustic, Agonal, or Thoracoabdominal, as these are generally understood by medical doctors, nurses, pulmonologists, and respiratory therapists, among others.
  • The identified breathing pattern for the subject can then be used by trained practitioners to diagnose conditions such as pulmonary fibrosis, pneumothorax, Infant Respiratory Distress Syndrome, asthma, bronchitis, or emphysema.
  • A “remote sensing environment” refers to non-contact, non-invasive sensing, i.e., the sensing device does not physically contact the subject being sensed.
  • The sensing device can be any distance from the subject, from as close as less than an inch to as far as miles in the case of telemedicine, which is enabled by remote communication.
  • The environment may be any setting such as, for example, a hospital, an ambulance, or a medical office.
  • FIG. 4 illustrates one embodiment of an example image-based depth sensing device acquiring video images of the target region of the subject of FIG. 3 being monitored for respiratory function assessment.
  • The image-based depth sensing device used to obtain video images of the subject's target region, from which the time-varying sequence of depth maps is obtained, can be, for example, a red green blue depth (RGBD) camera, an infrared depth camera, a passive stereo camera, an active stereo camera, an array of cameras, or a 2D monocular video camera.
  • Alternatively, a non-image-based depth sensing device used to acquire the depth measurement data from which the time-varying sequence of depth maps is obtained can be, for example, a LADAR device, a LiDAR device, a photo wave device, or a time-of-flight measurement device.
  • Examination room 400 has an example image-based depth sensing device 402 used to obtain video images of a subject 300 shown resting his/her head on a pillow while his/her body is partially covered by a sheet.
  • Subject 300 is being monitored for respiratory function assessment.
  • Subject 300 is wearing shirt 301 with a patterned array of reflective marks, shown individually at 403. It is to be noted that clothing with a patterned array of reflective marks is not needed when patterns are projected by the illumination source system.
  • Video camera 402 is rotatably fixed to support arm 404 such that the camera's field of view 405 can be directed by a technician onto target region 406 .
  • Support arm 404 is mounted on a set of wheels (not shown) so that video acquisition system 402 can be moved from bed to bed and room to room.
  • Video camera 402 comprises imaging sensors arrayed on a detector grid.
  • The sensors of the video camera are at least sensitive to a wavelength of illumination source system 407 being reflected by the reflective marks 403.
  • The illumination source system may emit any wavelength of light that is detectable by sensors in the camera's detector array.
  • The illumination sources may be manipulated as needed and may be invisible to the human visual system.
  • The illumination source system may be arranged such that it projects invisible or visible patterns of light onto the subject.
  • A central processor integral to the video camera 402 and in communication with a memory (not shown) functions to execute machine-readable program instructions which process the video to obtain the time-varying sequence of depth maps.
  • The obtained sequence of depth maps may be wirelessly communicated via transmission element 408 over network 401 to a remote device operated by, for instance, a nurse, doctor, or technician for further processing, as needed, and for respiratory function assessment of patient 300.
  • Alternatively, the captured video images are wirelessly communicated over network 401 via antenna 408 to a remote device, such as a workstation, where the transmitted video signal is processed to obtain the time-varying sequence of depth maps.
  • The depth maps are, in turn, processed to obtain the time-varying breathing signal.
  • Camera system 402 may further include wireless and wired elements and may be connected to a variety of devices via other means such as coaxial cable, radio frequency, Bluetooth, or any other manner for communicating video signals, data, and results.
  • Network 401 is shown as an amorphous cloud wherein data is transferred in the form of signals which may be, for example, electronic, electromagnetic, optical, light, or other signals. These signals may be communicated to a server which transmits and receives data by means of a wire, cable, fiber optic, phone line, cellular link, RF, satellite, or other medium or communications pathway or protocol. Techniques for placing devices in networked communication are well established. As such, further discussion as to specific networking techniques is omitted herein.
  • FIG. 5 illustrates one embodiment of the present method for identifying a breathing pattern of a subject for respiratory function assessment in a remote sensing environment.
  • Flow begins at 500 and immediately proceeds to step 502 .
  • A time-varying sequence of depth maps of the target region is received over a period of inspiration and expiration.
  • The target region may be, for example, the subject's anterior thoracic region, a region of the subject's dorsal body, or a side view containing the subject's thoracic region.
  • The depth sensing device may be an image-based depth sensing device or a non-image-based depth sensing device.
  • Various example target regions are shown in FIG. 1.
  • The depth maps are processed to obtain a breathing signal for the subject comprising a temporal sequence of instantaneous volumes across time intervals during inspiratory and expiratory breathing.
  • The inspiration may be a maximal forced inspiration and the expiration a maximal forced expiration, or the inspiration and expiration may be tidal breathing.
  • The reference breathing signals can be retrieved from, for example, a database of reference signals or from a storage device.
  • A reference breathing signal can also be received or otherwise obtained from a remote device over a wired or wireless network. Associated with each of the reference breathing signals is a breathing pattern.
  • At step 508, at least one segment of the subject's breathing signal is compared against the retrieved reference breathing signal.
  • At step 510, a determination is made whether, as a result of the comparison in step 508, the reference breathing signal matches the signal segment(s) of the subject's breathing signal. If so, processing proceeds with respect to node A of FIG. 6 , which is a continuation of the flow diagram of FIG. 5 , and flow continues with respect to step 512 wherein the breathing pattern associated with the matching reference signal is determined to be the breathing pattern of the subject.
  • The identified breathing pattern is used for respiratory function assessment of the subject.
  • The identified breathing pattern is processed by an artificial intelligence algorithm to determine whether an alert condition exists. If so, then an alert signal is automatically sent using, for example, transmissive element 408 of FIG. 4 .
  • The alert signal may comprise, for example, a blinking light, an alarm, or a message flashing on a monitor display.
  • Such a notification can take the form of a text message sent to a cellphone of a medical practitioner such as a nurse, pulmonologist, doctor or respiratory therapist.
  • The notification alert may be a pre-recorded voice, text, direct phone call, or video message.
  • Such an alert or notification can take any of a variety of forms and would depend on the particular environment wherein the teachings hereof find their intended uses.
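The alert-condition check and notification steps described above might be sketched as follows. This is a minimal rule-based stand-in, assuming a fixed severity table rather than the artificial intelligence algorithm referenced in the text; the severity labels, table contents, and function names are illustrative assumptions.

```python
# Minimal rule-based stand-in for the alert-condition check; the severity
# table and function names are illustrative assumptions, not the
# artificial-intelligence algorithm referenced in the text.

ALERT_PATTERNS = {
    "Agonal": "critical",        # often related to cardiac arrest
    "Cheyne-Stokes": "urgent",
    "Biot's": "urgent",
    "Apneustic": "urgent",
    "Eupnea": None,              # normal breathing: no alert
}

def check_alert(pattern):
    """Return an alert severity for an identified breathing pattern, or None."""
    return ALERT_PATTERNS.get(pattern)

def notify(pattern):
    """Format an alert message for transmission (e.g., text, voice, video)."""
    severity = check_alert(pattern)
    if severity is not None:
        return f"{severity.upper()}: {pattern} pattern detected"
    return None

print(notify("Agonal"))  # CRITICAL: Agonal pattern detected
```

In a deployed system the returned message would be handed to whatever transmissive element delivers the practitioner's preferred notification form.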
  • If, as a result of the comparison performed in step 510 , it is determined that the reference breathing signal does not match the signal segment(s) of the subject's breathing signal, then flow continues with respect to node B wherein, at step 516 , a determination is made whether more reference breathing signals remain to be obtained for comparison purposes. If so, flow repeats with respect to node C of FIG. 5 wherein, at step 506 , a next reference breathing signal is retrieved or otherwise obtained and compared against one or more segments of the subject's breathing signal. Otherwise, in this embodiment, further flow stops.
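The retrieve-compare-match loop of steps 506 through 516 can be sketched as follows. The RMS-distance similarity measure, the match threshold, and the record layout are illustrative assumptions, not the claimed comparison technique.

```python
# Illustrative sketch of the compare-and-match loop (steps 506-516).
# The RMS-distance metric and the match threshold are assumptions.
import math

def rms_distance(a, b):
    """Root-mean-square distance between two equal-length signal segments."""
    n = min(len(a), len(b))
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(n)) / n)

def identify_breathing_pattern(segment, references, threshold=0.2):
    """Compare a subject's signal segment against each reference breathing
    signal; return the breathing pattern of the first match, else None."""
    for ref in references:                                    # step 506
        if rms_distance(segment, ref["signal"]) < threshold:  # steps 508/510
            return ref["pattern"]                             # step 512
    return None                                               # step 516

# Hypothetical reference records with associated breathing patterns:
references = [
    {"pattern": "Eupnea",   "signal": [0.0, 0.5, 1.0, 0.5, 0.0]},
    {"pattern": "Hypopnea", "signal": [0.0, 0.2, 0.4, 0.2, 0.0]},
]
subject_segment = [0.0, 0.21, 0.41, 0.19, 0.0]
print(identify_breathing_pattern(subject_segment, references))  # Hypopnea
```

Returning the first sufficiently close reference mirrors the flow diagram's match-then-stop behavior; an exhaustive best-match search would be an equally plausible variant.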
  • FIG. 7 shows a functional block diagram of an example networked system for implementing various aspects of the present method described with respect to the flow diagrams of FIGS. 5 and 6 .
  • The system 700 of FIG. 7 illustrates a plurality of modules, processors, and components placed in networked communication with a workstation 702 , wherein depth measurement data in the form of a video signal or depth values, transmitted over network 401 via transmissive element 408 by depth sensing device 402 , is received for processing.
  • Workstation 702 includes a hard drive (internal to computer housing 703 ) which reads/writes to computer readable media 704 such as a floppy disk, optical disk, CDROM, DVD, magnetic tape, etc.
  • Case 703 houses a motherboard with a processor and memory, a communications link such as a network card, graphics card, and the like, and other software and hardware to perform the functionality of a computing device as is generally known in the arts.
  • The workstation includes a graphical user interface which, in various embodiments, comprises display 705 such as a CRT, LCD, touch screen, etc., a mouse 706 and keyboard 707 . Information may be entered by a user of the present system using the graphical user interface.
  • Workstation 702 has an operating system and other specialized software configured to display a wide variety of numeric values, text, scroll bars, pull-down menus with user selectable options, and the like, for entering, selecting, or modifying information displayed on display 705 .
  • The embodiment shown is only illustrative. Although shown as a desktop computer, it should be appreciated that computer 702 can be any of a laptop, mainframe, client/server, or a special purpose computer such as an ASIC, circuit board, dedicated processor, or the like. Any of the information obtained from any of the modules of system 700 , including various characteristics of any of the depth sensors, can be saved to storage device 708 .
  • Depth Data Processor 710 processes the acquired data to obtain a time-varying sequence of depth maps of the target region over a period of inspiration and expiration.
  • Depth Map Analyzer 712 receives the time-varying sequence of depth maps from Processor 710 and proceeds to process the received depth maps to produce a time-varying breathing signal for the subject being monitored for respiratory function assessment.
  • Breathing Signal Processor 714 receives the time-varying breathing signal and identifies one or more signal segments in the subject's breathing signal that will be used for comparison purposes and may further store the data to Memory 715 .
  • Signal Segment Display Module 716 receives the segment(s) of the subject's breathing signal and retrieves one or more records, collectively at 717 , containing reference breathing signals and associated breathing patterns, shown by way of example in a first of n records, which may also contain associated medical conditions and recommendations.
  • The retrieved reference breathing signal segment(s) are displayed for the practitioner so that a matching reference breathing signal can be selected.
  • The breathing pattern associated with the selected reference breathing signal is determined to be a match for the subject's breathing pattern.
  • Notification Module 718 implements an artificial intelligence program to determine whether an alert signal needs to be sent to a nurse, doctor or respiratory therapist via antenna element 720 . Such an alert or notification can take any of a variety of forms.
  • Notification Module 718 may further communicate any of the values, data, diagrams, results generated by any of the modules of system 700 to a remote device.
  • Any of the modules and processing units of FIG. 7 are in communication with workstation 702 via pathways (not shown) and may further be in communication with one or more remote devices over network 401 . Further, the workstation and any remote devices may read/write to any of the records 717 which may be stored in a database, memory, or storage device (not shown). Any of the modules may communicate with storage device 708 and memory 715 via pathways shown and not shown and may store/retrieve data, parameter values, functions, records, and machine readable/executable program instructions required to perform their intended functions. Some or all of the functionality for any of the modules of the functional block diagram of FIG. 7 may be performed, in whole or in part, by components internal to workstation 702 or by a special purpose computer system.
  • Modules may designate one or more components which may, in turn, comprise software and/or hardware designed to perform the intended function.
  • A plurality of modules may collectively perform a single function.
  • Each module may have a specialized processor and memory capable of executing machine readable program instructions.
  • A module may comprise a single piece of hardware such as an ASIC, electronic circuit, or special purpose processor.
  • A plurality of modules may be executed by either a single special purpose computer system or a plurality of special purpose systems operating in parallel. Connections between modules include both physical and logical connections.
  • Modules may further include one or more software/hardware components which may further comprise an operating system, drivers, device controllers, and other apparatuses some or all of which may be connected via a network. It is also contemplated that one or more aspects of the present method may be implemented on a dedicated computer system and may also be practiced in distributed computing environments where tasks are performed by remote devices that are linked through a network.
  • FIG. 8 shows an example breathing pattern associated with normal breathing (Eupnea) as observed normally under resting conditions.
  • FIG. 9 shows an example Bradypnea breathing pattern which is characterized by an unusually slow rate of breathing. Bradypnea is typically characterized by a respiration rate of less than 12 breaths per minute (bpm) for patients between 12 and 50 years of age. Rates of breathing differ for older adults as well as younger patients. If an individual has this type of breathing, it can mean that the individual is not receiving a proper amount of oxygen.
  • FIG. 10 shows an example Tachypnea breathing pattern characterized by an unusually fast respiratory rate typically greater than 20 breaths per minute (bpm).
  • Tachypnea can be associated with high fever when the body attempts to rid itself of excess heat. The respiration rate increases by about eight breaths per minute for every degree Celsius above normal. Other causes include pneumonia, compensatory respiratory alkalosis as the body tries to expel excess carbon dioxide, respiratory insufficiency, lesions in the respiratory control center of the brain, and poisoning.
  • Tachypnea of a newborn is an elevation of the respiratory rate which can be due to fetal lung water.
  • FIG. 11 shows an example Hypopnea breathing pattern characterized by an abnormally shallow and slow respiration rate. Hypopnea typically occurs with advanced age. In well-conditioned athletes, it may be appropriate and is often accompanied by a slow pulse. Otherwise, it is apparent when pleuritic pain limits excursion and is characteristic of damage to the brainstem. Hypopnea accompanied by a rapid, weak pulse may indicate a brain injury.
  • FIG. 12 shows an example Hyperpnea breathing pattern characterized by an exaggerated deep, rapid, or labored respiration. It occurs normally with exercise and abnormally with aspirin overdose, pain, fever, hysteria, or a condition in which the supply of oxygen is inadequate. Hyperpnea may indicate cardiac disease and respiratory disease. Also spelled hyperpnoea.
  • FIG. 13 shows an example Thoracoabdominal breathing pattern that involves trunk musculature to “suck” air into the lungs for pulmonary ventilation. This is typical in reptiles and birds. In humans, it can indicate a neuromuscular disorder such as a cervical spinal injury or diaphragmatic paralysis.
  • FIG. 14 shows an example Kussmaul breathing pattern characterized by rapid, deep breathing due to a stimulation of the respiratory center of the brain triggered by a drop in pH. Kussmaul breathing is normal during exercise but is often seen in patients with metabolic acidosis.
  • Apnea (not shown) is a cessation of breathing for an extended period such as 20 seconds or more, typically during sleep. Apnea is divided into three categories: (1) obstructive, resulting from obstruction of the upper airways; (2) central, caused by some pathology in the brain's respiratory control center; and (3) mixed, a combination of the two.
  • FIG. 15 shows an example Cheyne-Stokes respiration which is characterized by a crescendo-decrescendo pattern of breathing followed by a period of central apnea. This is often seen in conditions like stroke, brain tumor, traumatic brain injury, carbon monoxide poisoning, metabolic encephalopathy, altitude sickness, narcotics use and in non-rapid eye movement sleep of patients with congestive heart failure.
  • FIG. 16 shows an example Biot's respiration which is characterized by abrupt and irregularly alternating periods of apnea with periods of breathing that are consistent in rate and depth. Biot's respiration is indicative of an increased intracranial pressure.
  • FIG. 17 shows an example Ataxic breathing pattern which is a completely irregular breathing pattern with continually variable rate and depth of breathing. Ataxic breathing is indicative of lesions in the respiratory centers in the brainstem.
  • FIG. 18 shows an example Apneustic breathing pattern which is characterized by a prolonged inspiratory phase followed by expiration apnea.
  • The rate of Apneustic breathing is usually around 1.5 breaths per minute (bpm).
  • An Apneustic breathing pattern is often associated with head injury.
  • FIG. 19 shows example Agonal breathing, which is an abnormally shallow breathing pattern often related to cardiac arrest.
  • FIG. 20 shows a normal respiration pattern captured using a depth sensing device with the depth maps being processed in accordance with the teachings hereof which matches well with the normal breathing pattern of FIG. 8 .
  • FIG. 21 shows an example Cheyne-Stokes breathing pattern generated using the techniques disclosed herein. Compare this to the Cheyne-Stokes pattern of FIG. 15 .
  • FIGS. 22 , 23 and 24 show, respectively, a Biot's pattern, an Apneustic pattern, and an Agonal pattern generated using the present methods. Compare these to the Biot's pattern of FIG. 16 , the Apneustic pattern of FIG. 18 and the Agonal pattern of FIG. 19 . As can be seen from an examination of the results, an experienced pulmonologist would be able to classify the breathing patterns generated using the teachings disclosed herein, and therefrom identify associated medical reasons for respiratory function assessment.

Abstract

What is disclosed is a system and method for identifying a patient's breathing pattern for respiratory function assessment without contact and with a depth-capable imaging system. In one embodiment, a time-varying sequence of depth maps is received of a target region of a subject of interest over a period of inspiration and expiration. Once received, the depth maps are processed to obtain a breathing signal for the subject. The subject's breathing signal comprises a temporal sequence of instantaneous volumes. One or more segments of the subject's breathing signal are then compared against one or more reference breathing signals, each associated with a known pattern of breathing. As a result of the comparison, a breathing pattern for the subject is identified. The identified breathing pattern is then used to assess the subject's respiratory function. The teachings hereof find their uses in an array of diverse medical applications. Various embodiments are disclosed.

Description

    TECHNICAL FIELD
  • The present invention is directed to systems and methods for identifying a patient's breathing pattern for respiratory function assessment.
  • BACKGROUND
  • Monitoring respiratory events is of clinical importance in the early detection of potentially fatal conditions. Current technologies involve contact sensors the individual must wear which may lead to patient discomfort, dependency, loss of dignity, and further may fail due to a variety of reasons. Elderly patients and neonatal infants are even more likely to suffer adverse effects of such monitoring by contact sensors. Unobtrusive, non-contact methods are increasingly desirable for patient respiratory function assessment.
  • Accordingly, what is needed are systems and methods for identifying a patient's breathing pattern for respiratory function assessment without contact and with a depth-capable imaging system.
  • INCORPORATED REFERENCES
  • The following U.S. patents, U.S. patent applications, and publications are incorporated herein in their entirety by reference.
  • “Processing A Video For Tidal Chest Volume Estimation”, U.S. patent application Ser. No. 13/486,637, by Bernal et al. which discloses a system and method for estimating tidal chest volume by analyzing distortions in reflections of structured illumination patterns captured in a video of a thoracic region of a subject of interest.
  • “Minute Ventilation Estimation Based On Depth Maps”, U.S. patent application Ser. No. 13/486,682, by Bernal et al. which discloses a system and method for estimating minute ventilation based on depth maps.
  • “Minute Ventilation Estimation Based On Chest Volume”, U.S. patent application Ser. No. 13/486,715, by Bernal et al. which discloses a system and method for estimating minute ventilation based on chest volume by analyzing distortions in reflections of structured illumination patterns captured in a video of a thoracic region of a subject of interest.
  • “Processing A Video For Respiration Rate Estimation”, U.S. patent application Ser. No. 13/529,648, by Bernal et al. which discloses a system and method for estimating a respiration rate for a subject of interest captured in a video containing a view of that subject's thoracic region.
  • “Respiratory Function Estimation From A 2D Monocular Video”, U.S. patent application Ser. No. 13/630,838, by Bernal et al. which discloses a system and method for processing a video acquired using an inexpensive 2D monocular video acquisition system to assess respiratory function of a subject of interest.
  • “Monitoring Respiration with a Thermal Imaging System”, U.S. patent application Ser. No. 13/103,406, by Xu et al. which discloses a thermal imaging system and method for capturing a video sequence of a subject of interest, and processing the captured images such that the subject's respiratory function can be monitored.
  • “Enabling Hybrid Video Capture Of A Scene Illuminated With Unstructured And Structured Illumination Sources”, U.S. patent application Ser. No. 13/533,605, by Xu et al. which discloses a system and method for enabling the capture of video of a scene illuminated with unstructured and structured illumination sources.
  • “Contemporaneously Reconstructing Images Captured Of A Scene Illuminated With Unstructured And Structured Illumination Sources”, U.S. patent application Ser. No. 13/533,678, by Xu et al. which discloses a system and method for reconstructing images captured of a scene being illuminated with unstructured and structured illumination sources.
  • “Respiratory Physiology: The Essentials”, John B. West, Lippincott Williams & Wilkins; 9th Ed. (2011), ISBN-13: 978-1609136406.
  • BRIEF SUMMARY
  • What is disclosed is a system and method for identifying a patient's breathing pattern for respiratory function assessment without contact and with a depth-capable imaging system. In one embodiment, a time-varying sequence of depth maps is received of a target region of a subject of interest over a period of inspiration and expiration. The depth maps are processed to obtain a breathing signal for the subject which comprises a temporal sequence of instantaneous volumes across time intervals during inspiratory and expiratory breathing. One or more segments of the breathing signal are then compared against reference breathing signals, each associated with a known pattern of breathing. As a result of the comparison, a breathing pattern for the subject is identified. The identified breathing pattern is used to assess the subject's respiratory function. The teachings hereof find their uses in a wide array of medical applications.
  • Many features and advantages of the above-described system and method will become apparent from the following detailed description and accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing will be made apparent from the following detailed description taken in conjunction with the accompanying drawings:
  • FIG. 1 shows an anterior (front) view and a posterior (back) view of a subject of interest intended to be monitored for respiratory function assessment in accordance with the teachings hereof;
  • FIG. 2 shows the subject of FIG. 1 having a plurality of reflective marks arrayed in a uniform grid on their anterior thoracic region and on their posterior thoracic region;
  • FIG. 3 shows the subject of FIG. 1 wearing a shirt with a uniform pattern of reflective dots arrayed in uniform grid with a one inch dot pitch along a horizontal and a vertical direction;
  • FIG. 4 illustrates one embodiment of an example image-based depth sensing device acquiring video images of the target region of the subject of FIG. 3 being monitored for respiratory function assessment;
  • FIG. 5 is a flow diagram which illustrates one example embodiment of the present method for identifying a breathing pattern of a subject for respiratory function assessment in a remote sensing environment;
  • FIG. 6 is a continuation of the flow diagram of FIG. 5 with flow continuing with respect to nodes A or B;
  • FIG. 7 is a functional block diagram of an example networked system for implementing various aspects of the present method described with respect to the flow diagrams of FIGS. 5 and 6;
  • FIG. 8 shows an example breathing pattern associated with normal breathing (eupnea) as observed normally under resting conditions;
  • FIG. 9 shows an example Bradypnea breathing pattern characterized by an unusually slow rate of breathing;
  • FIG. 10 shows an example Tachypnea breathing pattern characterized as an unusually fast respiratory rate;
  • FIG. 11 shows an example Hypopnea breathing pattern characterized by an abnormally shallow and slow respiration rate;
  • FIG. 12 shows an example Hyperpnea breathing pattern characterized by an exaggerated deep, rapid, or labored respiration;
  • FIG. 13 shows an example Thoracoabdominal breathing pattern that involves trunk musculature to “suck” air into the lungs for pulmonary ventilation;
  • FIG. 14 shows an example Kussmaul breathing pattern characterized by rapid, deep breathing due to a stimulation of the respiratory center of the brain triggered by a drop in pH;
  • FIG. 15 shows an example Cheyne-Stokes respiration pattern which is characterized by a crescendo-decrescendo pattern of breathing followed by a period of central apnea;
  • FIG. 16 shows an example Biot's respiration pattern which is characterized by abrupt and irregularly alternating periods of apnea with periods of breathing that are consistent in rate and depth;
  • FIG. 17 shows an example Ataxic breathing pattern which is a completely irregular breathing pattern with continually variable rate and depth of breathing;
  • FIG. 18 shows an example Apneustic breathing pattern which is characterized by a prolonged inspiratory phase followed by expiration apnea;
  • FIG. 19 shows an example Agonal breathing pattern, which is an abnormally shallow breathing pattern often related to cardiac arrest;
  • FIG. 20 shows a normal respiration pattern measured via the use of a depth sensing device with the depth maps being processed in accordance with the teachings hereof;
  • FIG. 21 shows a test subject's Cheyne-Stokes breathing pattern measured using the techniques disclosed herein;
  • FIG. 22 shows a test subject's Biot's pattern measured using the techniques disclosed herein;
  • FIG. 23 shows a test subject's Apneustic pattern measured using the present methods; and
  • FIG. 24 shows a test subject's Agonal breathing pattern measured using the present methods.
  • DETAILED DESCRIPTION
  • What is disclosed is a system and method for identifying a patient's breathing pattern for respiratory function assessment without contact and with a depth-capable imaging system.
  • NON-LIMITING DEFINITIONS
  • A “subject of interest” refers to a person being monitored for respiratory function assessment. It should be appreciated that the use of the terms “human”, “person”, or “patient” herein is not to be viewed as limiting the scope of the appended claims solely to human subjects.
  • A “target region” refers to an area or region of the subject where respiratory function can be assessed. For example, the target region may be a subject's anterior thoracic region, a region of the subject's dorsal body, and/or a side view containing the subject's thoracic region. It should be appreciated that a target region can be any view of a region of the subject's body which can facilitate respiratory function assessment. FIG. 1 shows an anterior (frontal) view which outlines a target region 102 comprising the subject's anterior thoracic region. Target region 103 is of the subject's posterior thoracic region.
  • “Respiration”, as is normally understood, is a process of inhaling air into the lungs and exhaling air out of the lungs, followed by a post-expiratory pause. Inhalation is an active process caused by a negative pressure having been induced in the chest cavity by the contraction of a relatively large muscle (often called the diaphragm) which changes pressure in the lungs by a forcible expansion of the lung's region where gas exchange takes place (i.e., alveolar cells). Exhalation is a passive process where air is expelled from the lungs by the natural elastic recoil of the stretched alveolar cells. The lining of alveolar cells has a surface-active phospholipoprotein complex which causes the lining of the lungs to naturally contract back to a neutral state once the external force causing the cell to stretch is released. A post-expiratory pause occurs when there is an equalization of pressure between the lungs and the atmosphere.
  • “Inspiration” occurs when the subject forces the expansion of the thoracic cavity to bring air into their lungs. A maximally forced inspiratory breath is when the subject cannot bring any more air into their lungs.
  • “Expiration” is when the subject forces the contraction of the thoracic cavity to expel air out of their lungs. A maximally forced expiratory breath is when the subject cannot expel any more air from their lungs.
  • “Depth map sequence” is a reconstructed temporal sequence of 3D surface maps of a target region of a subject. There is a plurality of techniques known in the art for obtaining a depth map of a target region. For example, a depth map may be constructed based on the amount of deformation in a known pattern comprising, for instance, structured patterns of light projected onto the target region, or textural characteristics present on the target region itself such as skin blemishes, scars, markings, and the like, which are detectable by a video camera's detector array. FIG. 2 shows a subject of interest 201 having a plurality of reflective marks arrayed in a uniform pattern 202 on an anterior thoracic region. Subject 203 is shown having a plurality of emissive marks such as LEDs arrayed in a uniform pattern 204 on their posterior thoracic region. The pattern may alternatively be an array of reflective or emissive marks imprinted or otherwise fixed to an item of clothing worn by the subject which emit or reflect a wavelength range detectable by sensors in a video camera's detector array. Reflective marks may be dots of reflective tape, reflective buttons, reflective fabric, or the like. Emissive marks may be LED illuminators sewn or fixed to the shirt. In FIG. 3, subject 300 is shown wearing shirt 301 with a uniform pattern of reflective dots arrayed in a uniform grid with a one inch dot pitch along a horizontal and a vertical direction. It should be appreciated that the pattern may be a uniform grid, a non-uniform grid, a textured pattern, or a pseudo-random pattern so long as the pattern's spatial characteristics are known a priori. Higher-resolution patterns are preferable for reconstruction of higher-resolution depth maps.
Depth maps may be obtained from video images captured using an image-based depth sensing device comprising any of: a red green blue depth (RGBD) camera, an infrared depth camera, a passive stereo camera, an array of cameras, an active stereo camera, and a 2D monocular video camera. Depth maps may also be obtained from data acquired by non-image-based depth sensing devices such as a LADAR device, a LiDAR device, a photo wave device, or a time-of-flight measurement device used as a depth measuring system. Depth maps can be obtained from any of a wide variety of depth-capable sensing devices or 3D reconstruction techniques.
  • “Receiving depth maps” is intended to be widely construed and includes to download, upload, estimate, measure, obtain, or otherwise retrieve from a memory, hard drive, CDROM, or DVD. The depth maps are measured with a depth-capable sensing device. It should be appreciated that depth maps can be obtained using a camera to capture images of the subject while illuminated by a projected pattern of structured light, the camera being sensitive to a wavelength range of the structured light. The depth maps are then generated based upon a comparison of spatial characteristics of reflections introduced by a movement in the subject's chest cage to known spatial characteristics of the projected patterns in conjunction with the known distance between the light projector and the camera, and using the characterized distortions at different locations to calculate the depth map for each image in the video. Such a method is taught in the above-incorporated reference by Bernal et al. Depth maps can be generated using distortions in patterned clothing worn by the subject as taught in the above-incorporated reference by Bernal et al. The embodiments herein are discussed with respect to the patterned clothing embodiment.
  • A “reference breathing signal” refers to a volume signal that is associated with a known pattern of breathing. By a comparison of one or more segments of the subject's breathing signal against reference breathing signals which are associated with known breathing patterns, a pattern can be identified for the subject's breathing. The reference breathing signal can be retrieved from, for example, a memory, a storage device such as a hard drive or removable media, or received from a remote device over a wired or wireless network. The reference breathing signal may be volume signals generated using the depth capable sensor in a simulated environment by a respiratory expert. It can also be generated using the depth capable sensor on patients with identified respiratory diseases.
  • A “subject's breathing signal” refers to a temporal sequence of instantaneous volumes across time intervals during a period of inspiratory and expiratory breathing. Instantaneous volumes are obtained by processing the depth maps. In one embodiment, the depth map comprises a 3D hull defined by a set of 3D coordinates, namely their horizontal, vertical, and depth coordinates (x, y, and z, respectively). Points in the hull can be used to form a triangular tessellation of the target area. By definition of a tessellation, the triangles fill the whole surface and do not overlap. The coordinates of an anchor point at a given depth are computed. The anchor point can be located on a reference surface, for example, the surface on which the subject lies. The anchor point in conjunction with the depth map defines a 3D hull which has a volume. Alternatively, the coordinates of points on an anchor surface corresponding to the set of depths of a reference surface can be computed. The anchor surface in conjunction with the depth map also defines a 3D hull which has a volume. A volume can be computed for each 3D hull obtained from each depth map. A concatenation of all sequential volumes forms a temporal sequence of instantaneous volumes across time intervals during inspiration and expiration. The signal can be de-trended to remove low-frequency variations and smoothed using a Fast Fourier Transform (FFT) or a filter. Additionally, the volumetric data can be calibrated so as to convert device-dependent volume data into device-independent data, for example, in L, mL, or cm3. A mapping or function that performs such a conversion is deemed a calibration function. These functions can be estimated, for example, by performing regression or fitting of volumetric data measured via the procedure described above to volumetric data obtained with spirometers.
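As a minimal sketch of the hull-volume computation described above, a regular grid of depths measured relative to a flat anchor surface can be tessellated into triangles, with the enclosed volume taken as the sum of the prism volumes under each triangle. The regular grid, flat anchor plane at depth zero, and unit pixel pitch are simplifying assumptions, and the function name is illustrative.

```python
# Sketch: instantaneous volume from one depth map via triangular tessellation.
# A regular grid of depths measured relative to a flat anchor surface at
# depth 0, with a known pixel pitch, is a simplifying assumption.

def hull_volume(depth_map, pitch=1.0):
    """Approximate the volume enclosed between the depth surface and the
    anchor plane. Each grid cell is split into two non-overlapping triangles;
    the prism under a triangle has volume = mean vertex height * area."""
    rows, cols = len(depth_map), len(depth_map[0])
    tri_area = 0.5 * pitch * pitch
    volume = 0.0
    for r in range(rows - 1):
        for c in range(cols - 1):
            z00, z01 = depth_map[r][c], depth_map[r][c + 1]
            z10, z11 = depth_map[r + 1][c], depth_map[r + 1][c + 1]
            volume += tri_area * (z00 + z01 + z10) / 3.0   # upper triangle
            volume += tri_area * (z01 + z10 + z11) / 3.0   # lower triangle
    return volume

# A flat 2x2 patch raised 3 units above the anchor plane over one unit cell:
print(hull_volume([[3.0, 3.0], [3.0, 3.0]]))  # 3.0
```

Computing such a volume for each depth map in the sequence and concatenating the results yields the temporal sequence of instantaneous volumes that constitutes the subject's breathing signal.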
It should be appreciated that, in environments where the patient is free to move around while being monitored for respiratory function, it may be necessary to build perspective-dependent calibration functions specific to the device from which the depth maps are being derived. Data can be captured from different points of view; processing from each point of view yields perspective-dependent volume signals from which multiple calibration tables can be constructed. Calibration for perspectives intermediate to those tested can be accomplished via interpolation.
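The calibration fitting and perspective interpolation described above might be sketched like this. The linear calibration form, the angle-keyed table, and all names are illustrative assumptions; the text only requires some regression or fit against spirometer data.

```python
import bisect

def fit_calibration(device_vols, spiro_vols):
    """Least-squares linear fit v_ml = a * v_device + b, mapping
    device-dependent volume units to spirometer-measured volumes.
    A higher-order fit may be needed in practice."""
    n = len(device_vols)
    sx, sy = sum(device_vols), sum(spiro_vols)
    sxx = sum(x * x for x in device_vols)
    sxy = sum(x * y for x, y in zip(device_vols, spiro_vols))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def calibration_at(angle, calibrations):
    """Interpolate (a, b) for a viewing perspective intermediate to
    the calibrated ones. `calibrations` maps angle (degrees) to the
    (a, b) pair fitted from that point of view."""
    angles = sorted(calibrations)
    if angle <= angles[0]:
        return calibrations[angles[0]]
    if angle >= angles[-1]:
        return calibrations[angles[-1]]
    i = bisect.bisect_left(angles, angle)
    lo, hi = angles[i - 1], angles[i]
    t = (angle - lo) / (hi - lo)
    (a0, b0), (a1, b1) = calibrations[lo], calibrations[hi]
    return (a0 + t * (a1 - a0), b0 + t * (b1 - b0))
```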
  • A “segment of a breathing signal” refers to some or all of the subject's breathing signal. A segment can be, for instance, one or more dominant cycles of the subject's breathing signal or a fraction or multiple fractions of one dominant cycle of the subject's breathing signal. The dominant cycle may be selected in many ways; for example by extracting any one breathing cycle from the chosen segment, by averaging all the breathing cycles in a signal, by extracting the cycle with the smallest or largest period, among others. A signal segment may comprise a phase-shifted portion of the subject's breathing signal. Methods for obtaining a segment of a signal are well established in the signal processing arts. A segment of the subject's breathing signal is used herein for comparison purposes such that a breathing pattern for the subject can be identified.
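One concrete way to obtain such a segment — choosing, from the options listed above, zero-crossing cycle splitting and largest-period selection — could look like the sketch below. The function names are hypothetical and the method is only one of the well-established signal processing approaches the text alludes to.

```python
def split_cycles(signal):
    """Split a de-trended breathing signal into cycles at rising
    zero-crossings of its mean-subtracted values."""
    mean = sum(signal) / len(signal)
    s = [v - mean for v in signal]
    starts = [i for i in range(1, len(s)) if s[i - 1] < 0 <= s[i]]
    return [signal[a:b] for a, b in zip(starts, starts[1:])]

def dominant_cycle(signal):
    """Pick a dominant cycle; here, the cycle with the largest period
    (one of the selection rules named in the text). Falls back to the
    whole signal when no complete cycle is found."""
    cycles = split_cycles(signal)
    return max(cycles, key=len) if cycles else signal
```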
  • “Identifying a breathing pattern” for the subject comprises visually inspecting the subject's breathing pattern, comparing that pattern to one or more known reference patterns, and selecting the reference pattern that is the closest visual match.
  • A “breathing pattern” refers to a movement of the target region due to the flow of air over a period of inspiration and expiration. The breathing pattern may be any of: Eupnea, Bradypnea, Tachypnea, Hypopnea, Apnea, Kussmaul, Cheyne-Stokes, Biot's, Ataxic, Apneustic, Agonal, or Thoracoabdominal, as are generally understood by medical doctors, nurses, pulmonologists, respiratory therapists, among others. The identified breathing pattern for the subject can then be used by trained practitioners to determine any of: pulmonary fibrosis, pneumothorax, Infant Respiratory Distress Syndrome, asthma, bronchitis, or emphysema.
  • A “remote sensing environment” refers to non-contact, non-invasive sensing, i.e., the sensing device does not physically contact the subject being sensed. The sensing device can be any distance away from the subject, for example, as close as less than an inch to as far as miles in the case of telemedicine, which is enabled by remote communication. The environment may be any setting such as, for example, a hospital, ambulance, medical office, and the like.
  • Example Image-Based System
  • Reference is now being made to FIG. 4, which illustrates one embodiment of an example image-based depth sensing device acquiring video images of the target region of the subject of FIG. 3 being monitored for respiratory function assessment. In this embodiment, the image-based depth sensing device used to obtain video images of the subject's target region, from which the time-varying sequence of depth maps is obtained, can be, for example, a red green blue depth (RGBD) camera, an infrared depth camera, a passive stereo camera, an active stereo camera, an array of cameras, or a 2D monocular video camera. In another embodiment, a non-image-based depth sensing device used to acquire depth measurement data, from which the time-varying sequence of depth maps is obtained, can be, for example, a LADAR device, a LiDAR device, a photo wave device, or a time-of-flight measurement device.
  • Examination room 400 has an example image-based depth sensing device 402 to obtain video images of a subject 301 shown resting his/her head on a pillow while his/her body is partially covered by a sheet. Subject 301 is being monitored for respiratory function assessment. Patient 301 is wearing a shirt with a patterned array of reflective marks, shown individually at 403. It is to be noted that clothing with a patterned array of reflective marks is not needed when patterns are projected by the illumination source system. Video camera 402 is rotatably fixed to support arm 404 such that the camera's field of view 405 can be directed by a technician onto target region 406. Support arm 404 is mounted on a set of wheels (not shown) so that video acquisition system 402 can be moved from bed to bed and room to room. Although patient 300 is shown in a prone position lying in a bed, it should be appreciated that video of the target region 406 can be captured while the subject is positioned in other supporting devices such as, for example, a chair or in a standing position. Video camera 402 comprises imaging sensors arrayed on a detector grid. The sensors of the video camera are at least sensitive to a wavelength of illumination source system 407 being reflected by the reflective marks 403. The illumination source system may emit light at any wavelength that is detectable by sensors on the camera's detector array. The illumination sources may be manipulated as needed and may be invisible to the human visual system. The illumination source system may be arranged such that it may project visible or invisible patterns of light on the subject.
  • A central processor integral to the video camera 402 and in communication with a memory (not shown) functions to execute machine readable program instructions which process the video to obtain the time-varying sequence of depth maps. The obtained sequence of depth maps may be wirelessly communicated via transmission element 408 over network 401 to a remote device operated by, for instance, a nurse, doctor, or technician for further processing, as needed, and for respiratory function assessment of patient 300. Alternatively, the captured video images are wirelessly communicated over network 401 via antenna 408 to a remote device such as a workstation where the transmitted video signal is processed to obtain the time-varying sequence of depth maps. The depth maps are, in turn, processed to obtain the time-varying breathing signal. Camera system 402 may further include wireless and wired elements and may be connected to a variety of devices via other means such as coaxial cable, radio frequency, Bluetooth, or any other manner for communicating video signals, data, and results. Network 401 is shown as an amorphous cloud wherein data is transferred in the form of signals which may be, for example, electronic, electromagnetic, optical, light, or other signals. These signals may be communicated to a server which transmits and receives data by means of a wire, cable, fiber optic, phone line, cellular link, RF, satellite, or other medium or communications pathway or protocol. Techniques for placing devices in networked communication are well established. As such, further discussion as to specific networking techniques is omitted herein.
  • Flow Diagram of One Embodiment
  • Reference is now being made to the flow diagram of FIG. 5 which illustrates one embodiment of the present method for identifying a breathing pattern of a subject for respiratory function assessment in a remote sensing environment. Flow begins at 500 and immediately proceeds to step 502.
  • At step 502, receive a time-varying sequence of depth maps of a target region of a subject of interest being monitored for breathing pattern identification. The depth maps are of the target region over a period of inspiration and expiration. The target region may be, for example, the subject's anterior thoracic region, a region of the subject's dorsal body, or a side view containing the subject's thoracic region. The depth sensing device may be an image-based depth sensing device or a non-image-based depth sensing device. Various example target regions are shown in FIG. 1.
  • At step 504, process the depth maps to obtain a breathing signal for the subject comprising a temporal sequence of instantaneous volumes across time intervals during inspiratory and expiratory breathing. The inspiration may be a maximal forced inspiration and the expiration a maximal forced expiration, or the inspiration and expiration may be tidal breathing.
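The de-trending of the breathing signal mentioned in the definitions above can be approximated with a simple moving-average filter. This is an illustrative alternative to the FFT-based approach the text names; the window size and function names are assumptions.

```python
def moving_average(signal, window):
    """Centered moving average; the window is truncated at the ends."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def detrend(signal, window=31):
    """Remove low-frequency drift by subtracting a long moving
    average; the window should span several breathing cycles."""
    trend = moving_average(signal, window)
    return [v - t for v, t in zip(signal, trend)]
```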
  • At step 506, retrieve a first reference breathing signal. The reference breathing signals can be retrieved from, for example, a database of reference signals or from a storage device. The reference breathing signal can be received or otherwise obtained from a remote device over a wired or wireless network. Associated with each of the reference breathing signals is a breathing pattern.
  • At step 508, compare at least one segment of the subject's breathing signal against the retrieved reference breathing signal.
  • At step 510, a determination is made whether, as a result of the comparison in step 508, the reference signal matches the signal segment(s) of the subject's breathing signal. If so, processing proceeds with respect to node A of FIG. 6, which is a continuation of the flow diagram of FIG. 5, wherein flow continues with respect to step 512 and the breathing pattern associated with the matching reference signal is determined to be the breathing pattern of the subject.
  • At step 514, the identified breathing pattern is used for respiratory function assessment of the subject. In this embodiment, further flow stops. In another embodiment, the identified breathing pattern is processed by an artificial intelligence algorithm to determine whether an alert condition exists. If so, then an alert signal is automatically sent using, for example, transmissive element 408 of FIG. 4. The alert signal may comprise, for example, a blinking light, an alarm, or a message flashing on a monitor display. Such a notification can take the form of a text message sent to a cellphone of a medical practitioner such as a nurse, pulmonologist, doctor, or respiratory therapist. The notification alert may be a pre-recorded voice, text, direct phone call, or video message. Such an alert or notification can take any of a variety of forms and would depend on the particular environment wherein the teachings hereof find their intended uses.
  • If, as a result of the comparison performed in step 508, it is determined that the reference breathing signal does not match the signal segment(s) of the subject's breathing signal, then flow continues with respect to node B wherein, at step 516, a determination is made whether more reference breathing signals remain to be obtained for comparison purposes. If so, flow repeats with respect to node C of FIG. 5 wherein, at step 506, a next reference breathing signal is retrieved or is otherwise received or obtained, and this next reference breathing signal is then compared to one or more segments of the subject's breathing signal. Otherwise, in this embodiment, further flow stops.
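Steps 506 through 516 can be sketched as a loop over a reference library. The normalized-correlation comparison and the 0.9 threshold here are illustrative assumptions, not the claimed method (which also contemplates visual matching by a practitioner), and the sketch assumes the segment and references have been resampled to a common length.

```python
import math

def norm_corr(a, b):
    """Zero-mean normalized correlation of two equal-length signals;
    returns 0.0 when either signal is constant."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def match_pattern(segment, references, threshold=0.9):
    """Retrieve each reference signal in turn, compare it against the
    subject's segment, and return the associated breathing pattern on
    the first match; None when the library is exhausted."""
    for pattern, ref in references.items():
        if norm_corr(segment, ref) >= threshold:
            return pattern
    return None
```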
  • It should be understood that the flow diagrams depicted herein are illustrative. One or more of the operations illustrated in the flow diagrams may be performed in a differing order. Other operations may be added, modified, enhanced, or consolidated. Variations thereof are intended to fall within the scope of the appended claims. All or portions of the flow diagrams may be implemented partially or fully in hardware in conjunction with machine executable instructions.
  • Example Networked System
  • Reference is now being made to FIG. 7, which shows a functional block diagram of an example networked system for implementing various aspects of the present method described with respect to the flow diagrams of FIGS. 5 and 6. The system 700 of FIG. 7 illustrates a plurality of modules, processors, and components placed in networked communication with a workstation 702, wherein depth measurement data in the form of a video signal or depth values, transmitted over network 401 via transmissive element 408 by depth sensing device 402, is received for processing.
  • Workstation 702 includes a hard drive (internal to computer housing 703) which reads/writes to a computer readable medium 704, such as a floppy disk, optical disk, CDROM, DVD, magnetic tape, etc. Case 703 houses a motherboard with a processor and memory, a communications link such as a network card, graphics card, and the like, and other software and hardware to perform the functionality of a computing device as is generally known in the arts. The workstation includes a graphical user interface which, in various embodiments, comprises display 705 such as a CRT, LCD, touch screen, etc., a mouse 706, and keyboard 707. Information may be entered by a user of the present system using the graphical user interface. It should be appreciated that workstation 702 has an operating system and other specialized software configured to display a wide variety of numeric values, text, scroll bars, pull-down menus with user selectable options, and the like, for entering, selecting, or modifying information displayed on display 705. The embodiment shown is only illustrative. Although shown as a desktop computer, it should be appreciated that computer 702 can be any of a laptop, mainframe, client/server, or a special purpose computer such as an ASIC, circuit board, dedicated processor, or the like. Any of the information obtained from any of the modules of system 700, including various characteristics of any of the depth sensors, can be saved to storage device 708.
  • In the system 700, Depth Data Processor 710 processes the acquired data to obtain a time-varying sequence of depth maps of the target region over a period of inspiration and expiration. Depth Map Analyzer 712 receives the time-varying sequence of depth maps from Processor 710 and proceeds to process the received depth maps to produce a time-varying breathing signal for the subject being monitored for respiratory function assessment. Breathing Signal Processor 714 receives the time-varying breathing signal and identifies one or more signal segments in the subject's breathing signal that will be used for comparison purposes and may further store the data to Memory 715. Signal Segment Display Module 716 receives the segment(s) of the subject's breathing signal and retrieves one or more records, collectively at 717, containing reference breathing signals and associated breathing patterns, which are shown by way of example in a first of n-records which may also contain associated medical conditions and recommendations. The retrieved reference breathing signal segment(s) are displayed for the practitioner so that a matching reference breathing signal can be selected. The breathing pattern associated with the selected reference breathing signal is determined to be a match for the subject's breathing pattern. In this embodiment, Notification Module 718 implements an artificial intelligence program to determine whether an alert signal needs to be sent to a nurse, doctor, or respiratory therapist via antenna element 720. Such an alert or notification can take any of a variety of forms. Notification Module 718 may further communicate any of the values, data, diagrams, or results generated by any of the modules of system 700 to a remote device.
  • It should be understood that any of the modules and processing units of FIG. 7 are in communication with workstation 702 via pathways (not shown) and may further be in communication with one or more remote devices over network 401. Further, the workstation and any remote devices may further read/write to any of the records 716 which may be stored in a database, memory, or storage device (not shown). Any of the modules may communicate with storage devices 708 and memory 715 via pathways shown and not shown and may store/retrieve data, parameter values, functions, records, and machine readable/executable program instructions required to perform their intended functions. Some or all of the functionality for any of the modules of the functional block diagram of FIG. 7 may be performed, in whole or in part, by components internal to workstation 702 or by a special purpose computer system.
  • Various modules may designate one or more components which may, in turn, comprise software and/or hardware designed to perform the intended function. A plurality of modules may collectively perform a single function. Each module may have a specialized processor and memory capable of executing machine readable program instructions. A module may comprise a single piece of hardware such as an ASIC, electronic circuit, or special purpose processor. A plurality of modules may be executed by either a single special purpose computer system or a plurality of special purpose systems operating in parallel. Connections between modules include both physical and logical connections. Modules may further include one or more software/hardware components which may further comprise an operating system, drivers, device controllers, and other apparatuses some or all of which may be connected via a network. It is also contemplated that one or more aspects of the present method may be implemented on a dedicated computer system and may also be practiced in distributed computing environments where tasks are performed by remote devices that are linked through a network.
  • Example Breathing Patterns
  • FIG. 8 shows an example breathing pattern associated with normal breathing (Eupnea) as observed normally under resting conditions.
  • FIG. 9 shows an example Bradypnea breathing pattern, which is characterized by an unusually slow rate of breathing. Bradypnea is typically characterized by a respiration rate of less than 12 breaths per minute (bpm) for patients between 12 and 50 years of age. Rates of breathing differ for older adults as well as younger patients. If an individual has this type of breathing, it can mean that the individual is not receiving a proper amount of oxygen.
  • FIG. 10 shows an example Tachypnea breathing pattern characterized by an unusually fast respiratory rate, typically greater than 20 breaths per minute (bpm). Tachypnea can be associated with high fever when the body attempts to rid itself of excess heat. The rate of respiration increases by about eight breaths per minute for every degree Celsius above normal. Other causes include pneumonia, compensatory respiratory alkalosis as the body tries to expel excess carbon dioxide, respiratory insufficiency, lesions in the respiratory control center of the brain, and poisoning. Tachypnea of a newborn is an elevation of the respiratory rate which can be due to fetal lung water.
  • FIG. 11 shows an example Hypopnea breathing pattern characterized by an abnormally shallow and slow respiration rate. Hypopnea typically occurs with advanced age. In well-conditioned athletes, it may be appropriate and is often accompanied by a slow pulse. Otherwise, it is apparent when pleuritic pain limits excursion and is characteristic of damage to the brainstem. Hypopnea accompanied by a rapid, weak pulse may indicate a brain injury.
  • FIG. 12 shows an example Hyperpnea breathing pattern characterized by exaggerated deep, rapid, or labored respiration. It occurs normally with exercise and abnormally with aspirin overdose, pain, fever, hysteria, or a condition in which the supply of oxygen is inadequate. Hyperpnea may indicate cardiac or respiratory disease. Also spelled hyperpnoea.
  • FIG. 13 shows an example Thoracoabdominal breathing pattern, which involves trunk musculature to “suck” air into the lungs for pulmonary ventilation. This is typical in reptiles and birds. In humans, it can indicate a neuromuscular disorder such as a cervical spinal injury or diaphragmatic paralysis.
  • FIG. 14 shows an example Kussmaul breathing pattern characterized by rapid, deep breathing due to a stimulation of the respiratory center of the brain triggered by a drop in pH. Kussmaul breathing is normal during exercise but is often seen in patients with metabolic acidosis.
  • Apnea (not shown) is a cessation of breathing for an extended period, such as 20 seconds or more, typically during sleep. Apnea is divided into three categories: (1) obstructive, resulting from obstruction of the upper airways; (2) central, caused by some pathology in the brain's respiratory control center; and (3) mixed, a combination of the two.
  • FIG. 15 shows an example Cheyne-Stokes respiration which is characterized by a crescendo-decrescendo pattern of breathing followed by a period of central apnea. This is often seen in conditions like stroke, brain tumor, traumatic brain injury, carbon monoxide poisoning, metabolic encephalopathy, altitude sickness, narcotics use and in non-rapid eye movement sleep of patients with congestive heart failure.
  • FIG. 16 shows an example Biot's respiration which is characterized by abrupt and irregularly alternating periods of apnea with periods of breathing that are consistent in rate and depth. Biot's respiration is indicative of an increased intracranial pressure.
  • FIG. 17 shows an example Ataxic breathing pattern, which is completely irregular with continually variable rate and depth of breathing. Ataxic breathing is indicative of lesions in the respiratory centers in the brainstem.
  • FIG. 18 shows an example Apneustic breathing pattern which is characterized by a prolonged inspiratory phase followed by expiration apnea. The rate of Apneustic breathing is usually around 1.5 breaths per minute (bpm). An Apneustic breathing pattern is often associated with head injury.
  • FIG. 19 shows example Agonal breathing, which is an abnormally shallow breathing pattern often related to cardiac arrest.
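Several of the patterns above (Eupnea, Bradypnea, Tachypnea, Apnea) differ primarily in respiratory rate. A coarse, rate-only triage using the thresholds quoted in the text for 12- to 50-year-old patients might look like this; it is a sketch only, since actual pattern identification also depends on the depth and regularity of breathing, which rate alone cannot capture.

```python
def classify_by_rate(breaths_per_minute):
    """Rate-only triage using the thresholds quoted in the text:
    < 12 bpm suggests Bradypnea, > 20 bpm suggests Tachypnea,
    and a sustained zero rate suggests Apnea."""
    if breaths_per_minute == 0:
        return "Apnea"
    if breaths_per_minute < 12:
        return "Bradypnea"
    if breaths_per_minute > 20:
        return "Tachypnea"
    return "Eupnea"
```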
  • Performance Results
  • A person with training in respiratory diseases emulated various breathing patterns for our tests using an active-stereo-based system to acquire a time-series signal used to generate depth maps. Depth data was captured at 30 fps. The signals were processed in accordance with the teachings hereof and the resulting breathing patterns plotted for comparison purposes. FIG. 20 shows a normal respiration pattern captured using a depth sensing device, with the depth maps processed in accordance with the teachings hereof, which matches well with the normal breathing pattern of FIG. 8. FIG. 21 shows an example Cheyne-Stokes breathing pattern generated using the techniques disclosed herein. Compare this to the Cheyne-Stokes pattern of FIG. 15. FIGS. 22, 23 and 24 show, respectively, a Biot's pattern, an Apneustic pattern, and an Agonal pattern generated using the present methods. Compare these to the Biot's pattern of FIG. 16, the Apneustic pattern of FIG. 18, and the Agonal pattern of FIG. 19. As can be seen by an examination of the results, an experienced pulmonologist would be able to classify the breathing patterns generated using the teachings disclosed herein, and therefrom identify associated medical reasons for respiratory function assessment.
  • Various Embodiments
  • The teachings hereof can be implemented in hardware or software using any known or later developed systems, structures, devices, and/or software by those skilled in the applicable art without undue experimentation from the functional description provided herein with a general knowledge of the relevant arts. One or more aspects of the methods described herein are intended to be incorporated in an article of manufacture, including one or more computer program products, having computer usable or machine readable media. The article of manufacture may be included on at least one storage device readable by a machine architecture embodying executable program instructions capable of performing the methodology and functionality described herein. Additionally, the article of manufacture may be included as part of a complete system or provided separately, either alone or as various components. It will be appreciated that various features and functions, or alternatives thereof, may be desirably combined into other different systems or applications. Presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may become apparent and/or subsequently made by those skilled in the art, which are also intended to be encompassed within the scope of the following claims.
  • Accordingly, the embodiments set forth above are considered to be illustrative and not limiting. Various changes to the above-described embodiments may be made without departing from the spirit and scope of the invention. The teachings of any printed publications, including patents and patent applications, are each separately hereby incorporated by reference in their entirety.

Claims (25)

What is claimed is:
1. A method for identifying a breathing pattern of a subject for respiratory function assessment in a remote sensing environment, the method comprising:
receiving a time-varying sequence of depth maps of a target region of a subject of interest being monitored for respiratory function assessment, said depth maps being of said target region over a period of inspiration and expiration;
processing said depth maps to obtain a breathing signal for said subject comprising a temporal sequence of instantaneous volumes across time intervals during inspiratory and expiratory breathing;
comparing at least one segment of said subject's breathing signal against a reference breathing signal associated with a known pattern of breathing; and
identifying, as a result of said comparison, a breathing pattern for said subject.
2. The method of claim 1, wherein said target region comprises one of: said subject's anterior thoracic region, a region of said subject's dorsal body, and a side view containing said subject's thoracic region.
3. The method of claim 1, wherein said depth maps are obtained from images captured using an image-based depth sensing device comprising any of: a red green blue depth (RGBD) camera, an infrared depth camera, a passive stereo camera, an array of cameras, an active stereo camera, and a 2D monocular video camera.
4. The method of claim 1, wherein said depth maps are obtained from data captured using a non-image-based depth sensing device comprising any of: a LADAR device, a LiDAR device, a photo wave device, and a time-of-flight measurement device.
5. The method of claim 1, wherein said depth maps are obtained from processing video images captured of said target region with patterned clothing using a video camera device comprising any of: a red green blue (RGB) camera, an infrared camera, a multispectral camera, and a hyperspectral camera.
6. The method of claim 1, wherein, in advance of said comparison, filtering said subject's breathing signal to remove unwanted noise.
7. The method of claim 1, wherein said segment comprises at least one of: a dominant cycle of said subject's breathing signal, multiple dominant cycles of said subject's breathing signal, a fraction of one dominant cycle of said subject's breathing signal, multiple fractions of a plurality of dominant cycles, and a phase-shifted portion of said subject's breathing signal.
8. The method of claim 1, wherein said identified breathing pattern is one of: Eupnea, Bradypnea, Tachypnea, Hypopnea, Apnea, Kussmaul, Cheyne-Stokes, Biot's, Ataxic, Apneustic, Agonal, and Thoracoabdominal.
9. The method of claim 1, further comprising using said identified breathing pattern to determine whether said subject has any of: pulmonary fibrosis, pneumothorax, Infant Respiratory Distress Syndrome, asthma, bronchitis, and emphysema.
10. The method of claim 1, wherein said instantaneous volumes comprise one of: a calibrated volume and an uncalibrated volume.
11. The method of claim 1, wherein said inspiration is a maximal forced inspiration and said expiration is a maximal forced expiration.
12. The method of claim 1, wherein said inspiration and expiration comprises forced inspiration and forced expiration.
13. The method of claim 1, wherein said reference breathing signal consists of a volume signal generated using a depth capable sensor in one of: a simulated environment by a respiratory expert or a computerized mannequin, and a clinical environment with patients with identified respiratory diseases.
14. A system for identifying a breathing pattern of a subject for respiratory function assessment in a remote sensing environment, the system comprising:
a memory and a storage device; and
a processor in communication with said memory and said storage device, said processor executing machine readable program instructions for performing:
receiving a time-varying sequence of depth maps of a target region of a subject of interest being monitored for respiratory function assessment, said depth maps being of said target region over a period of inspiration and expiration;
processing said depth maps to obtain a breathing signal for said subject comprising a temporal sequence of instantaneous volumes across time intervals during inspiratory and expiratory breathing;
comparing at least one segment of said subject's breathing signal against a reference breathing signal associated with a known pattern of breathing; and
identifying, as a result of said comparison, a breathing pattern for said subject.
15. The system of claim 14, wherein said target region comprises one of: said subject's anterior thoracic region, a region of said subject's dorsal body, and a side view containing said subject's thoracic region.
16. The system of claim 14, wherein said depth maps are obtained from images captured using an image-based depth sensing device comprising any of: a red green blue depth (RGBD) camera, an infrared depth camera, a passive stereo camera, an array of cameras, an active stereo camera, and a 2D monocular video camera.
17. The system of claim 14, wherein said depth maps are obtained from data captured using a non-image-based depth sensing device comprising any of: a LADAR device, a LiDAR device, a photo wave device, and a time-of-flight measurement device.
18. The system of claim 14, wherein said depth maps are obtained from processing video images captured of said target region with patterned clothing using a video camera device comprising any of: a red green blue (RGB) camera, an infrared camera, a multispectral camera, and a hyperspectral camera.
19. The system of claim 14, wherein, in advance of said comparison, filtering said subject's breathing signal to remove unwanted noise.
20. The system of claim 14, wherein said segment comprises at least one of:
a dominant cycle of said subject's breathing signal, multiple dominant cycles of said subject's breathing signal, a fraction of one dominant cycle of said subject's breathing signal, multiple fractions of a plurality of dominant cycles, and a phase-shifted portion of said subject's breathing signal.
21. The system of claim 14, wherein said identified breathing pattern is one of: Eupnea, Bradypnea, Tachypnea, Hypopnea, Apnea, Kussmaul, Cheyne-Stokes, Biot's, Ataxic, Apneustic, Agonal, and Thoracoabdominal.
22. The system of claim 14, further comprising using said identified breathing pattern to determine whether said subject has any of: pulmonary fibrosis, pneumothorax, Infant Respiratory Distress Syndrome, asthma, bronchitis, and emphysema.
23. The system of claim 14, wherein said instantaneous volumes comprise one of: a calibrated volume and an uncalibrated volume.
24. The system of claim 14, wherein said reference breathing signal consists of a volume signal generated using a depth capable sensor in one of: a simulated environment by a respiratory expert or a computerized mannequin, and a clinical environment with patients with identified respiratory diseases.
25. The system of claim 14, wherein identifying a breathing pattern for said subject is performed by an artificial intelligence program.
US14/044,043 2013-09-11 2013-10-02 Breathing pattern identification for respiratory function assessment Abandoned US20150094606A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US14/044,043 US20150094606A1 (en) 2013-10-02 2013-10-02 Breathing pattern identification for respiratory function assessment
US14/223,402 US10201293B2 (en) 2013-09-11 2014-03-24 Non-contact monitoring of spatio-temporal respiratory mechanics via depth sensing
DE102014219495.4A DE102014219495A1 (en) 2013-10-02 2014-09-25 IDENTIFYING BREATHING PATTERNS FOR ASSESSING BREATHING FUNCTION
US14/553,659 US10219739B2 (en) 2013-10-02 2014-11-25 Breathing pattern identification for respiratory function assessment
DE102015222498.8A DE102015222498A1 (en) 2013-10-02 2015-11-13 IDENTIFICATION OF BREATHING PATTERNS TO ASSESS THE RESPIRATORY FUNCTION

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/044,043 US20150094606A1 (en) 2013-10-02 2013-10-02 Breathing pattern identification for respiratory function assessment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/553,659 Continuation US10219739B2 (en) 2013-10-02 2014-11-25 Breathing pattern identification for respiratory function assessment

Publications (1)

Publication Number Publication Date
US20150094606A1 true US20150094606A1 (en) 2015-04-02

Family

ID=52673392

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/044,043 Abandoned US20150094606A1 (en) 2013-09-11 2013-10-02 Breathing pattern identification for respiratory function assessment
US14/553,659 Active 2034-06-21 US10219739B2 (en) 2013-10-02 2014-11-25 Breathing pattern identification for respiratory function assessment

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/553,659 Active 2034-06-21 US10219739B2 (en) 2013-10-02 2014-11-25 Breathing pattern identification for respiratory function assessment

Country Status (2)

Country Link
US (2) US20150094606A1 (en)
DE (2) DE102014219495A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9602917B2 (en) 2013-10-15 2017-03-21 Stratoscientific, Inc. Acoustic collection system for handheld electronic devices
US9414155B2 (en) 2013-10-15 2016-08-09 Stratoscientific, Inc. Acoustic collection system for handheld electronic devices
USD746802S1 (en) * 2013-10-15 2016-01-05 Stratoscientific, Inc. Electronic device case with stethoscope
US11864926B2 (en) 2015-08-28 2024-01-09 Foresite Healthcare, Llc Systems and methods for detecting attempted bed exit
US10206630B2 (en) 2015-08-28 2019-02-19 Foresite Healthcare, Llc Systems for automatic assessment of fall risk
US10398353B2 (en) 2016-02-19 2019-09-03 Covidien Lp Systems and methods for video-based monitoring of vital signs
JP6050535B1 (en) * 2016-03-10 2016-12-21 株式会社ネクステッジテクノロジー Gauze for measurement
US10453202B2 (en) * 2016-06-28 2019-10-22 Foresite Healthcare, Llc Systems and methods for use in detecting falls utilizing thermal sensing
US10939824B2 (en) 2017-11-13 2021-03-09 Covidien Lp Systems and methods for video-based monitoring of a patient
CA3086527A1 (en) 2018-01-08 2019-07-11 Covidien Lp Systems and methods for video-based non-contact tidal volume monitoring
DE102018209359A1 (en) * 2018-06-12 2019-12-12 Siemens Healthcare Gmbh Determination of a characteristic characterizing an intentional stopping of a breath
US11617520B2 (en) * 2018-12-14 2023-04-04 Covidien Lp Depth sensing visualization modes for non-contact monitoring
WO2020146326A1 (en) * 2019-01-07 2020-07-16 Cates Lara M B Computer-based dynamic rating of ataxic breathing
US11315275B2 (en) 2019-01-28 2022-04-26 Covidien Lp Edge handling methods for associated depth sensing camera devices, systems, and methods
US11850026B2 (en) 2020-06-24 2023-12-26 The Governing Council Of The University Of Toronto Remote portable vital signs monitoring
US10991190B1 (en) 2020-07-20 2021-04-27 Abbott Laboratories Digital pass verification systems and methods
US20220225892A1 (en) * 2021-01-20 2022-07-21 Vivonics, Inc. System and method for detecting and/or monitoring the presence of at least one of pneumothorax, hemopneumothorax, or hemothorax in a living subject using one or more light sources and one or more light detectors
US20220270241A1 (en) * 2021-02-22 2022-08-25 Fresenius Medical Care Holdings, Inc. System and Method for Measuring Edema at Home and Measurement Pattern for Use Therewith

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6062216A (en) * 1996-12-27 2000-05-16 Children's Medical Center Corporation Sleep apnea detector system
US6519486B1 (en) 1998-10-15 2003-02-11 Ntc Technology Inc. Method, apparatus and system for removing motion artifacts from measurements of bodily parameters
GB0014059D0 (en) * 2000-06-09 2000-08-02 Chumas Paul D Method and apparatus
US7778691B2 (en) * 2003-06-13 2010-08-17 Wisconsin Alumni Research Foundation Apparatus and method using synchronized breathing to treat tissue subject to respiratory motion
US7385614B2 (en) * 2005-03-28 2008-06-10 Silicon Graphics, Inc. Compositing images using logically divided object space
US8075499B2 (en) * 2007-05-18 2011-12-13 Vaidhi Nathan Abnormal motion detector and monitor
CA2720871A1 (en) * 2008-04-03 2009-10-08 Kai Medical, Inc. Non-contact physiologic motion sensors and methods for use
EP2387384B1 (en) * 2009-01-16 2016-05-04 Koninklijke Philips N.V. Method for automatic alignment of a position and orientation indicator and device for monitoring the movements of a body part
US8866621B2 (en) * 2009-02-25 2014-10-21 Empire Technology Development Llc Sudden infant death prevention clothing
US9205279B2 (en) * 2009-07-17 2015-12-08 Cyberheart, Inc. Heart tissue surface contour-based radiosurgical treatment planning
US20130124148A1 (en) * 2009-08-21 2013-05-16 Hailin Jin System and Method for Generating Editable Constraints for Image-based Models
EP2478708A2 (en) * 2009-09-18 2012-07-25 Logos Technologies Inc. Method for the compression of an aerial image using an image prediction based on a depth model of the terrain
US8649562B2 (en) 2009-10-06 2014-02-11 Koninklijke Philips N.V. Method and system for processing a signal including at least a component representative of a periodic phenomenon in a living being
US20110251493A1 (en) 2010-03-22 2011-10-13 Massachusetts Institute Of Technology Method and system for measurement of physiological parameters
WO2011150362A2 (en) * 2010-05-28 2011-12-01 Mayo Foundation For Medical Education And Research Sleep apnea detection system
CN103503024B (en) 2011-04-14 2016-10-05 皇家飞利浦有限公司 For from the equipment of feature signal extraction information and method
US8790269B2 (en) 2011-05-09 2014-07-29 Xerox Corporation Monitoring respiration with a thermal imaging system
US8945150B2 (en) * 2011-05-18 2015-02-03 Restoration Robotics, Inc. Systems and methods for selecting a desired quantity of follicular units
TW201309266A (en) * 2011-08-26 2013-03-01 Univ Nat Taiwan System and method of respiratory detection
US9226691B2 (en) 2012-06-01 2016-01-05 Xerox Corporation Processing a video for tidal chest volume estimation
US20130324874A1 (en) 2012-06-01 2013-12-05 Xerox Corporation Minute ventilation estimation based on chest volume
US9301710B2 (en) 2012-06-01 2016-04-05 Xerox Corporation Processing a video for respiration rate estimation
US8971985B2 (en) 2012-06-01 2015-03-03 Xerox Corporation Minute ventilation estimation based on depth maps
US9155475B2 (en) 2012-06-26 2015-10-13 Xerox Corporation Enabling hybrid video capture of a scene illuminated with unstructured and structured illumination sources
US9141868B2 (en) 2012-06-26 2015-09-22 Xerox Corporation Contemporaneously reconstructing images captured of a scene illuminated with unstructured and structured illumination sources
US8792969B2 (en) 2012-11-19 2014-07-29 Xerox Corporation Respiratory function estimation from a 2D monocular video
US20150094606A1 (en) 2013-10-02 2015-04-02 Xerox Corporation Breathing pattern identification for respiratory function assessment
US20150073281A1 (en) 2013-09-11 2015-03-12 Xerox Corporation Generating a flow-volume loop for respiratory function assessment
US9504426B2 (en) 2013-12-06 2016-11-29 Xerox Corporation Using an adaptive band-pass filter to compensate for motion induced artifacts in a physiological signal extracted from video
US20150245787A1 (en) 2014-03-03 2015-09-03 Xerox Corporation Real-time video processing for respiratory function analysis

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6290654B1 (en) * 1998-10-08 2001-09-18 Sleep Solutions, Inc. Obstructive sleep apnea detection apparatus and method using pattern recognition
US20040111040A1 (en) * 2002-12-04 2004-06-10 Quan Ni Detection of disordered breathing
US20070276278A1 (en) * 2003-04-10 2007-11-29 Michael Coyle Systems and methods for monitoring cough
US20080159591A1 (en) * 2007-01-03 2008-07-03 Science Applications International Corporation Human detection with imaging sensors
US20080275349A1 (en) * 2007-05-02 2008-11-06 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US20100063419A1 (en) * 2008-09-05 2010-03-11 Varian Medical Systems Technologies, Inc. Systems and methods for determining a state of a patient
US20110144517A1 (en) * 2009-01-26 2011-06-16 Miguel Angel Cervantes Video Based Automated Detection of Respiratory Events
US20110040217A1 (en) * 2009-07-22 2011-02-17 Atreo Medical, Inc. Optical techniques for the measurement of chest compression depth and other parameters during cpr
US20110190598A1 (en) * 2010-01-31 2011-08-04 Vladimir Shusterman Health Data Dynamics, Its Sources and Linkage with Genetic/Molecular Tests

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Aliverti, Andrea, et al. "Optoelectronic plethysmography in intensive care patients." American Journal of Respiratory and Critical Care Medicine 161.5 (2000): 1546-1552. *
Chen, Huijun, et al. "Color structured light system of chest wall motion measurement for respiratory volume evaluation." Journal of Biomedical Optics 15.2 (2010): 026013. *
Nozoe, Masafumi, Kyoshi Mase, and Akimitsu Tsutou. "Regional Chest Wall Volume Changes During Various Breathing Maneuvers in Normal Men." Journal of the Japanese Physical Therapy Association 14.1 (2011): 12-18. *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10869611B2 (en) 2006-05-19 2020-12-22 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US10663553B2 (en) 2011-08-26 2020-05-26 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10339654B2 (en) 2013-01-24 2019-07-02 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US10219739B2 (en) 2013-10-02 2019-03-05 Xerox Corporation Breathing pattern identification for respiratory function assessment
US11297284B2 (en) 2014-04-08 2022-04-05 Udisense Inc. Monitoring camera and mount
US11100636B2 (en) 2014-07-23 2021-08-24 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10438349B2 (en) 2014-07-23 2019-10-08 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9697599B2 (en) 2015-06-17 2017-07-04 Xerox Corporation Determining a respiratory pattern from a video of a subject
US10660541B2 (en) 2015-07-28 2020-05-26 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US11382568B2 (en) 2016-03-21 2022-07-12 International Business Machines Corporation Obtainment of cleaned sequences relating to a center of gravity
US11633155B2 (en) 2016-03-21 2023-04-25 International Business Machines Corporation Obtainment of cleaned sequences relating to a center of gravity
US10602988B2 (en) 2016-03-21 2020-03-31 International Business Machines Corporation Obtainment of cleaned sequences relating to a center of gravity
USD854074S1 (en) 2016-05-10 2019-07-16 Udisense Inc. Wall-assisted floor-mount for a monitoring camera
US20190200937A1 (en) * 2016-08-23 2019-07-04 Koninklijke Philips N.V. Device, system and method for detection of an asthma attack or asthma of a subject
US10925548B2 (en) * 2016-08-23 2021-02-23 Koninklijke Philips N.V. Device, system and method for detection of an asthma attack or asthma of a subject
CN109640819A (en) * 2016-08-23 2019-04-16 皇家飞利浦有限公司 For the asthma attack of test object or the equipment, system and method for asthma
US20220361824A1 (en) * 2016-09-06 2022-11-17 Nutrits, Ltd. Generating a breathing alert
US11389119B2 (en) * 2016-09-06 2022-07-19 Photorithm, Inc. Generating a breathing alert
CN109952056A (en) * 2016-11-02 2019-06-28 皇家飞利浦有限公司 Equipment, system and method for CO2 monitoring
RU2780966C2 (en) * 2016-11-10 2022-10-04 Конинклейке Филипс Н.В. Selection of image obtaining parameter for image generation system
USD855684S1 (en) 2017-08-06 2019-08-06 Udisense Inc. Wall mount for a monitoring camera
WO2019104108A1 (en) * 2017-11-22 2019-05-31 Udisense Inc. Respiration monitor
US10874332B2 (en) * 2017-11-22 2020-12-29 Udisense Inc. Respiration monitor
US20190150798A1 (en) * 2017-11-22 2019-05-23 Udisense Inc. Respiration monitor
KR102108630B1 (en) * 2018-07-30 2020-05-07 주식회사 모두의연구소 Method for respiration measurement using ultrasonic wave based on image segmentation and device for the same
KR20200013380A (en) * 2018-07-30 2020-02-07 주식회사 모두의연구소 Method for respiration measurement using ultrasonic wave based on image segmentation and device for the same
US20200129117A1 (en) * 2018-10-29 2020-04-30 Altair Medical Ltd. Condition Detector
USD900429S1 (en) 2019-01-28 2020-11-03 Udisense Inc. Swaddle band with decorative pattern
USD900430S1 (en) 2019-01-28 2020-11-03 Udisense Inc. Swaddle blanket
USD900431S1 (en) 2019-01-28 2020-11-03 Udisense Inc. Swaddle blanket with decorative pattern
USD900428S1 (en) 2019-01-28 2020-11-03 Udisense Inc. Swaddle band
US11232576B2 (en) * 2019-03-20 2022-01-25 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for determining motion of an object in imaging
CN110269624A (en) * 2019-07-16 2019-09-24 浙江伽奈维医疗科技有限公司 A kind of respiration monitoring device and its monitoring of respiration method based on RGBD camera

Also Published As

Publication number Publication date
US20150094597A1 (en) 2015-04-02
DE102014219495A1 (en) 2015-04-02
DE102015222498A1 (en) 2016-05-25
US10219739B2 (en) 2019-03-05

Similar Documents

Publication Publication Date Title
US10219739B2 (en) Breathing pattern identification for respiratory function assessment
US8792969B2 (en) Respiratory function estimation from a 2D monocular video
US10506952B2 (en) Motion monitor
Massaroni et al. Contactless methods for measuring respiratory rate: A review
US11612338B2 (en) Body motion monitor
Yu et al. Noncontact respiratory measurement of volume change using depth camera
US20150073281A1 (en) Generating a flow-volume loop for respiratory function assessment
Reyes et al. Tidal volume and instantaneous respiration rate estimation using a volumetric surrogate signal acquired via a smartphone camera
CN111565638B (en) System and method for video-based non-contact tidal volume monitoring
JP6054584B2 (en) Treatment system having a patient interface for acquiring a patient's life state
JP5980720B2 (en) Video processing for respiratory rate estimation
RU2635479C2 (en) System for measuring vital activity indicators using camera
US9443289B2 (en) Compensating for motion induced artifacts in a physiological signal extracted from multiple videos
CN105491942A (en) Monitoring system and method for monitoring the hemodynamic status of a subject
JP5323532B2 (en) Respiration measurement method and respiration measurement device
US20170055878A1 (en) Method and system for respiratory monitoring
CN106413533A (en) Device, system and method for detecting apnoea of a subject
Chatterjee et al. Real-time respiration rate measurement from thoracoabdominal movement with a consumer grade camera
CN115334959A (en) Sleep state detection for apnea-hypopnea index calculation
WO2023179757A1 (en) Lung function detection method, system and apparatus, and computer device and storage medium
US20220233096A1 (en) Systems and methods for non-contact respiratory monitoring
JP6415462B2 (en) Apparatus and method for determining respiratory volume signal from image data
Nesar et al. Improving touchless respiratory monitoring via lidar orientation and thermal imaging
US20220240790A1 (en) Systems and methods for non-contact heart rate monitoring
Lorato Video respiration monitoring: Towards remote apnea detection in the clinic

Legal Events

Date Code Title Description
AS Assignment

Owner name: XEROX CORPORATION, CONNECTICUT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MESTHA, LALIT KESHAV;SHILLA, ERIBAWEIMON;BERNAL, EDGAR A.;AND OTHERS;SIGNING DATES FROM 20130927 TO 20131001;REEL/FRAME:031327/0606

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION