GB2547357A - Analysis of breathing data - Google Patents

Analysis of breathing data

Info

Publication number
GB2547357A
GB2547357A
Authority
GB
United Kingdom
Prior art keywords
subject
time
entropy value
calculating
entropy
Prior art date
Legal status
Granted
Application number
GB1705631.8A
Other versions
GB201705631D0 (en)
GB2547357B (en)
Inventor
Lasenby Joan
Hessel De Boer Willem
Current Assignee
Pneumacare Ltd
Original Assignee
Pneumacare Ltd
Priority date
Filing date
Publication date
Application filed by Pneumacare Ltd filed Critical Pneumacare Ltd
Priority to GB1705631.8A (GB2547357B)
Priority claimed from GB1600152.1A (GB2532883B)
Publication of GB201705631D0
Publication of GB2547357A
Application granted
Publication of GB2547357B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B 5/0062 Arrangements for scanning
    • A61B 5/0064 Body surface scanning
    • A61B 5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/087 Measuring breath flow
    • A61B 5/0873 Measuring breath flow using optical means
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/113 Measuring movement of the entire body or parts thereof occurring during breathing
    • A61B 5/1135 Measuring movement of the entire body or parts thereof occurring during breathing by monitoring thoracic expansion
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/16 Measuring arrangements characterised by the use of optical techniques for measuring the deformation in a solid, e.g. optical strain gauge
    • G01B 11/167 Measuring arrangements characterised by the use of optical techniques for measuring the deformation in a solid by projecting a pattern on the object
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/0014 Biomedical image inspection using an image reference approach
    • G06T 7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T 7/20 Analysis of motion
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30076 Plethysmography

Abstract

A method of analysing the breathing of a subject 104, comprising: projecting a pattern of radiation onto at least part of a body of the subject 104; recording at each of a plurality of instances in time image data representing a three dimensional configuration of at least a portion of the projected pattern at that instance in time; processing the image data to produce a data array representing a shape of at least part of the body of the subject 104 as a function of time; and calculating a value indicative of disorder for one or more values or parameters of or calculated from the data array. The value indicative of disorder may be an entropy value, and may comprise an approximate entropy value or a sample entropy value. Calculating the entropy value may comprise calculating an entropy value for a comparative parameter, which may comprise a phase angle or a principal angle and/or spread. Respiratory rate, tidal volume, minute ventilation, peak inspiratory flow, fractional inspiratory time, peak and/or mean inspiratory and/or expiratory flow measures and the percentage contribution of the chest and/or abdomen to the tidal volume may also be calculated.

Description

ANALYSIS OF BREATHING DATA
Field
The invention relates to a method of analysing the breathing of a subject and to a related measurement system.
Background
Monitoring the breathing and/or lung functions of a subject provides data that is often useful for detecting and/or diagnosing the presence of lung disease and/or pulmonary obstructions and/or other conditions. Accurate measurement and analysis of breathing and/or lung functions generally requires the use of devices such as masks or mouthpieces coupled to the mouth of a subject to be monitored. These devices tend to be uncomfortable and/or invasive and do not lend themselves to monitoring difficult and/or frail subjects, for example neonatal subjects.
Respiratory Inductance Plethysmography is another method of monitoring the breathing and/or lung functions of a subject, which is carried out by measuring the movement of the chest and abdominal wall. This approach is based on the theory that the sum of the changes in the anteroposterior diameter of the abdomen and the rib cage correspond to changes in lung volume. Two elastic bands that incorporate sinusoid wire coils are used to measure this diameter. One of the bands is placed around the rib cage under the armpits and the other around the abdomen at the level of the umbilicus.
Expansion of the bands changes the self-inductance of the coils, allowing the change in the anteroposterior diameter to be measured. The bands are connected to an oscillator and demodulation electronics, which convert the readings into digital waveforms. These waveforms are then used to analyse and/or describe various measures of complex respiratory patterns, for example respiratory rate, tidal volume, minute ventilation, peak inspiratory flow, fractional inspiratory time, peak and/or mean inspiratory and/or expiratory flow measures and the percent contribution of the chest and/or abdomen to the tidal volume.
The fitting of the bands on the subject to be monitored requires manoeuvring of the subject, which can in some cases be problematic in difficult and/or frail subjects.
Another known method and associated apparatus for monitoring the breathing and/or lung functions of a subject is disclosed in WO2010066824, the contents of which are incorporated herein by reference. This method involves an optical measurement of the three-dimensional shape of the body of the subject over time and derives breathing volume data from the changes in shape. Breathing data of the subject may be compared with breathing data associated with healthy and disease states in a statistical database to diagnose the subject’s lung function.
It is desirable to provide a method of processing the measurement data that allows a more accurate diagnosis to be made.
Summary
According to a first aspect of the invention, there is provided a method of analysing the breathing of a subject as specified in claim 1.
According to a second aspect of the invention, there is provided a measurement system as specified in claim 11.
Optional features are specified in the dependent claims.
Brief Description of the Drawings
Certain embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a block diagram of a measuring system;
Figure 2 shows a flow diagram of the method;
Figure 3 shows six breathing modes of a healthy subject;
Figure 4 shows six breathing modes of a subject suffering from COPD;
Figure 5 shows the time evolution of the modes of Figure 3;
Figure 6 shows the time evolution of the modes of Figure 4;
Figure 7 shows a flow diagram of the breath detection method;
Figure 8 shows a measurement of breathing volume;
Figure 9 shows an exemplary embodiment of the measurement unit of Figure 1;
Figure 10 shows five measures of synchronicity between chest and abdomen volumetric contributions to tidal breathing volume;
Figure 11 shows a first example of principal angle and spread between chest and abdomen volumetric contributions to tidal breathing volume; and
Figure 12 shows a second example of principal angle and spread between chest and abdomen volumetric contributions to tidal breathing volume.
Detailed Description of Certain Embodiments
Figure 1 shows a block diagram of an embodiment of a measuring system for measuring a 4-dimensional data array of the shape of the body of a subject and analysing the data. The measurement system includes a measurement unit 1 for measuring the shape of a body as a function of time. The system may measure movement of a subject using a method as disclosed in patent application WO 2010/066824 or any other method providing similar measurement data. The measurement data of the measurement unit may be stored in a measurement storage 2 for later use or use at a different location.
The measurement system further includes an apparatus 3 for analysing the measurement data. The analysis is carried out by a processing unit 4 of the apparatus; the analysis may be carried out independently from the measurement. The processing unit may receive the measurement data from the measurement storage 2, for example when the measurement unit and the apparatus are at different locations or measurement data is analysed at a time after the measurements were taken. It may also receive the measurement data directly from the measurement unit 1, for example when the measurement unit is located near the apparatus or the measurement unit and apparatus are integrated in a single device. The processing unit can store data in and retrieve data from a data storage 5, such as disc drives or Solid State Drives (SSDs). The data storage may include a signature database 6, which can be used for classifying measured subjects. Results of the analysis can be output to and shown on a display unit 7.
The processing unit 4 and the data storage 5 may form part of a computing device. Its components may be interconnected using a system bus. Other components may include a volatile memory such as a random access memory; an input/output interface; input devices such as a keyboard; and one or more interfaces for communicating with e.g. the measurement unit 1 and the measurement storage 2.
The measurement unit 1 measures movement of a part of the body, in particular of the chest wall and/or the abdomen. Normally, the anterior part of the body is measured; alternatively or in addition a posterior or more or less lateral part may also be measured. When measuring the breathing of a subject, such as a human or an animal, the movement of the chest wall is usually measured. Instead, or in addition, the movement of the abdomen may be measured. The method produces reliable results both for tidal breathing and for forced manoeuvres; the classification of the subject is consistent between measurements on tidal breathing and forced manoeuvres, where the latter might contain movement artefacts. The present method allows classification of a subject for respiratory diseases using tidal-only breathing data. Prior art methods, using for example flow-volume loops, require the subject to perform a forced breathing manoeuvre. The method is therefore suitable for determining the breathing pattern of healthy and sick subjects, even of unconscious subjects.
The shape of the body is measured using a measuring grid having a 2-dimensional pattern of grid points. Such a grid can be realized by illuminating the body with a chessboard pattern as disclosed in the above mentioned patent application WO 2010/066824. The 3-dimensional position of the body is measured at each grid point as a function of time. The grid has n1 by n2 grid points; n1 × n2 may be 10 × 15, resulting in ng = 150 grid points. When images of the body are captured with a video camera operating at 30 frames per second with a measurement time of 4 seconds, the position of the body for each grid point will be measured at nt = 30 × 4 = 120 points in time. The present analysis method, providing a decomposition of the body surface movement into modes, requires only a relatively short period of measurements, usually only a few seconds in duration, without the need for the subject to perform a forced manoeuvre. An analysis of breathing modes can be made on just a small section, e.g. 6-10 seconds, of tidal breathing. As a consequence, the method can be applied even to subjects that are not capable of forced manoeuvres and allows a quicker measurement of patients.
The data of a measurement of a subject output by the measurement unit 1 includes nt frames of n1 × n2 3-dimensional positions. If a position is represented by three spatial dimensions (x, y, z), the data output is nt × n1 × n2 × 3 numbers. The data can be organized in a 4-dimensional array A, two dimensions being for the grid, one dimension for the position and one dimension for the time. Alternatively, the data can be organized in a 3-dimensional array, one dimension being for the grid, one dimension for the position and one dimension for the time. The grid can be represented by a 1-dimensional array having elements 1 to ng, where each element relates to one grid point of the grid. The data may be input to the measurement storage 2 or to the processing unit 4.
It should be noted that the word ‘dimension’ is used in two different contexts. When used in the context of an array, it refers to the array dimension. When used in the context of position, it refers to the spatial dimension.
The processing unit 4 analyses the data array A by carrying out the three steps of mapping, decomposition and signature formation, shown as steps 10, 11 and 12, respectively in the flow diagram of Figure 2.
The first step is mapping the data array A onto a 2-dimensional array M(i,j). The mapping may be carried out such that each row of M corresponds to a grid point. When the different rows of the 2-dimensional array correspond to different grid points, the principal modes that can be derived from the decomposition of M correspond to properties of the chest wall movement as a function of the grid point. Such spatially separated information is useful in diagnosis. If the grid includes ng = n1 × n2 grid points, the 2-dimensional array will have ng rows relating to the grid points. The index i of the array M(i,j) relates to the grid points and runs from 1 to ng.
The data array may be mapped onto M such that each row includes a plurality of positions at the points of time where the measurements were taken. In this embodiment, the data array includes positions for each grid point at nt moments in time and each position represents a point (x, y, z) in 3-dimensional space. These nt positions are mapped onto the rows of the 2-dimensional array M such that M will have 3nt columns. The index j of M(i,j) runs from 1 to 3nt. If n1 = 10, n2 = 15 and nt = 120, the matrix M is 150 × 360. When the rows correspond to the grid points and the columns to the position of these grid points on the body as a function of time, the principal modes will be surface modes of the measured subject, representing breathing patterns. Many lung diseases have characteristic surface modes, making the modes particularly useful in diagnosis.
The data array may be mapped onto M in other ways. For example, it may be mapped onto an array M of nt × (3·n1·n2). Each of the nt rows corresponds to a particular point of time and gives the n1·n2 positions of the grid at that moment in time. The resulting modes will represent time signatures.
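A minimal sketch of both mappings is given below. This is an illustration only, not the patented implementation; the array names, the use of NumPy, and the random stand-in data are assumptions, while the example sizes (a 10 × 15 grid measured over 120 frames) are taken from the text above.

```python
import numpy as np

# Assumed example dimensions from the text: a 10 x 15 grid measured over
# 120 frames (30 frames per second for 4 seconds).
n1, n2, nt = 10, 15, 120
ng = n1 * n2

# Hypothetical measurement array A with shape (n1, n2, 3, nt): two grid
# dimensions, one spatial dimension (x, y, z) and one time dimension.
A = np.random.rand(n1, n2, 3, nt)

# Mapping 1: rows correspond to grid points, columns to the 3*nt coordinate
# samples of that grid point over time -> M is ng x 3nt (150 x 360 here).
# Row-major flattening orders each row as all x samples, then y, then z.
M_grid = A.reshape(ng, 3 * nt)

# Mapping 2 (alternative): rows correspond to points in time, columns to the
# 3*n1*n2 coordinates of the whole grid at that instant -> M is nt x 3*ng.
M_time = np.moveaxis(A, -1, 0).reshape(nt, 3 * ng)
```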
The second step in the analysis is the decomposition of the array M into principal modes by the processing unit 4, shown as step 11 in Figure 2. Several methods for decomposition may be used, of which Singular Value Decomposition (SVD) is a fast method that works efficiently on very large matrices; see e.g. ‘A multilinear singular value decomposition’ by Lathauwer et al., SIAM J. Matrix Anal. Appl., Vol. 21, No. 4, pp. 1253-1278. The matrix M of ng rows by 3nt columns is decomposed so that M = USVᵀ. U is a unitary matrix of ng × ng, S is a diagonal matrix of ng × 3nt, and Vᵀ is a unitary matrix of 3nt × 3nt. Vᵀ is the conjugate transpose of V. The columns of U are the eigenvectors of MMᵀ; they are orthogonal and normalised. The diagonal values of S are the square roots of the nonnegative eigenvalues of both MᵀM and MMᵀ. The columns of V are eigenvectors of MᵀM. The elements of U and S are independent of time. The time dependence of the measured motion is included in the matrix Vᵀ.
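A minimal sketch of this decomposition step, assuming NumPy's SVD routine; the random stand-in for M is only so that the snippet runs on its own. Note that NumPy returns the singular values as a vector rather than as the full diagonal matrix S.

```python
import numpy as np

# Stand-in for the ng x 3nt matrix M produced by the mapping step.
ng, nt = 150, 120
M = np.random.rand(ng, 3 * nt)

U, s, Vt = np.linalg.svd(M, full_matrices=True)
# U  : ng x ng       -- columns are the (time-independent) surface modes
# s  : min(ng, 3nt)  -- singular values, i.e. the diagonal of S
# Vt : 3nt x 3nt     -- rows give the normalised time evolution of each mode
```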
The third step of the analysis carried out by the processing unit 4 is the formation of a signature of the subject, shown as step 12 in Figure 2. The signature may be stored in the data storage 5 of the apparatus 3. The columns of the matrix U represent the principal modes of the breathing pattern of the measured body. When the motion of the chest wall of a subject is measured, the principal modes are the breathing patterns of the subject. The measured motion is best characterized by low-numbered columns of U, i.e. by the modes that give the largest contribution to the measured variation of the surface during breathing.
The first columns of U may represent a static orientation of the body, which may not be relevant for classification of a subject. For example, the first two or three columns represent the main planar orientation of the subject. The columns representing static orientation may be omitted from the signature. The number of columns of U selected for the signature can be determined by choosing those columns of U that correspond to diagonal elements of S which are above a predetermined value, possibly excluding columns relating to static orientation. In the analysis of breathing patterns, 2, 3, 4, 5 or 6 modes are normally sufficient for classification. Higher order modes can usually be ignored, because they provide a very small contribution to the breathing pattern and may represent only noise.
In addition to using columns of U for the signature, parts of the matrix S and/or V may be used. For example, the weight in the matrix S may be included in the signature, which could be done by weighting the columns of U with the corresponding eigenvalues of S. The method of combining the information in U, S and/or V may be dependent on the type of classification used.
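A minimal sketch of how such a signature might be assembled. The singular-value threshold, the number of skipped static-orientation modes and the function name are illustrative assumptions, not values prescribed by the patent.

```python
import numpy as np

def form_signature(U, s, threshold=1.0, skip_static=2, weight=True):
    """Keep the modes whose singular values exceed a threshold, optionally
    skipping the leading modes that describe static body orientation and
    weighting each kept mode by its singular value. Hypothetical helper."""
    keep = [i for i in range(skip_static, len(s)) if s[i] > threshold]
    modes = U[:, keep]
    if weight:
        modes = modes * s[keep]   # weight each column by its singular value
    return modes
```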
Since U is time independent, it is useful for forming the signature. Moreover, measurement data taken at different times or taken from different subjects do not have to be normalised to a specific, predetermined breathing rate before deriving the eigenmodes, as is necessary in some prior art analysis methods. The important surface movement patterns of the chest are contained in the first few principal modes of U. Therefore, more information is obtained about the subject than from a volume measurement using a prior art method, as the latter does not distinguish between modes. In determining the volume, all localised surface movement information is discarded. It is this information that is used in a classification. The modes are the most important parts of the breathing pattern and are therefore included in the signature.
Figure 3 shows a graphical presentation of the first six breathing modes of a healthy subject during tidal breathing and the first part of a forced expiration. The order of the modes in the Figure is from left to right and top to bottom. Grid rows go from abdomen to chest with ascending row number. Grid columns go from the left side of the chest wall to the right side, with ascending column number. The presentation can be displayed on the display unit 7. The visualisation is obtained by taking a column of U and unwrapping it by distributing its ng elements over an n1 × n2 array. The result is shown in the Figure as a surface on an n1 × n2 grid. The example shown in Figure 3 has a grid of 15 × 10 grid points. The vertical axis shows the dimensionless, normalised position of the mode.
Figure 4 shows the first six breathing modes from the matrix U of a Chronic Obstructive Pulmonary Disease (COPD) patient during tidal breathing and the first part of a forced expiration. The first two modes of both Figures 3 and 4 represent static orientation of the body.
The importance of each breathing mode is represented by the corresponding eigenvalue of the matrix S. The six eigenvalues corresponding to the six modes of the healthy subject in Figure 3 are: 134.78; 93.36; 16.57; 5.30; 0.73; 0.29. The small eigenvalues for the last two modes indicate that these modes are less important because they contribute only a very small amount to the measured chest motion; these modes may be at least partly noise. The six eigenvalues for the six modes of the COPD patient in Figure 4 are: 160.19; 109.11; 18.56; 4.71; 1.42; 0.57.
The columns of V give a normalised time series of how the modes move in the x, y and z direction. Figures 5 and 6 show plots of the first six columns of V corresponding to the six breathing modes shown in Figures 3 and 4, respectively. The horizontal axis of each of the six plots represents the frame number from 1 to 1500, subdivided into three parts for the coordinates x, y and z. The vertical axis of the plots represents the number by which the coordinate must be multiplied to obtain the time evolution of the mode. The first 500 frames represent the value with which the x coordinates of all grid points of a corresponding mode must be multiplied. The next 500 frames represent the values for the y coordinate and the last 500 frames for the z coordinate.
The time evolution of each of the modes or a combination of modes can be visualised on the display unit 7 by combining the one or more mode shapes of U, the one or more corresponding magnitudes of S and the one or more corresponding time dependences of V. The combination can be made by calculating M = USVᵀ for the relevant modes.
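A minimal sketch of such a restricted reconstruction, assuming U, s and Vt have been obtained from the SVD step above; the function name and the chosen mode indices are illustrative.

```python
import numpy as np

def reconstruct_modes(U, s, Vt, mode_indices):
    """Reconstruct the surface motion contributed by a chosen subset of modes,
    i.e. U S V^T restricted to those modes. Hypothetical helper."""
    idx = np.asarray(mode_indices)
    return U[:, idx] @ np.diag(s[idx]) @ Vt[idx, :]

# e.g. the motion explained by the third and fourth modes only (0-based indices)
# M_34 = reconstruct_modes(U, s, Vt, [2, 3])
```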
Instead of or in addition to using the modes of U in the signature, features extracted from U may be used in the signature. One such feature is the 2-dimensional frequencies of each mode, which may be formed via a 2D Fourier transformation of a mode. The 2-dimensional frequencies of certain modes are characteristic for specific disease states, such as disorders of the lungs, and are therefore suitable for classification.
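A minimal sketch of extracting the 2-dimensional frequency content of one mode, assuming the mode is unwrapped onto the n1 × n2 grid as described for Figure 3 and that the magnitude spectrum is used as a feature vector; these choices are assumptions for illustration.

```python
import numpy as np

def mode_frequencies(U, mode_index, n1, n2):
    """2D Fourier magnitude spectrum of one mode, unwrapped onto the grid."""
    mode_surface = U[:, mode_index].reshape(n1, n2)
    spectrum = np.fft.fft2(mode_surface)
    return np.abs(np.fft.fftshift(spectrum))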
The features extracted from the decomposed matrix A and combined in a signature of a measured subject may be used to classify the subject. The classification is shown as a step 13 in Figure 2, which can be included or can be omitted from the method. In the classification step the signature of the subject is compared to previously generated signatures stored in the database 6. Each of these previously acquired signatures is representative of a particular disease state, including, but not limited to, COPD, asthma, pneumonia, or a healthy state. The result of the classification is an association of the breathing signature of the subject with a particular disease state or the healthy state or the closeness to such a state.
The database 6 stores the signatures of a large number of subjects, each labelled with its disease or healthy state. The disease state may include the disease progression. The measurement data arrays of the subjects are analysed, usually offline, and the signatures formed. A clustering algorithm forms clusters of signatures for each of the disease labels and stores them in the database. A simple form of classification consists of reducing each of the cluster members to an average value and an approximate radius, for example a standard deviation. The result is a representative signature for each disease class and an expected spread of the members of this class from the representative signature. In another form of classification the labelled cluster members or their extracted feature vectors are kept in the database for use in a subsequent classification via a k-nearest neighbours method. The database can be updated and refined when additional labelled signatures become available. The larger the number of labelled signatures, the more accurate is the classification. Good results are obtained when each cluster has at least 40 signatures. The classification is improved if the signatures in the database have been obtained under similar conditions, such as lighting environment, breathing manoeuver, grid coverage, grid size and patient position. On the other hand, the population of signatures should capture as wide a range of conditions in which the system is to be expected to perform.
The actual classification of a subject is carried out in the processing unit 4. The distance of the signature to the representative signature of each cluster is determined, a procedure known as a k-means search. Alternatively, the label of the signature can be assigned as the most common label among its k nearest neighbours, where ‘nearest’ implies ‘smallest distance’. The output of the classification step can be that the subject is in the disease state corresponding to that of the representative signature to which its signature is nearest, or, for each disease label, a probability that the subject is in that particular disease state.
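A minimal sketch of the two classification schemes described above, operating on flattened signature vectors. The Euclidean distance metric, the container layouts and the function names are assumptions; the patent does not prescribe a particular implementation.

```python
import numpy as np

def classify_nearest_cluster(signature, representatives):
    """representatives: dict mapping disease label -> representative signature."""
    distances = {label: np.linalg.norm(signature - rep)
                 for label, rep in representatives.items()}
    return min(distances, key=distances.get)

def classify_knn(signature, labelled_signatures, k=5):
    """labelled_signatures: list of (label, signature) pairs kept in the database."""
    ranked = sorted(labelled_signatures,
                    key=lambda item: np.linalg.norm(signature - item[1]))
    nearest_labels = [label for label, _ in ranked[:k]]
    return max(set(nearest_labels), key=nearest_labels.count)
```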
The method as shown in Figure 2 includes mapping, decomposition and signature formation, together forming the analysis of the data array. The results of the analysis may be stored and used later for classification by a physician or for other purposes. The analysis may also be followed by a classification using a signature database, forming a four-step method shown by elements 10-13. In a special embodiment the analysis is preceded directly by a step in which a subject is measured and the data array is filled; the analysis may again be followed by a classification.
Rows and columns of a matrix can be interchanged in the composition of the matrices U, S and V. Although the above described embodiments refer to a rectangular coordinate system (x, y, z), any other coordinate system may be used, e.g. a cylindrical coordinate system having its axis along the longitudinal axis of the body.
The signature formed from the measured data becomes more accurate if the data on which the signature is based relates to a whole number of breaths. This could be regarded as ‘normalising’ the data input to the signature decomposition. It can be achieved by analysing the data and manually selecting the desired integer number of breaths. The following method allows the selection of breaths to be automated, making the analysis less time consuming.
Figure 7 shows a flow diagram of a breath detection method for identifying individual breaths or a series of breaths. Measured parameter values representing a breathing pattern over time are input to the method. The parameter can e.g. be a measured breathing volume or a chest wall motion over time. The breathing volume may be obtained from a spirometer. If the method of Figure 7 is used together with the method of Figure 2, the breathing volume or chest wall movement over time may be derived from the above matrix M or from USVᵀ. In the latter case, noise in the method can be reduced by using only the most significant modes of U in the calculation of the motion. The parameter can also be an average of the z-position over the measurement grid.
Figure 8 shows a measurement 30 of breathing volume V over time t of a subject; the measurement includes tidal breathing and one forced manoeuvre. The vertical axis shows the volume in litres, the horizontal axis shows time in frame numbers, taken at 30 frames per second. A measurement of chest wall motion would have resulted in a similar plot. The measured parameter values are used as input for the method of Figure 7.
In the first step 20 of the method of Figure 7 an average value of the parameter values is formed. The average may be an exponential moving average. When the parameter values V are given at discrete points of time i, the average A can be determined as A(i) = (1 − α)·A(i−1) + α·V(i). The coefficient α can have any value in the range [0, 1]; good results for a wide range of inputs are obtained using a value of α equal to 0.008. The moving average is shown in Figure 8 as line 31.
The average may be determined by other algorithms, e.g. using a low-pass filter having a Gaussian response function. In a specific embodiment the filter can have a size of three times the sample rate and a sigma equal to the sample rate, where the sample rate is the number of frames per second of the measurement unit.
In the second step 21 of the method the parameter is divided into time segments. Each segment lies between subsequent intersections of the parameter and the average. Figure 8 shows three consecutive intersections 32-34, defining two successive segments 35 and 36. The method is not sensitive to the way in which the intersections are determined. An intersection may be defined as the point in time where (V(i) − A(i)) changes sign.
In the third step 22 of the method an extreme value of the parameter within each segment is determined. An extreme value may be determined by comparing a value of the parameter with its two neighbouring values. The method will find the maximum 37 as the extreme value for segment 35 and the minimum 38 for segment 36, as shown in Figure 8. If the extreme value of a segment is a maximum, it corresponds to the end of an inhalation and the start of an exhalation. If the extreme value is a minimum, it corresponds to the end of an exhalation and the start of an inhalation. The method can be made more robust by checking whether the extreme value V(i) is larger than the average A(i) for a maximum and whether the extreme value is smaller than the average for a minimum. Alternatively, the signed area of (V(i) − A(i)) over the segment can be determined; an extreme value in a segment having a positive area is a maximum, corresponding to the start of an exhalation, and an extreme value in a segment having a negative area is a minimum, corresponding to the start of an inhalation.
Since each segment has only one extreme value of the parameter, it must include either the start of an exhalation (being the end of an inhalation) or the end of an exhalation (being the start of an inhalation). Hence, if a segment includes the start of an exhalation, the next segment will include the end of the exhalation. An individual breath, including an inhalation and an exhalation, corresponds to the period between two successive maximum values or between two successive minimum values. A selection of measurement data extending over a certain number p of whole breaths can be made by starting the selection at the point of time of an extreme value of the parameter and stopping the selection at the 2pth extreme value of the parameter following the starting point.
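A minimal sketch of the breath detection method of Figure 7: an exponential moving average of the parameter, segmentation at the intersections of parameter and average, and one extreme value per segment. The function and variable names are assumptions; α = 0.008 is the value quoted in the text.

```python
import numpy as np

def detect_breath_extremes(v, alpha=0.008):
    """Return the indices of the per-segment extreme values of the parameter v."""
    avg = np.empty_like(v, dtype=float)
    avg[0] = v[0]
    for i in range(1, len(v)):                       # exponential moving average
        avg[i] = (1 - alpha) * avg[i - 1] + alpha * v[i]

    diff = v - avg
    # segment boundaries: points where (V(i) - A(i)) changes sign
    boundaries = np.where(np.diff(np.sign(diff)) != 0)[0] + 1
    boundaries = np.concatenate(([0], boundaries, [len(v)]))

    extremes = []
    for start, end in zip(boundaries[:-1], boundaries[1:]):
        if end <= start:
            continue
        # positive segment area -> maximum (start of an exhalation),
        # negative segment area -> minimum (start of an inhalation)
        if diff[start:end].sum() >= 0:
            extremes.append(start + int(np.argmax(v[start:end])))
        else:
            extremes.append(start + int(np.argmin(v[start:end])))
    return extremes

# A whole number p of breaths spans the samples between one extreme value and
# the 2p-th extreme value that follows it.
```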
The method can be carried out in an apparatus similar to apparatus 3 in Figure 1, including a processing unit 4 and a data storage 5, forming a computing device. In a real-time implementation the apparatus is connected to the output of a measurement unit and measured values are fed into the apparatus. The data may be input in frames at regular intervals, e.g. 30 frames per second. The measured values are pre-processed if necessary to obtain the parameter values required for the method, e.g. a breathing volume or a chest wall motion. In an embodiment in which the above moving average method is used, the moving average is updated at each frame being input into the apparatus, the check for the start of a new segment is carried out, and the moving extreme value is updated.
The method of Figure 7 can be used to determine the start of a signature decomposition by monitoring the breathing of a subject under test. The breath detection method performs an on-line scan of the breaths until it has detected a sufficient number of good quality contiguous breaths, upon which the apparatus 4 is signalled to commence signature decomposition without additional operator intervention.
In an off-line implementation of the method the apparatus is connected to a measurement storage, from which the measured values are retrieved. The processing can be carried out similarly to the on-line implementation.
The method of identifying breaths can be used for ventilator settings in respiratory treatment, e.g. for determining volume drift over time in patients with an obstructive disease. It can also be used in the context of ventilator triggering, where the detection of an inhalation (i.e. the end of an exhalation) triggers the administration of oxygen.
The method can also be used for analysing the shape of individual breaths, as is common in the veterinary field. The method is used to segment the measurements in whole breaths.
The method is suitable as a step in forming a signature in the method of Figure 2. The method may be applied to the data array output by the measurement unit 1 in Figure 1, e.g. by using the average z-position over the measurement grid as parameter. The steps of mapping, decomposition and signature formation can then be applied to a part of the data array corresponding to an integer number of breaths.
Each of the methods of Figure 2 and 7 can be carried out by a computing device in which software is stored in a memory and is coded to execute the method when running in the processing unit. The software may be stored on a data carrier.
Referring now to Figure 9, there is shown an exemplary embodiment of the measurement unit 1 of Figure 1. The measurement unit 1 includes first and second cameras 100, 101 and a projector 102 attached to a mounting frame 103. The projector 102 projects a pattern of radiation onto the chest and abdominal area of a subject 104 lying prone on a surface 105. The cameras 100, 101 are mounted on either side of the projector and are angled toward the subject 104 at an angle of between about 10 degrees and about 40 degrees, depending on the spacing between the frame 103 and the subject 104.
In use, the measurement unit 1 acquires from the cameras 100, 101, across a predetermined time period and at each of a plurality of instances in time, image data representing a three dimensional configuration of the projected pattern at that instance in time. The image data is then processed to produce an array of breathing data representative of the shape of the subject 104 as a function of time. This data may then be used in accordance with the methods described above.
The data array may also be divided into two or more sub-arrays each representing a three dimensional configuration of a respective part of the subject 104, for example the subject’s chest and abdomen or the left and right sides of the subject’s trunk region. The sub-arrays may be processed to approximate a volume of each of the parts of the subject over time and may be displayed through the display unit 7, for example to assist a clinician with diagnosing a condition in the subject 104.
The sub-arrays or volumes may also be processed to calculate one or more comparative parameters representing the change over time in the relative volume contribution of each of the parts of the subject 104.
Figure 10 illustrates one such comparative parameter, namely the phase angle φ between chest or rib cage measurements RC and abdomen measurements AB, where:
It will be appreciated by those skilled in the art that the phase angle φ may be calculated across the entire time period (i.e. an overall phase) or across a subset of the entire time period (i.e. a windowed phase). The subset may correspond to a single breath (i.e. a per-breath phase) or a predetermined number of breaths.
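The equation defining the phase angle φ appears only as part of Figure 10 in the original and is not reproduced in the text above. As a hedged illustration of one common way to estimate a phase angle between rib cage and abdomen signals (not necessarily the formula used in the patent), the instantaneous-phase difference of the analytic signals can be averaged over the whole record or over a per-breath window:

```python
import numpy as np
from scipy.signal import hilbert

def phase_angle_deg(rc, ab):
    """Illustrative only: mean instantaneous-phase difference (degrees) between
    rib cage (rc) and abdomen (ab) signals. An assumed definition, not the
    patent's own formula, which is given in its Figure 10."""
    phase_rc = np.unwrap(np.angle(hilbert(rc - np.mean(rc))))
    phase_ab = np.unwrap(np.angle(hilbert(ab - np.mean(ab))))
    return np.degrees(np.mean(phase_rc - phase_ab))
```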
Figures 11 and 12 illustrate examples of two further comparative parameters, namely the principal angle θ1, θ2 and spread S1, S2 between chest or rib cage measurements RC and abdomen measurements AB.
These parameters are calculated using principal component analysis, by calculating a 2×2 covariance matrix M = Cov(AB − mean(AB), RC − mean(RC)) and diagonalising the covariance matrix, where:
The principal angle θ1, θ2 is the angle (in degrees) between the x = y line and the first principal component p1, while the spread is the ratio:
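A minimal sketch of this calculation, under the assumption that the spread is the ratio of the minor to the major eigenvalue of the 2×2 covariance matrix; the exact ratio is shown only as an equation image in the original, so this choice, like the function name, is an assumption.

```python
import numpy as np

def principal_angle_and_spread(ab, rc):
    """PCA of the (AB, RC) point cloud: angle of the principal direction
    measured from the x = y line (degrees) and an assumed eigenvalue-ratio
    spread. Illustrative sketch only."""
    data = np.vstack((ab - np.mean(ab), rc - np.mean(rc)))
    cov = np.cov(data)                          # 2 x 2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    p1 = eigvecs[:, -1]                         # principal direction
    angle_from_x_axis = np.degrees(np.arctan2(p1[1], p1[0]))
    angle_from_xy_line = angle_from_x_axis - 45.0
    spread = eigvals[0] / eigvals[-1]           # assumed minor/major ratio
    return angle_from_xy_line, spread
```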
The angle change is also useful to track, which is done by taking three successive points P0, P1, P2, treating these two-dimensional vectors as three-dimensional by making the z-coordinate 0, performing the three-dimensional cross product a = (P2 − P1) × (P1 − P0), and making the product of its sign and magnitude the angle change.
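A minimal sketch of this angle-change measure, assuming P0, P1 and P2 are 2-element arrays; the function name is illustrative.

```python
import numpy as np

def angle_change(p0, p1, p2):
    """Embed three successive 2-D points in 3-D (z = 0), take the cross product
    of the two difference vectors, and return the product of its sign and
    magnitude, as described above."""
    a = np.cross(np.append(p2 - p1, 0.0), np.append(p1 - p0, 0.0))
    return np.sign(a[2]) * np.linalg.norm(a)
```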
Further parameters that may be useful to calculate and display on the display unit 7 include respiratory rate, tidal volume, minute ventilation, peak inspiratory flow, fractional inspiratory time, peak and/or mean inspiratory and/or expiratory flow measures and/or the percent contribution of the chest and/or abdomen to the tidal volume.
One particularly useful analysis to carry out is to determine or quantify the entropy of one or more of the measured values or parameters described above. Specifically, the approximate entropy or sample entropy, or an equivalent measure of disorder, of some of the aforementioned parameters has been found to be useful in determining a comparative state of wellness or illness.
In particular, it has been found that the entropy associated with breathing data derived from a healthy subject is higher, in many cases considerably higher, than that derived from an ill patient.
Entropy in the form of sample entropy (SampEn), approximate entropy (ApEn) or fuzzy entropy (FuzzyEn) can be calculated as described, for example, in section 2 of CHEN, W. et al., ‘Measuring complexity using FuzzyEn, ApEn, and SampEn’, Medical Engineering & Physics, 31(1), 61-68, 2009.
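A minimal sketch of a sample entropy calculation, following the standard SampEn definition summarised in the Chen et al. reference cited above. The parameter defaults (m = 2, tolerance r = 0.2 times the standard deviation) are conventional choices, not values prescribed by the patent.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Standard SampEn: -ln(A/B), where B counts template matches of length m
    and A counts matches of length m + 1, within tolerance r, excluding
    self-matches. m and r_factor are conventional defaults."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def count_matches(length):
        # The final template is excluded so that the counts for lengths m and
        # m + 1 are taken over the same number (n - m) of templates.
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(dist <= r)) - 1   # exclude the self-match
        return count

    b = count_matches(m)        # matches of length m
    a = count_matches(m + 1)    # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float('inf')
```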
The above embodiments are to be understood as illustrative examples. Further embodiments are envisaged. It is to be understood that any feature described in relation to any one embodiment may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the embodiments, or any combination of any other of the embodiments. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the accompanying claims.

Claims (11)

Claims
1. A method of analysing the breathing of a subject, the method comprising the steps of: projecting a pattern of radiation onto at least part of a body of the subject; recording at each of a plurality of instances in time image data representing a three dimensional configuration of at least a portion of the projected pattern at that instance in time; processing the image data to produce a data array representing a shape of at least part of the body of the subject as a function of time; and calculating a value indicative of disorder for one or more values or parameters of or calculated from the data array.
2. A method according to claim 1, wherein the value indicative of disorder is an entropy value.
3. A method according to claim 2, wherein the entropy value comprises an approximate entropy value or a sample entropy value.
4. A method according to claim 2 or 3, further comprising the steps of: dividing the data array into two or more sub-arrays each representing a three dimensional configuration of a respective part of the projected pattern or of the subject; processing the sub-arrays to approximate a volume of each of the parts of the projected pattern or of the subject over time.
5. A method according to claim 4, wherein the entropy calculation step comprises calculating an entropy value for one or more of the approximated volumes and/or for an aggregate of the approximated volumes.
6. A method according to claim 4 or 5 further comprising calculating a comparative parameter representing the change over time in the relative volume contribution of each of the parts of the projected pattern or of the subject.
7. A method according to claim 6, wherein the entropy calculation step comprises calculating an entropy value for the comparative parameter.
8. A method according to claim 6 or 7, wherein the comparative parameter comprises a phase angle or a principal angle and/or spread.
9. A method according to any one of claims 2 to 8, further comprising calculating one or more further parameters selected from respiratory rate, tidal volume, minute ventilation, peak inspiratory flow, fractional inspiratory time, peak and/or mean inspiratory and/or expiratory flow measures and the percent contribution of the chest and/or abdomen to the tidal volume.
10. A method according to claim 9, wherein the entropy calculation step comprises calculating an entropy value for the one or more further parameters.
11. A measurement system comprising: a measurement unit for measuring a shape of at least part of a body of a subject as a function of time; and a processing unit for analysing measurement data of the measurement unit; wherein the measurement system is configured to perform a method according to claim 1.
12. A measurement system according to claim 11, wherein the measurement unit comprises first and second cameras and a projector attached to a mounting frame, wherein the projector is configured to project the pattern of radiation and the first and second cameras are configured to obtain the image data.
Amendments to claims have been filed as follows
Claims
1. A method of analysing the breathing of a subject, the method comprising the steps of: projecting a pattern of radiation onto at least part of a body of the subject; recording at each of a plurality of instances in time image data representing a three dimensional configuration of at least a portion of the projected pattern at that instance in time; processing the image data to produce a data array representing a shape of at least part of the body of the subject as a function of time; and calculating a value indicative of disorder for one or more values or parameters of or calculated from the data array, wherein the value indicative of a disorder is an entropy value.
2. A method according to claim 1, wherein the entropy value comprises an approximate entropy value or a sample entropy value.
3. A method according to claim 1 or 2, further comprising the steps of: dividing the data array into two or more sub-arrays each representing a three dimensional configuration of a respective part of the projected pattern or of the subject; processing the sub-arrays to approximate a volume of each of the parts of the projected pattern or of the subject over time.
4. A method according to claim 3, wherein the entropy calculation step comprises calculating an entropy value for one or more of the approximated volumes and/or for an aggregate of the approximated volumes.
5. A method according to claim 3 or 4 further comprising calculating a comparative parameter representing the change over time in the relative volume contribution of each of the parts of the projected pattern or of the subject.
6. A method according to claim 5, wherein the entropy calculation step comprises calculating an entropy value for the comparative parameter.
7. A method according to claim 5 or 6, wherein the comparative parameter comprises a phase angle or a principal angle and/or spread.
8. A method according to any preceding claim, further comprising calculating one or more further parameters selected from respiratory rate, tidal volume, minute ventilation, peak inspiratory flow, fractional inspiratory time, peak and/or mean inspiratory and/or expiratory flow measures and the percent contribution of the chest and/or abdomen to the tidal volume.
9. A method according to claim 8, wherein the entropy calculation step comprises calculating an entropy value for the one or more further parameters.
10. A measurement system comprising: a measurement unit for measuring a shape of at least part of a body of a subject as a function of time; and a processing unit for analysing measurement data of the measurement unit; wherein the measurement system is configured to perform a method according to claim 1.
GB1705631.8A 2013-11-27 2013-11-27 Analysis of breathing data Active GB2547357B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1705631.8A GB2547357B (en) 2013-11-27 2013-11-27 Analysis of breathing data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1705631.8A GB2547357B (en) 2013-11-27 2013-11-27 Analysis of breathing data
GB1600152.1A GB2532883B (en) 2012-11-27 2013-11-27 Analysis of breathing data

Publications (3)

Publication Number Publication Date
GB201705631D0 GB201705631D0 (en) 2017-05-24
GB2547357A 2017-08-16
GB2547357B GB2547357B (en) 2017-10-25

Family

ID=58744731

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1705631.8A Active GB2547357B (en) 2013-11-27 2013-11-27 Analysis of breathing data

Country Status (1)

Country Link
GB (1) GB2547357B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110367986A (en) * 2019-06-25 2019-10-25 南京理工大学 Breath signal approximate entropy feature extracting method based on EWT

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005253608A (en) * 2004-03-10 2005-09-22 Sumitomo Osaka Cement Co Ltd Condition analysis apparatus
US20060082590A1 (en) * 2004-10-14 2006-04-20 Stevick Glen R Method and apparatus for dynamic space-time imaging system
WO2010066824A2 (en) * 2008-12-11 2010-06-17 Pneumacare Ltd Method and apparatus for monitoring an object
WO2011132118A2 (en) * 2010-04-21 2011-10-27 Koninklijke Philips Electronics N.V. Respiratory motion detection apparatus


Also Published As

Publication number Publication date
GB201705631D0 (en) 2017-05-24
GB2547357B (en) 2017-10-25

Similar Documents

Publication Publication Date Title
US10390739B2 (en) Analysis of breathing data
US10970926B2 (en) System and method for lung-volume-gated x-ray imaging
US11089974B2 (en) Monitoring the location of a probe during patient breathing
US9402565B2 (en) Method and system for analyzing craniofacial complex images
US9665935B2 (en) Image processing device and program
US8676298B2 (en) Medical image alignment apparatus, method, and program
JP4560643B2 (en) Ventilation distribution measurement method using respiratory CT images
US11950940B2 (en) System and method for determining radiation parameters
Houssein et al. Estimation of respiratory variables from thoracoabdominal breathing distance: a review of different techniques and calibration methods
GB2547357A (en) Analysis of breathing data
JP6273940B2 (en) Image analysis apparatus, image photographing system, and image analysis program
US10441192B2 (en) Impedance shift detection
JP6852545B2 (en) Image display system and image processing equipment
TW201941219A (en) Diagnosis support program
WO2014117205A1 (en) Method and system for clinical measurement of lung health
White Prediction and Characterization of Lung Tissue Motion during Quiet Respiration