WO2014153665A1 - System and method for monitoring a subject - Google Patents

System and method for monitoring a subject

Info

Publication number
WO2014153665A1
Authority
WO
WIPO (PCT)
Prior art keywords
model
subject
vector
signal
readings
Application number
PCT/CA2014/050316
Other languages
French (fr)
Other versions
WO2014153665A8 (en)
Inventor
Andrew W. ECKFORD
William H. GAGE
Original Assignee
Engage Biomechanics Inc.
Application filed by Engage Biomechanics Inc.
Publication of WO2014153665A1
Publication of WO2014153665A8


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/112 Gait analysis
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1122 Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6828 Leg
    • A61B5/6829 Foot or ankle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/30 General characteristics of devices characterised by sensor means
    • A61G2203/36 General characteristics of devices characterised by sensor means for motion
    • A61G7/00 Beds specially adapted for nursing; Devices for lifting patients or disabled persons
    • A61G7/05 Parts, details or accessories of beds
    • A61G7/057 Arrangements for preventing bed-sores or for supporting patients with burns, e.g. mattresses specially adapted therefor

Definitions

  • the present invention relates generally to systems and methods for monitoring a subject, and more specifically, systems and methods for monitoring a motion or position of the subject.
  • the position or motion of a subject may be relevant in a number of applications.
  • some medical conditions can be caused by prolonged compression of a part of a patient's body.
  • An example of such a medical condition includes pressure ulcers or bedsores, where the patient is more prone to develop pressure ulcers on parts of the body in contact for prolonged periods with a supporting surface (e.g. a bed during resting).
  • a technique to prevent or reduce pressure ulcers includes turning patients on a regular schedule, such as moving them from their side to their back or vice versa.
  • U.S. Patent Application Publication No. 2011/0263950 describes a system for monitoring medical conditions including pressure ulcers by detecting the orientation of a patient. The orientation is determined by calculating the angle of tilt of the patient using readings from an accelerometer attached to the user.
  • U.S. Patent No. 7,648,441 describes a gait therapy device that detects differences between a user's left step and right step using a gait sensor.
  • the gait therapy device is directed to the limited motion of steps and detecting coarse characteristics associated with a user's step.
  • U.S. Patent No. 8,036,842 describes a system for classifying body motion of a user by mounting a three-axis accelerometer on the torso of a user.
  • U.S. Patent No. 7,773,224 describes a system for identifying a motion type by matching an acceleration signature, generated from a motion sensor worn on the user, to a stored acceleration signature corresponding to a particular motion type.
  • the pattern recognition or matching techniques described may be complex (such as requiring generation and use of a hidden Markov Model to model human motion) and/or involve complex computations (such as Fast Fourier Transforms) that increase the processing power demands of the system.
  • one object of the present invention is to provide a computer based method for monitoring the motion or position of a subject, which overcomes at least one of the problems associated with known computer based motion monitoring systems and methods.
  • a method of monitoring a subject includes obtaining a plurality of readings from a plurality of sensors for detecting motion of the subject, generating a subject vector using the plurality of readings and selecting one of a set of model vectors based on the subject vector.
  • the set of model vectors correspond to a set of respective states.
  • the set of respective states include any one or more of a position and a motion.
  • the method also includes associating the plurality of readings with the respective state of the selected model vector.
  • a computer readable medium that includes computer executable instructions that when executed by a processor, cause the processor to obtain a plurality of readings from a plurality of sensors for detecting motion of the subject, generate a subject vector using the plurality of readings and select one of a set of model vectors based on the subject vector.
  • the set of model vectors correspond to a set of respective states.
  • the set of respective states include any one or more of a position and a motion.
  • the computer readable medium also includes computer executable instructions that when executed by the processor, cause the processor to associate the plurality of readings with the respective state of the selected model vector.
  • a system for monitoring a subject includes a plurality of sensors for detecting motion of the subject, and a processor coupled to memory and the plurality of sensors.
  • the memory stores computer executable instructions that when executed by the processor cause the processor to obtain a plurality of readings from the plurality of sensors, generate a subject vector using the plurality of readings and select one of a set of model vectors based on the subject vector.
  • the set of model vectors correspond to a set of respective states.
  • the set of respective states include any one or more of a position and a motion.
  • the memory also stores computer executable instructions that when executed by the processor, cause the processor to associate the plurality of readings with the respective state of the selected model vector.
  • Figure 1 is a block diagram of an example configuration of a system for monitoring a subject.
  • Figure 2 is a perspective view of a portion of an example subject with sensors attached thereon.
  • Figure 3 is a block diagram of an example configuration of a monitoring application.
  • Figure 4 is a flow diagram of example computer executable instructions for monitoring a subject.
  • Figure 5 is a flow diagram of example computer executable instructions for signal alignment.
  • Figure 6 is a plot of an example obtained signal.
  • Figure 7 is a plot of example posture signals extracted from an obtained signal.
  • Figure 8 is a plot of an example model signal.
  • Figure 9 is a plot of another example obtained signal.
  • Figure 10 is a plot of example gait signals extracted from an obtained signal.

DETAILED DESCRIPTION OF THE INVENTION
  • Referring to Figure 1, a block diagram of an example configuration of a system 100 for monitoring a subject, such as a person's motion, is provided.
  • the system 100 includes one or more sensors 102 for detecting motion of a subject, a server 104 for receiving and processing sensor readings obtained from the sensors 102 and one or more receiver devices 106 for communicating with the server 104, such as receiving alerts or other messages from the server 104.
  • the one or more sensors 102 can detect motion of a subject, such as human movement.
  • the sensor 102 may include an accelerometer used to measure proper acceleration (i.e. the type of acceleration associated with the phenomenon of weight experienced by a test mass residing in the frame of reference of the accelerometer) and thus measures weight per unit of mass, a quantity also known as specific force or "g-force".
  • the accelerometer may comprise a single axis (i.e. 1-axis) or multiple axis (e.g. 2-axis or 3-axis) accelerometer, or a combination of accelerometers with a lower number of axes to form an accelerometer with a higher number of axes.
  • the sensors 102 may include an accelerometer to measure proper acceleration, a gyroscope to measure angular velocity, a magnetometer to measure magnetic field, a contact switch or pressure sensor to determine whether a particular part is in contact with a surface (e.g. the ground), or any other sensor whose readings can be used to determine a state of the subject, including its position, orientation, relative position with respect to other parts of the subject, path of motion, velocity, and other associated properties. It will also be appreciated that a sensor 102 may obtain readings on one or more parameters that can be used to determine directly or indirectly a state of the subject, and/or that a plurality of sensors 102 can be used in combination to obtain readings on all the desired parameters.
  • the sensors 102 are preferably linked to a communication subsystem 108 capable of receiving data (such as instructions or queries) from, and sending data (such as sensor readings) to, the server 104 via a communication link.
  • the communication link can be established according to Bluetooth, Zigbee or other standardized or custom wireless communication protocols, or the communication link can be a wired connection.
  • the communication subsystem 108 may be configured to communicate with the server 104 via a network accessible through the communication link, such as a public network (e.g. the Internet) or a private network (e.g. intranet, local area network, private area network, USB hub, etc.). It will be appreciated that the communication subsystem 108 may incorporate or connect to additional components (e.g. an access point or bridge to connect to a wireless network) that may be needed in order to establish a communication link or access a communication network.
  • the server 104 can obtain, store and process readings from the sensors 102.
  • the server 104 includes a processor 110 and memory 112.
  • the memory 112 stores, among other data, one or more software applications that can be executed by the processor 110.
  • the server 104 also includes a communication subsystem 108 to send and receive data from the sensors 102, as discussed above, and to send and receive data from one or more receiver devices 106.
  • the memory 112 stores a monitoring application 114 executed by the processor 110.
  • the monitoring application 114 may maintain a database of sensor readings using database software such as MySQL, with additional software to interface between the database and the other components of the system 100. For example, a person may desire prompt notification of significant events, such as when certain kinds of activity occur or have not occurred. Events may be indicated by monitoring, analyzing and processing the sensor readings obtained from monitoring a subject.
  • the monitoring application 114 may include an alert component for generating a notification, such as to a monitoring station, a third-party or the monitored subject, when an event of interest has been determined to have occurred.
  • the notification can be provided in real-time immediately after the event has been detected, or within a specified amount of time after the event occurs.
  • notifications can be stored in a database, along with other identifiers such as date/time values and sensor identification etc.
  • Various other types of information based on the detected event can also be provided, and different types of information can be given different priority and/or prominence when displayed in the notification. For example, real-time notification alerts can be given the highest priority and most prominent display, whereas summaries of historical data may be given lower priority and/or less prominent display, as sketched below.
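  • The patent does not specify a data structure for notifications; the following is a minimal Python sketch of how prioritized notifications might be recorded and ordered for display. The names (Notification, priority levels) are illustrative assumptions, not taken from the source.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Minimal sketch of a notification record with a display priority.
# Lower priority numbers are treated as more urgent (0 = real-time alert).

@dataclass
class Notification:
    message: str
    sensor_id: str
    priority: int
    created_at: datetime = field(default_factory=datetime.utcnow)

def display_order(notifications):
    """Return notifications sorted so real-time alerts appear first."""
    return sorted(notifications, key=lambda n: (n.priority, n.created_at))

alerts = [
    Notification("Summary of last week's activity", "S1", priority=2),
    Notification("Subject unchanged posture for 2 hours", "S3", priority=0),
]
for n in display_order(alerts):
    print(n.priority, n.message)
```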
  • the processor 110 can include one or more processing units capable of performing the operations described herein, such as central processing units, field programmable gate arrays, graphics processing units, microprocessors, digital signal processors, etc.
  • the memory 112 can include random access memory, flash memory, read only memory, and other known storage elements. It will also be appreciated that other components can be included in the server 104 that are well known to those skilled in the art.
  • the receiver devices 106 can receive and/or request data from the server 104.
  • Each receiver device 106 includes a communication subsystem 108 to send and receive data from the server 104 in a similar manner as discussed above, such as through a wireless network.
  • the data from a server 104 can be provided to a receiver device 106 through a web page accessible by the receiver device 106, using a dedicated application loaded on the receiver device 106 (e.g. a mobile application), by messaging (e.g. email, text message) the receiver device 106, by manual or automated phone call to the receiver device 106, or any other means of communication supported by the receiver device 106.
  • the receiver devices 106 can include electronic devices having a display to display information regarding the sensor readings.
  • Example receiver devices 106 can include cellular phones, smart-phones, tablet computers, wireless organizers, personal digital assistants, desktop computers, laptops, handheld wireless communication devices, wirelessly enabled notebook computers, portable gaming devices, and the like.
  • Referring to Figure 2, a front perspective view of an example portion of a subject having sensors 102 attached thereon is provided.
  • a plurality of sensors 102 is attached to numerous parts of the right leg 202 of a user 200.
  • sensor readings from sensors 102 can be obtained as the user 200 takes walking steps to analyze the movement of the user's right foot, ankle, knee, and hip for gait analysis. In some cases, only a subset of such sensors may be needed. It can be appreciated that placement of the sensors 102 at different parts of the user 200 (e.g. arms, back, head, neck, etc.) can be used to monitor and analyze motion of such parts.
  • the obtained sensor readings can be processed and analyzed for a wide range of applications related to the study of kinesiology and human movement, such as gait analysis, sports training, physical therapy, design and evaluation of prosthetic limbs, etc.
  • the sensors 102 can be attached to a subject, such as a user 200, by any means, such as by using an adhesive or an elastic or tensor bandage.
  • the sensors 102 can be embedded in or attached to clothing, preferably form-fitting clothing, worn by the user 200.
  • the sensors 102 can be embedded in loosely worn clothing or housed within a pendant, wristwatch or other accessory worn by the user.
  • An example of a wearable wireless sensor 102 is the connectBlue® cB-OLP425i-26 device, having Bluetooth wireless communication capabilities and accelerometer sensors. It will be appreciated that for artificial parts (i.e. artificial body parts), the sensors 102 can be incorporated into or added onto the construction of the part, as is the case for robotic arms or prosthetic limbs, for example.
  • the collection of sensor readings obtained from the sensors 102 for a subject will be referred to herein as a "signal" or the "obtained signal”.
  • the sensor readings obtained from sensors 102 can be sampled and stored as sequences of finite-precision sampled data.
  • a signal may include more than one sequence of sensor readings as a sensor 102 may take multiple readings (e.g. 3-axis accelerometer can measure 3 sequences of measurements to represent the signal, one sequence of measurements for the x, y and z axes of the accelerometer, respectively) and/or multiple sensors 102 can be used to monitor a subject (see Figure 2).
  • Each sequence of measurements forming a signal will be referred to herein as a "signal sequence”.
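  • As an illustration of the terminology above, the following is a minimal sketch of one possible in-memory representation of an obtained signal made up of several signal sequences; the sensor names, axes and sample values are hypothetical.

```python
import numpy as np

# One possible representation of an "obtained signal": a mapping from a
# signal-sequence name (sensor and axis) to its sampled readings.

obtained_signal = {
    "ankle_accel_x": np.array([0.02, 0.05, 0.91, 0.40, -0.10]),
    "ankle_accel_y": np.array([0.98, 0.97, 0.55, 0.20, 0.95]),
    "ankle_accel_z": np.array([0.01, 0.03, 0.12, 0.80, 0.05]),
}

# Each entry is a "signal sequence"; together they form the signal.
for name, sequence in obtained_signal.items():
    print(name, len(sequence), "samples")
```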
  • The collection of model sensor readings corresponding to a known state of a subject (or a representative subject against which the observed subject is to be evaluated) will be referred to herein as a "model signal".
  • The model signals for all the known states will be referred to as the "set of model signals".
  • a signal "class” will be referred herein as a set of signals that share underlying characteristics, which can be recognized as belonging to the same set. For example, if an accelerometer sensor is attached to the ankle, and the wearer of the sensor walks a certain distance, the signal generated during each step can be considered part of a class of "gait signals". In another example, for a stationary state, the posture during a stationary period can be considered part of a class of "posture signals”.
  • a signal "feature" is a particular property of a class of signals that is common to all signals in that class. For example, gait signals have clearly identifiable features such as "heel strike" (the time at which the heel touches the ground) and "toe off" (the time when the toe leaves the ground). These features are consistent from stride to stride and can be identified by examining the signal. As another example, posture signals have a clear feature when the subject changes from one posture to another. Another feature of a posture signal is that the signal sequences are substantially constant.
  • the monitoring application 114 can be a software application stored in the memory 112 of the server 104.
  • the monitoring application 114 can request and receive sensor readings from one or more of the sensors 102, and provide information regarding such sensor readings to one or more receiver devices 106.
  • the monitoring application 114 preferably evaluates sensor readings obtained by sensors 102 (i.e. the obtained signal) against a set of model signals corresponding to a set of known states of the subject (or a representative subject to which the subject is to be evaluated against).
  • the monitoring application 114 generates a model vector in a common vector space, for each of the model signals.
  • the monitoring application 114 also generates a subject vector in the same vector space for the obtained signal so that the subject vector can be directly compared and analyzed against the set of model vectors.
  • the difference between the subject vector and a model vector can be indicative of the amount of similarity in the underlying obtained signal and model signal.
  • the monitoring application 114 selects the best "match" for the obtained signal among the set of model signals by matching the subject vector with a model vector. Once a match is determined, the obtained signal can be associated with information of the selected model signal, such as its class or other characteristics.
  • the monitoring application 114 includes a subject vector generator 304 for generating a subject vector using the obtained signal.
  • the monitoring application 114 also includes a model vector selector 308 for selecting one of the model vectors based on the subject vector.
  • the model vectors can correspond to a set of respective states, such as the orientation or position of the subject or a representative subject, or a motion performed by the subject or a representative subject.
  • the monitoring application 114 also includes a state evaluator 312 for evaluating the respective state associated with the selected model vector, associating the state of the selected model vector to the obtained signal, and providing information regarding such state to one or more receiver devices 106.
  • the monitoring application 114 optionally includes an alignment module 300 for aligning the obtained signal with one or more model signals in the model readings storage 302, prior to generating the subject vector.
  • the monitoring application 114 also optionally includes a vector space generator 306 for generating a vector space based on the set of model signals stored in the model readings storage 302.
  • the subject vector generator 304 generates a subject vector using the obtained signal.
  • multiple sensors 102 can be used to monitor the subject, and thus, the obtained signal for a single motion performed by the subject can include sensor readings from multiple sensors 102.
  • the subject vector generator 304 combines the sensor readings from multiple sensors 102 to create a single subject vector that can be compared and evaluated against model vectors.
  • the model vector selector 308 selects one of the model vectors from the set of model vectors stored in the model vector storage 310, based on the subject vector.
  • the set of model vectors may correspond to different states of the subject that is being monitored, or different states of a representative sample of subjects that the monitored subject is to be evaluated against.
  • the sensors 102 may be used to monitor the gait pattern of the user 200 such that a subject vector generated from the obtained signal represents a gait cycle of the user 200.
  • the set of model vectors can correspond to the gait patterns associated with healthy persons, as well as persons with different degrees of paralysis, injuries and other medical conditions that have been identified as affecting gait patterns.
  • the model vector selector 308 selects the model vector that most closely matches the subject vector, such as the model vector at the shortest distance to the subject vector in the vector space.
  • the selected model vector can be indicative that the monitored subject is in the same state associated with the selected model vector. For example, if the subject vector matches a model vector that corresponds to a gait pattern of a person having the condition of pes cavus (i.e. a high arch foot), then it can be determined that the user 200 also has the same condition.
  • the set of model vectors may be further differentiated by the types of pes cavus, and/or the extent that pes cavus is present, thus enabling the selected model vector to provide further detailed information on the state of the user 200.
  • the state evaluator 312 can evaluate the state associated with the selected model vector, and provide information regarding the state to a receiver device 106. For example, the state evaluator 312 may determine that the state associated with the selected model vector is a gait pattern for a person having the condition of pes cavus, and thus email a diagnostic report to a medical professional and/or the user 200.
  • the monitoring application 114 can optionally include an alignment module 300 for aligning sensor readings obtained from the sensors 102 with one or more model sensor readings.
  • the alignment module 300 may also extract a subset of the sequence of readings in the obtained signal. This may be necessary in cases such as where the sensor readings capture multiple instances of the same state, such as a repeated motion performed by the user 200 (e.g. multiple gait cycles), while the model sensor readings from which the model vectors are generated may correspond to a single instance of a state (e.g. single gait cycle).
  • a number of techniques may be used to align and extract the required window of sensor readings so that a comparison of the subject vector and set of model vectors can be performed. It will be appreciated that if the sensor readings are already aligned with underlying model signals, the alignment module 300 may not be needed.
  • the monitoring application 114 can optionally include a vector space generator 306 for generating a vector space that can be used to define the model vectors, as well as the subject vector, so that the vectors have a common basis to be compared and analyzed.
  • a number of techniques may be used to generate the vector space so that a comparison of the subject vector and set of model vectors can be performed. It will be appreciated that if the subject vector and the set of model vectors are already defined in a common vector space, the vector space generator 306 may not be needed.
  • any module, subsystem or component exemplified herein that executes instructions or operations may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
  • Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data, except transitory propagating signals per se.
  • Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the system 100 or accessible or connectable thereto. Any application or module herein described may be implemented using computer readable/executable instructions or operations that may be stored or otherwise held by such computer readable media.
  • an example set of computer executable instructions is provided for monitoring a subject, such as user 200.
  • a plurality of sensor readings is obtained (i.e. the obtained signal) from a plurality of sensors 102 for detecting motion of a subject.
  • the plurality of readings is optionally aligned with at least one of a set of model plurality of readings (i.e. the set of model signals). This may improve the signal correlation that is performed subsequently as part of 406.
  • a subject vector is generated using the plurality of readings.
  • a model vector is selected based on the subject vector. The selected model vector belongs to a set of model vectors corresponding to a set of respective states.
  • the set of respective states includes one or more positions and/or motions.
  • the positions and motions can include previous positions and motions of the subject, and/or positions and motions from a representative sample that the subject is to be evaluated against.
  • the plurality of readings is associated with the respective state of the selected model vector.
  • information based on the respective state associated with the plurality of readings is provided, such as a notification alert.
  • a plurality of sensor readings (i.e. the obtained signal) is obtained from a plurality of sensors 102 for detecting motion of a subject.
  • the sensors 102, in conjunction with the communication subsystem 108 of the server 104, can implement 400 (Figure 3).
  • the obtained signal is optionally aligned with at least one of the set of model signals.
  • the alignment module 300 in conjunction with the model readings storage 302 can implement 402 (Figure 3).
  • The timing of a series of human movements is subject to natural time distortion, even when the human subject attempts to repeat a movement exactly.
  • the interval between each footfall is slightly different and features within each step, such as the heel striking the ground or the toe leaving the ground, may vary in time within each step.
  • the peaks and other features of the obtained signal are never perfectly repeatable. Due to the inherent variation in human movement and variability in timing as to when the sensors 102 are configured to start and stop obtaining readings, the obtained signal may not always be aligned with the model signals, even when of the same class or performed by the same subject. If the signal sequences of the obtained signal and the model signals are not the same length, and/or the features between the model signals and the obtained signals are not aligned, alignment of the obtained signal to one or more of the set of model signals in the same class can be performed to reduce or remove the time distortion, and improve the quality of the subsequent comparison and matching that is to be performed at 406.
  • Signal alignment of two signals can be performed by identifying features between the two signals.
  • signal alignment may be done heuristically in accordance with a particular feature rule that can be identified, such as based on identifying local maxima/minima of the signals or abrupt transitions of the signals. This may be done manually by a user observing the signal, or automatically with signal processing techniques described herein or known in the art.
  • an example set of computer executable instructions is provided for performing signal alignment of an obtained signal to a model signal.
  • a "key" signal sequence in the obtained signal is identified for use as the basis for the alignment.
  • features from the signal sequence of the model signal corresponding to the key signal sequence of the obtained signal are identified.
  • corresponding features in the key signal sequence of the obtained signal are identified.
  • the key signal sequence in the obtained signal is resampled.
  • the resampling operation of 506 is repeated for all other signal sequences in the obtained signal.
  • a "key" signal sequence in the obtained signal is identified to extract features from, and use as the basis for the alignment.
  • the key signal sequence can be the signal sequence with the strongest signal, highest signal-to-noise ratio, or satisfying some other predetermined criteria. For example, in a gait signal, the signal sequence along an axis of the accelerometer may be closely aligned with the direction of motion such that the signal sequence is suitable for use as the key signal sequence. In another example, the key signal sequence can be determined based on the type of sensor used to obtain the signal sequence.
  • the signal sequence obtained from the heel strike switch may be the least ambiguous and can be used as the key signal sequence.
  • features from the corresponding key signal sequence of the model signal are identified.
  • the model signals may include additional information such as annotation of its features, and these annotated features may be stored in a database. Providing annotated feature information can reduce the signal processing operations performed during alignment.
  • Features can include local maxima/minima, abrupt transitions, and other detectable characteristics.
  • corresponding features in the key signal sequence of the obtained signal are identified. It will be appreciated that due to noise or individual variation, there may be a different number of features of different types between the key signal sequences of the model signal and the obtained signal. In an example, excess features can be discarded. The beginning and the end of a signal sequence can also be considered to be features. The features in the model signal are enumerated, and the number of samples between each pair of features is recorded.
  • the key signal sequence in the obtained signal is resampled between corresponding pairs of features to ensure that the number of samples between feature pairs is the same as in the corresponding key signal sequence of the model signal.
  • the resampling operation of 506 is repeated for all other signal sequences as performed for the key signal sequence such that all the signal sequences of the obtained signal are aligned in the same manner.
  • the model signals and the obtained signal have the same length and correspondence of samples, and the obtained signal can be used to generate a subject vector to be analyzed and evaluated against the set of model vectors.
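  • The following is a minimal Python sketch of the feature-based alignment and resampling described above, assuming the feature locations are already known as sample indices in both the obtained and model key signal sequences. The function names are illustrative; linear interpolation between feature pairs is one of the resampling options mentioned later in this description.

```python
import numpy as np

# Each segment of the obtained signal between consecutive feature pairs is
# resampled so it contains the same number of samples as the corresponding
# segment of the model signal.

def resample_segment(segment, target_len):
    """Linearly interpolate a 1-D segment to target_len samples."""
    if target_len <= 1:
        return segment[:1]
    old_x = np.linspace(0.0, 1.0, num=len(segment))
    new_x = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(new_x, old_x, segment)

def align_sequence(obtained_seq, obtained_feats, model_feats):
    """Resample obtained_seq between feature pairs to match model feature spacing."""
    pieces = []
    for k in range(len(model_feats) - 1):
        seg = obtained_seq[obtained_feats[k]:obtained_feats[k + 1] + 1]
        target = model_feats[k + 1] - model_feats[k] + 1
        resampled = resample_segment(np.asarray(seg, dtype=float), target)
        # drop the first sample of later pieces to avoid duplicating boundaries
        pieces.append(resampled if k == 0 else resampled[1:])
    return np.concatenate(pieces)

# Example: the start and end of each sequence are also treated as features.
obtained = np.sin(np.linspace(0, 3, 30))
aligned = align_sequence(obtained, obtained_feats=[0, 12, 29], model_feats=[0, 40, 99])
print(len(aligned))  # 100 samples, matching the model signal's length
```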
  • Referring to Figures 6 to 8, an example of the alignment process described above is illustrated with an example obtained signal 600 comprising posture signals 602, 604, 606.
  • In posture signals, the transition between two postures is a recurring pattern.
  • the signals may be analyzed for recurring patterns.
  • a posture signal is characterized by a static signal (when the subject is stationary), and abrupt transitions (when the subject is moving into a new posture).
  • Figure 6 illustrates an obtained signal 600 comprising three posture signals 602, 604, 606 from a sleeping subject who is wearing a three-axis accelerometer.
  • the obtained signal 600 includes two transitions T1, T2 between posture signals 602 and 604, and between posture signals 604 and 606. Transitions from one posture to another are detectable by using known signal processing techniques.
  • the posture changes represent trigger events, separating from one another the posture signals 602, 604, 606 contained in the obtained signal 600. Using the trigger events, the obtained signal 600 can be separated into three portions representing different posture signals 602, 604, 606 (see Figure 7), as sketched below. The portions of the obtained signal 600 containing the trigger events have been removed from the extracted posture signals 602, 604, 606.
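  • A minimal sketch of separating an obtained signal into posture signals at trigger events follows. The threshold-based transition detector and the guard window around each trigger are illustrative assumptions, not values from the patent.

```python
import numpy as np

def find_transitions(seq, threshold=0.5):
    """Return sample indices where the sample-to-sample change is abrupt."""
    diffs = np.abs(np.diff(seq))
    return np.where(diffs > threshold)[0]

def split_on_transitions(seq, transitions, guard=2):
    """Split seq into segments, dropping 'guard' samples around each trigger event."""
    segments, start = [], 0
    for t in transitions:
        segments.append(seq[start:max(start, t - guard)])
        start = t + guard
    segments.append(seq[start:])
    return [s for s in segments if len(s) > 0]

# Simulated z-axis readings: lying on the back, then on the side, then on the back.
signal = np.concatenate([np.full(50, 1.0), np.full(60, 0.0), np.full(40, 1.0)])
postures = split_on_transitions(signal, find_transitions(signal))
print([len(p) for p in postures])  # three posture segments of different lengths
```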
  • the model signal 800 includes the signal sequences 802, 804, 806 for the x, y and z axis of the accelerometer, respectively, corresponding to the state when the subject is lying perfectly still on his/her back.
  • the duration of the model signal 800 in Figure 8 is 100 seconds, or the equivalent number of samples.
  • the posture signals 602, 604, 606 have different lengths. In order to improve the quality of the matching process at 406, the number of samples must be exactly the same between the model signal 800 and each of the posture signals 602, 604, 606. Further, the samples between the signals must be in exact correspondence (e.g. the fifth sample in the model signal 800 must correspond to the same part and time of each of the fifth samples of the posture signals 602, 604, 606).
  • posture signals 602, 604, 606 in the obtained signal 600 are resampled so that they have the correct number (i.e. same number of samples as the model signal 800) and correspondence in the samples.
  • This can be achieved by performing linear interpolation (e.g. add new samples in between two posture signal samples by taking a linear combination of the two samples) or any other known signal processing technique.
  • the result of alignment will be three posture signals 602, 604, 606, each having the same number of samples as, and sample correspondence with, the model signal 800.
  • the obtained signal 600, and resulting posture signals 602, 604, 606 may include small variations, even when the subject may be maintaining a static position. Variations can be caused by fine movements of the subject, such as breathing or fidgeting of the subject while sleeping.
  • the obtained signal 600 can be pre-filtered to remove one or more of such variations, such as a variation relating to breathing, if such variation(s) is(are) not of interest. Removal of variations can be accomplished in a number of known manners, such as by using a low-pass filter. If, on the other hand, a variation such as breathing is to be monitored, in order to monitor the desired state of the subject, a high-pass filter or band-pass filter may be applied to the obtained signal 600 to remove the posture signal components.
  • the variations may be incorporated into a model signal corresponding to a subject sleeping on his/her back with an amount of fidgeting to indicate poor sleep quality.
  • the extent of variation can be represented by different model signals to correspond to different degrees and types of fidgeting.
  • the obtained signal can be partitioned into portions using trigger events in a similar manner as described in the example of extracting posture signal.
  • In the gait signal, there may be other key events that are used to align the features of the signals. Therefore, resampling may occur in two or more different ways within the same gait signal.
  • Figure 9 illustrates an obtained signal 900 comprising two and a half gait cycles.
  • the obtained signal 900 of the subject's gait is detected by only a single accelerometer placed on the lower leg, a heel contact sensor (to detect heel strike) and a toe contact sensor (to detect toe off).
  • the monitoring application 114 can, in this example, identify the initiation of a heel contact as the trigger event to separate the obtained signal 900 into individual gait signals 902, 904 of a single gait cycle in length (Figure 9). In an example, the monitoring application 114 can disregard portions of a gait signal that do not include a full gait cycle, such as partial gait signal 906.
  • the model signals (not shown) corresponding to gait signals also include measurements for an accelerometer, heel contact sensor, and toe contact sensor.
  • the monitoring application 114 aligns the features of the gait signals 902, 904 to the model signals.
  • the features to be aligned in the gait signals 902, 904 can include the initial heel contact A to the subsequent heel contact A (i.e. the entire gait cycle).
  • a feature to be aligned can also include the point of toe off B.
  • the monitoring application 114 may perform resampling independently for two portions of the gait cycle: (i) initial heel contact A to toe off B; and (ii) toe off B to next heel contact A. The resulting resampled gait signal will have the same number of samples and alignment of the identified features.
  • further alignment using additional features can be performed by identifying four regions to align independently, such as (i) initial heel contact A to toe off B; (ii) toe off B to next heel contact A; (iii) initial heel contact A to toe contact C; and (iv) heel off D to toe off B.
  • the accelerometer signal sequences of Figures 9 and 10 may be included in the obtained signal and model signals. It will be appreciated that the acceleration measurements alone may have distinguishing features that can be identified, such as the change in the measurements from heel contact and toe off, for example.
  • the obtained signal or one or more portions thereof can be aligned with one or more model signals to improve subsequent matching of the subject vector and model vectors that are derived from the obtained signal and model signals, respectively.
  • a subject vector is generated using the obtained signal.
  • the subject vector generator 304 and vector space generator 306 can implement 404 (Figure 3).
  • the subject vector is generated using the obtained signal by mapping the obtained signal to the same vector space as the set of model signals.
  • the subject vector can be generated according to the following:
  • a vector space is generated using the set of model signals. If there are m model signals, we form m corresponding orthonormal basis vectors using the Gram-Schmidt procedure or other known orthonormalization technique.
  • Each model signal is then mapped into the vector space.
  • the i-th "coordinate" of the model vector in the vector space, c[i], can be computed by taking the vector dot product of the model signal with the i-th basis vector.
  • Each model vector has m coordinates in the vector space: c[1], c[2], ..., c[m].
  • the set of m coordinates is found for every known model signal and stored as a separate model vector.
  • the subject vector of m coordinates can be generated in the vector space by repeating step b using the obtained signal instead of the model signal.
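  • A minimal sketch of this vector space generation and mapping follows, using the Gram-Schmidt procedure on the model signals and dot products with the resulting basis vectors. Variable names are illustrative, and each signal is treated as a single flat array of samples.

```python
import numpy as np

def gram_schmidt(model_signals):
    """Return orthonormal basis vectors, one per (independent) model signal."""
    basis = []
    for signal in model_signals:
        v = np.asarray(signal, dtype=float)
        for b in basis:
            v = v - np.dot(v, b) * b       # remove components along existing basis
        norm = np.linalg.norm(v)
        if norm > 1e-12:                   # skip (near-)dependent model signals
            basis.append(v / norm)
    return basis

def to_coordinates(signal, basis):
    """Map a signal to its m coordinates c[1]..c[m] in the vector space."""
    return np.array([np.dot(signal, b) for b in basis])

# Using the worked example appearing later in this description.
model_signals = [np.ones(8), np.array([1, 1, -1, -1, 1, 1, -1, -1], dtype=float)]
basis = gram_schmidt(model_signals)
subject_signal = np.array([1.0, 0.5, -0.1, -0.3, 0.3, 0.8, -0.4, -0.7])
print(np.round(to_coordinates(subject_signal, basis), 2))  # [0.39 1.45]
```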
  • a model vector is selected based on the subject vector.
  • the model vector selector 308 can implement 406 (Figure 3).
  • the Euclidean distance between the subject vector having coordinates s[1], s[2], ..., s[m] and each of the model vectors having respective coordinates c[1], c[2], ..., c[m] can be calculated as d = sqrt((s[1] - c[1])^2 + (s[2] - c[2])^2 + ... + (s[m] - c[m])^2).
  • The model vector selected can be chosen as the best match using one of the following rules: (a) Closest match: find the model vector at the smallest distance d from the subject vector and select it. (b) Closest match, with maximum: define a maximum distance D. Find the model vector with the smallest distance d. If that smallest distance is less than D, then select the closest model vector. If the smallest d is greater than the maximum distance D, then no model vectors are selected.
  • If a model vector is selected, the subject is determined to have a state associated with the selected model vector, such as the subject having experienced a motion that resulted in the model signal. If no model vectors are selected, the state of the subject can be determined to be of unknown type.
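  • A minimal sketch of these selection rules follows; the function name, example vectors and the maximum distance value are illustrative assumptions.

```python
import numpy as np

def select_model_vector(subject_vec, model_vectors, max_distance=None):
    """Return (index, distance) of the best match, or (None, None) if none qualifies."""
    distances = [np.linalg.norm(np.asarray(subject_vec) - np.asarray(m))
                 for m in model_vectors]
    best = int(np.argmin(distances))
    if max_distance is not None and distances[best] > max_distance:
        return None, None   # no model vector selected: state of unknown type
    return best, distances[best]

subject_vector = (0.2, 0.9, -0.4)
model_vectors = [(0.0, 1.0, 0.0), (1.0, 0.0, -1.0), (-1.0, -1.0, 0.0)]
print(select_model_vector(subject_vector, model_vectors, max_distance=1.0))
```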
  • In an example, the obtained signal is [+1.0, +0.5, -0.1, -0.3, +0.3, +0.8, -0.4, -0.7].
  • the set of model signals includes two signals, and thus two basis vectors are obtained from orthonormalizing the set of model signals.
  • the basis vectors are normalized to be unit vectors as:
  • Basis vector 1: [+1.0, +1.0, +1.0, +1.0, +1.0, +1.0, +1.0, +1.0]/sqrt(8)
  • Basis vector 2: [+1.0, +1.0, -1.0, -1.0, +1.0, +1.0, -1.0, -1.0]/sqrt(8)
  • the subject vector in the vector space defined by the basis vectors is (0.39, 1.45).
  • the set of model vectors includes:
  • Model vector 1: (+1, +1)
  • Model vector 2: (+1, -1)
  • the number of model vectors and the number of basis vectors will be the same.
  • One of the set of model vectors is selected based on the subject vector.
  • the subject vector is evaluated against the set of model vectors by computing the Euclidean distance to each model vector; the model vector at the shortest distance is selected (a worked sketch of this example is provided below).
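  • The following sketch works the example numbers above through the Euclidean distance calculation; the distances shown in the comments are computed from the stated coordinates.

```python
import math

# Subject vector (0.39, 1.45) against model vectors (+1, +1) and (+1, -1).
subject = (0.39, 1.45)
models = {"Model vector 1": (1.0, 1.0), "Model vector 2": (1.0, -1.0)}

for name, m in models.items():
    d = math.sqrt((subject[0] - m[0]) ** 2 + (subject[1] - m[1]) ** 2)
    print(f"{name}: distance = {d:.2f}")

# Model vector 1 (distance ~0.76) is closer than model vector 2 (~2.52),
# so model vector 1 is selected as the best match.
```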
  • the respective state associated with the selected model vector is also associated with the subject vector and the underlying obtained signal.
  • the state evaluator 312 can implement 408 (Figure 3).
  • For example, if the selected model vector belongs to a class of gait signals, the obtained signal can also be classified as a gait signal.
  • the selected model vector can correspond to a particular body position or posture, in which case the obtained signal is also associated with such position, thus indicating that the monitored subject was in such position at the time the sensor readings were taken.
  • the state evaluator 312 can implement step 410 (Figure 3).
  • the nature of the information can depend on the desired application of the system 100.
  • the information provided can include a notification alert providing information on the motion, requesting prompt action, etc.
  • alerts may be provided immediately after associating a state with the obtained signal.
  • an obtained signal associated with a patient being in a given position for a prolonged period of time may require immediate intervention to prevent one or more pressure ulcers from forming.
  • a notification alert may be generated and sent to a suitable recipient (as described above).
  • an alert may be desired upon determining that a series of obtained signals have been associated with a specific sequence of states.
  • a certain sequence of matching signals may lead to an alert.
  • the same model vector corresponding to a posture state that is matched many times in a row would indicate that the subject has maintained the same posture for a period of time.
  • an alert may be sent to prompt caregivers to move the patient into a new position, thus preventing pressure ulcers.
  • the lack of a match to a model vector or model vectors over a long period of time can indicate that the subject is not performing prescribed exercises (i.e. motions) associated with the model vectors.
  • the subject can then be prompted to perform the exercises through an alert or message.
  • If a correct sequence of exercises (i.e. a correct sequence of matches to model vectors associated with performing the exercise) is detected, a "confirmation" alert can be sent to the subject, indicating that the exercise was done properly. Otherwise, a "try again" alert can be sent if the correct sequence of model vectors was not matched.
  • an alert can be provided when a single model vector is matched, or when a predetermined sequence of model vectors is matched. Similarly, an alert can be provided when a single model vector is not matched, or when a predetermined sequence of model vectors is not matched.
  • an alert is sent to a party, such as a user of a receiver device 106, the user can cancel the alert.
  • An alert may also be cancelled by the monitoring application 114 upon detecting that the condition that resulted in the alert has ended (i.e. the selected model vector or sequence of selected model vectors no longer satisfy the criteria that prompted the alert).
  • the signal sequences forming an obtained signal or a model signal can be filtered or pre-processed using signal processing techniques prior to the operations of 400-410 in Figure 4.
  • the pre-processing may be applied to a subset of the signal sequences forming a signal, and/or to different segments of a signal sequence.
  • Pre-processing techniques may include high-pass filtering to remove a constant offset from the signal, band-stop filtering to remove components related to vibration or breathing, band-pass filtering to enhance measurements of breathing, resampling, or other pre-processing techniques that may be available or that can be incorporated into the system 100, as sketched below.
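  • A minimal sketch of such pre-processing follows, using standard Butterworth filters from SciPy; the cutoff frequencies and the assumed 50 Hz sampling rate are illustrative, not values from the patent.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50.0  # assumed sampling rate in Hz

def high_pass(seq, cutoff_hz=0.05):
    """Remove a constant (or very slow) offset from a signal sequence."""
    b, a = butter(2, cutoff_hz / (FS / 2), btype="highpass")
    return filtfilt(b, a, seq)

def band_stop(seq, low_hz=0.2, high_hz=0.5):
    """Suppress a band of frequencies, e.g. components related to breathing."""
    b, a = butter(2, [low_hz / (FS / 2), high_hz / (FS / 2)], btype="bandstop")
    return filtfilt(b, a, seq)

t = np.arange(0, 20, 1 / FS)
raw = 1.0 + 0.05 * np.sin(2 * np.pi * 0.3 * t)   # offset plus breathing-like component
print(np.round(high_pass(raw).mean(), 3), np.round(band_stop(raw).std(), 3))
```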
  • the number of model signals corresponding to different states may change over time, such as when new states are discovered, resulting in new model signals to be added to the set of model signals.
  • the operations at 402 to 406 of Figure 4 can be repeated to generate a new vector space used to define a new set of model vectors and subject vector.
  • the set of model signals can be generated in various ways.
  • the set of model signals may be pre-loaded with signals representing a sample of subjects representative of the general population of subjects, while in a large number of different states (e.g. in different positions and/or performing a variety of motions).
  • the set of model signals can be generated over time from monitoring a particular individual subject. As the individual subject may have particular characteristics in their movement, which can also change over time, a set of model signals customized to the particular individual subject can enable the set of model signals to adapt to individual variances, thus maintaining a library of data that captures developments in the subject's motion (e.g. an individual patient whose motion improves as a result of therapy).
  • the set of model signals can be generated with supervision (i.e. with human intervention).
  • For example, an expert (such as a doctor or physiotherapist) can inspect a signal, as well as other available data (e.g. video of the subject performing a motion or having a certain posture), in order to identify the state to be associated with a model signal.
  • An interface can be provided to facilitate the input of the expert (e.g. a graphical editor allowing the expert to directly annotate the waveforms).
  • the expert may also be prompted to identify a candidate model signal, as discussed below.
  • the user being monitored can indicate what kind of activity he/she is performing.
  • the set of model signals can be updated using an automated process without supervision (i.e. human intervention is not required).
  • a system may update a set of model signals as follows:
  • a database is maintained of all sensor readings and signals taken from the sensors.
  • the signals may be annotated with characteristics of a signal, such as features of its signal sequences.
  • the signals can be analyzed to identify the subset of signals (referred to herein as the set of "non-matching signals") that do not match any of the current set of model signals.
  • the matching procedure described with respect to Figure 4, or any other matching technique to identify similar signals within a given threshold can be used.
  • the unique signals within the set of non-matching signals can then be identified by evaluating the non-matching signals against one another; a subset S of non-matching signals that match each other forms a set of candidate model signals.
  • In an example, a subset S may be incorporated directly into the set of model signals.
  • an expert or the subject may be prompted to confirm that a candidate model signal may be added to the set of model signals.
  • the computational burden or complexity of calculating a matching score for every possible pair of non-matching signals may be prohibitive. In an example, only a subset of all possible pairs of non-matching signals is considered.
  • the complexity can be mitigated by using a number of techniques, such as by discarding all non-matching signals that do not match with other non-matching signals within a predetermined time (i.e. that do not form any subsets S within a given amount of time), as sketched below.
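  • A minimal sketch of this unsupervised update follows; the similarity threshold, grouping rule and minimum group size are assumptions made for illustration.

```python
import numpy as np

def distance(a, b):
    return np.linalg.norm(np.asarray(a) - np.asarray(b))

def candidate_model_signals(non_matching, threshold=1.0, min_group=3):
    """Group similar non-matching signals; return one representative per group."""
    groups = []                      # each group is a list of mutually similar signals
    for sig in non_matching:
        for group in groups:
            if distance(sig, group[0]) < threshold:
                group.append(sig)
                break
        else:
            groups.append([sig])
    # a group S with enough members yields a candidate model signal (here, its mean)
    return [np.mean(g, axis=0) for g in groups if len(g) >= min_group]

rng = np.random.default_rng(0)
new_motion = np.array([0.0, 1.0, 0.0, -1.0])
non_matching = [new_motion + rng.normal(0, 0.05, 4) for _ in range(5)]
print(len(candidate_model_signals(non_matching)))  # 1 candidate model signal
```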
  • supervised techniques to generate a set of model signals can be used in combination with unsupervised techniques.
  • an expert or user can choose to, or be prompted to, provide supplemental information (e.g. type of activity taking place) on a newly identified candidate model signal.
  • the system 100 may prompt a user or expert to input supplemental information when the automated process is unable to detect or misidentifies the state to be associated with the model signal.
  • the system 100 can constantly track similar kinds of movement or positions recognized by the set of model signals.
  • the set of model signals can be derived from a large number of subjects in the population.
  • Model signals derived from monitoring a subject may be used as model signals to monitor another subject, or may be used to guide the selection of model signals in monitoring the other subject.
  • Collaborative filtering may be used to identify members of the population with similar characteristics and demographics to the subject being monitored, to improve the likelihood that a set of model signals will be applicable to the monitored subject.
  • Obtaining model signals from monitoring an actual subject increases the likelihood that the model signals represent plausible states. For example, the model signals should be plausible in terms of human motion that can be physically achieved, and any signal that is not compatible with human motion should be rejected. By obtaining model signals from monitoring an actual person, the non-compatible motions are automatically avoided. As a result, the monitoring application 114 may match motions consistent with human movement more quickly.
  • a signal in the impossible or improbable region may imply that connected limbs, such as hip and thigh, are moving as though they are unconnected, such as if one is accelerating sharply while the other is stationary.
  • the monitoring application 114 may also check the plausibility of a motion detected in a signal by evaluating the signal against a mathematical model of the biomechanical system (e.g., Hill's muscle model or Pandy's limb model, among others). If any known physical limitations of the human anatomy are violated, the signal may be safely rejected.
  • the subject may include other living beings, such as animals, and non-living subjects, such as robotic subjects, artificial or prosthetic limbs, etc.

Abstract

A system and method for monitoring a subject is provided. The method includes obtaining a plurality of readings from a plurality of sensors for detecting motion of the subject, generating a subject vector using the plurality of readings and selecting one of a set of model vectors based on the subject vector. The set of model vectors correspond to a set of respective states. The set of respective states include any one or more of a position and a motion. The method also includes associating the plurality of readings with the respective state of the selected model vector.

Description

SYSTEM AND METHOD FOR MONITORING A SUBJECT

CROSS REFERENCE TO PRIOR APPLICATIONS
[0001] The present application claims priority under the Paris Convention to US
Application Number 61/806,754, filed March 29, 2013, the entire contents of which are incorporated herein by reference.

TECHNICAL FIELD
[0002] The present invention relates generally to systems and methods for monitoring a subject, and more specifically, systems and methods for monitoring a motion or position of the subject.

BACKGROUND
[0003] The position or motion of a subject, such as a person, may be relevant in a number of applications. For example, some medical conditions can be caused by prolonged compression of a part of a patient's body. An example of such a medical condition includes pressure ulcers or bedsores, where the patient is more prone to develop pressure ulcers on parts of the body in contact for prolonged periods with a supporting surface (e.g. a bed during resting). A technique to prevent or reduce pressure ulcers includes turning patients on a regular schedule, such as moving them from their side to their back or vice versa. U.S. Patent Application Publication No. 2011/0263950 describes a system for monitoring medical conditions including pressure ulcers by detecting the orientation of a patient. The orientation is determined by calculating the angle of tilt of a patient using readings from an accelerometer attached to the user. However, it may be difficult to calculate an orientation of a patient involving specific placement of multiple body parts, or to calculate complex or arbitrary motions of a patient.
[0004] In another example, U.S. Patent No. 7,648,441 describes a gait therapy device that detects differences between a user's left step and right step using a gait sensor.
However, the gait therapy device is directed to the limited motion of steps and detecting coarse characteristics associated with a user's step.
[0005] In another example, U.S. Patent No. 8,036,842 describes a system for classifying body motion of a user by mounting a three-axis accelerometer on the torso of a user.
Sensor information can be classified based on recognizing patterns in the sensor information and comparing them to known patterns. U.S. Patent No. 7,773,224 describes a system for identifying a motion type by matching an acceleration signature, generated from a motion sensor worn on the user, to a stored acceleration signature corresponding to a particular motion type. The pattern recognition or matching techniques described may be complex (such as requiring generation and use of a hidden Markov Model to model human motion) and/or involve complex computations (such as Fast Fourier Transforms) that increase the processing power demands of the system.
[0006] It is an object of the present invention to overcome or mitigate at least one of the above disadvantages. In particular, one object of the present invention is to provide a computer based method for monitoring the motion or position of a subject, which overcomes at least one of the problems associated with known computer based motion monitoring systems and methods.
SUMMARY OF THE INVENTION
[0007] In an aspect, there is provided a method of monitoring a subject. The method includes obtaining a plurality of readings from a plurality of sensors for detecting motion of the subject, generating a subject vector using the plurality of readings and selecting one of a set of model vectors based on the subject vector. The set of model vectors correspond to a set of respective states. The set of respective states include any one or more of a position and a motion. The method also includes associating the plurality of readings with the respective state of the selected model vector.
[0008] In another aspect, there is provided a computer readable medium that includes computer executable instructions that when executed by a processor, cause the processor to obtain a plurality of readings from a plurality of sensors for detecting motion of the subject, generate a subject vector using the plurality of readings and select one of a set of model vectors based on the subject vector. The set of model vectors correspond to a set of respective states. The set of respective states include any one or more of a position and a motion. The computer readable medium also includes computer executable instructions that when executed by the processor, cause the processor to associate the plurality of readings with the respective state of the selected model vector.
[0009] In another aspect, there is provided a system for monitoring a subject. The system includes a plurality of sensors for detecting motion of the subject, and a processor coupled to memory and the plurality of sensors. The memory stores computer executable instructions that when executed by the processor cause the processor to obtain a plurality of readings from the plurality of sensors, generate a subject vector using the plurality of readings and select one of a set of model vectors based on the subject vector. The set of model vectors correspond to a set of respective states. The set of respective states include any one or more of a position and a motion. The memory also stores computer executable instructions that when executed by the processor, cause the processor to associate the plurality of readings with the respective state of the selected model vector.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Embodiments of the invention will now be described by way of example only with reference to the accompanying drawings in which:
[0011] Figure 1 is a block diagram of an example configuration of a system for monitoring a subject.
[0012] Figure 2 is a perspective view of a portion of an example subject with sensors attached thereon.
[0013] Figure 3 is a block diagram of an example configuration of a monitoring application.
[0014] Figure 4 is a flow diagram of example computer executable instructions for monitoring a subject.
[0015] Figure 5 is a flow diagram of example computer executable instructions for signal alignment.
[0016] Figure 6 is a plot of an example obtained signal.
[0017] Figure 7 is a plot of example posture signals extracted from an obtained signal.
[0018] Figure 8 is a plot of an example model signal.
[0019] Figure 9 is a plot of another example obtained signal.
[0020] Figure 10 is a plot of example gait signals extracted from an obtained signal.
DETAILED DESCRIPTION OF THE INVENTION
[0021] It will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate
corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the example embodiments described herein. However, it will be understood by those of ordinary skill in the art that the example embodiments described herein may be practised without these specific details.
[0022] In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the example embodiments described herein. Also, the description is not to be considered as limiting the scope of the example embodiments described herein.
[0023] Referring to Figure 1, a block diagram of an example configuration of a system 100 for monitoring a subject, such as the motion of a person, is provided. The system 100 includes one or more sensors 102 for detecting motion of a subject, a server 104 for receiving and processing sensor readings obtained from the sensors 102 and one or more receiver devices 106 for communicating with the server 104, such as receiving alerts or other messages from the server 104.
[0024] The one or more sensors 102 can detect motion of a subject, such as human movement. For example, the sensor 102 may include an accelerometer used to measure proper acceleration (i.e. the type of acceleration associated with the phenomenon of weight experienced by a test mass residing in the frame of reference of the accelerometer) and thus measures weight per unit of mass, a quantity also known as specific force or "g-force". The accelerometer may comprise a single axis (i.e. 1-axis) or multiple axis (e.g. 2-axis or 3-axis) accelerometer, or a combination of accelerometers with a lower number of axes to form an accelerometer with a higher number of axes. By obtaining and analyzing readings from one or more accelerometers attached to a part of a subject (e.g. human arm), the position, orientation and/or motion of that part can be determined.
[0025] It will be appreciated that the sensors 102 may include an accelerometer to measure proper acceleration, a gyroscope to measure angular velocity, a magnetometer to measure magnetic field, a contact switch or pressure sensor to determine whether a particular part is in contact with a surface (e.g. the ground), or any other sensor whose readings can be used to determine a state of the subject, including its position, orientation, relative position with respect to other parts of the subject, path of motion, velocity, and other properties associated therewith. It will also be appreciated that a sensor 102 may obtain readings on one or more parameters that can be used to determine directly or indirectly a state of the subject, and/or that a plurality of sensors 102 can be used in combination to obtain readings on all the desired parameters. [0026] The sensors 102 are preferably linked to a communication subsystem 108 capable of receiving data (such as instructions or queries) from, and sending data (such as sensor readings) to, the server 104 via a communication link. The communication link can be established according to Bluetooth, Zigbee or other standardized or custom wireless communication protocols, or the communication link can be a wired connection. The communication subsystem 108 may be configured to communicate with the server 104 via a network accessible through the communication link, such as a public network (e.g. the Internet) or a private network (e.g. intranet, local area network, private area network, USB hub, etc.). It will be appreciated that the communication subsystem 108 may incorporate or connect to additional components (e.g. an access point or bridge to connect to a wireless network) that may be needed in order to establish a communication link or access a communication network.
[0027] The server 104 can obtain, store and process readings from the sensors 102. The server 104 includes a processor 110 and memory 112. The memory 112 stores, among other data, one or more software applications that can be executed by the processor 110. The server 104 also includes a communication subsystem 108 to send and receive data from the sensors 102, as discussed above, and to send and receive data from one or more receiver devices 106. The memory 112 stores a monitoring application 114 executed by the processor 110.
[0028] In an example, the monitoring application 114 may maintain a database of sensor readings using database software such as MySQL, with additional software to interface between the database and the other components of the system 100. For example, a person may desire prompt notification of significant events, such as when certain kinds of activity occur or have not occurred. Events may be indicated by monitoring, analyzing and processing the sensor readings obtained from monitoring a subject. The monitoring application 114 may include an alert component for generating a notification, such as to a monitoring station, a third-party or the monitored subject, when an event of interest has been determined to have occurred. The notification can be provided in real-time immediately after the event has been detected, or within a specified amount of time after the event occurs. It will also be understood that notifications can be stored in a database, along with other identifiers such as date/time values and sensor identification etc. Various other types of information based on the detected event can also be provided and different types of information can be given different priority and/or prominence when displayed in the notification. For example, real-time notification alerts can be given the highest priority and most prominent display, whereas summaries of historical data may be given lower priority and/or less prominent display.
[0029] It will be appreciated that the processor 110 can include one or more processing units capable of performing the operations described herein, such as central processing units, field programmable gate arrays, graphics processing units, microprocessors, digital signal processors, etc. The memory 112 can include random access memory, flash memory, read only memory, and other known storage elements. It will also be appreciated that other components can be included in the server 104 that are well known to those skilled in the art.
[0030] The receiver devices 106 can receive and/or request data from the server 104. Each receiver device 106 includes a communication subsystem 108 to send and receive data from the server 104 in a similar manner as discussed above, such as through a wireless network. In an example, the data from a server 104 can be provided to a receiver device 106 through a web page accessible by the receiver device 106, using a dedicated application loaded on the receiver device 106 (e.g. a mobile application), by messaging (e.g. email, text message) the receiver device 106, by manual or automated phone call to the receiver device 106, or any other means of communication supported by the receiver device 106. In an example, the receiver devices 106 can include electronic devices having a display to display information regarding the sensor readings. Example receiver devices 106 can include cellular phones, smart-phones, tablet computers, wireless organizers, personal digital assistants, desktop computers, laptops, handheld wireless communication devices, wirelessly enabled notebook computers, portable gaming devices, and the like.
[0031] Referring to Figure 2, a front perspective diagram of an example portion of a subject having sensors 102 attached thereon, is provided. In this example, a plurality of sensors 102 is attached to numerous parts of the right leg 202 of a user 200. In the example, sensor readings from sensors 102 can be obtained as the user 200 takes walking steps to analyze the movement of the user's right foot, ankle, knee, and hip for gait analysis. In some cases, only a subset of such sensors may be needed. It can be appreciated that placement of the sensors 102 at different parts of the user 200 (e.g. arms, back, head, neck, etc.) can be used to monitor and analyze motion of such parts. In an example, the obtained sensor readings can be processed and analyzed for a wide range of applications related to the study of kinesiology and human movement, such as gait analysis, sports training, physical therapy, design and evaluation of prosthetic limbs, etc. [0032] The sensors 102 can be attached to a subject, such as a user 200, by any means, such as by using an adhesive or an elastic or tensor bandage. In another example, the sensors 102 can be embedded in or attached to clothing, preferably form-fitting clothing, worn by the user 200. In other applications that require less precise motion monitoring, the sensors 102 can be embedded in loosely worn clothing or housed within a pendant, wristwatch or other accessory worn by the user. An example of a wearable wireless sensor 102 is the connectBlue® cB-OLP425i-26 device, having Bluetooth wireless communication capabilities, and accelerometer sensors. It will be appreciated that for artificial body parts, the sensors 102 can be incorporated into or added onto the construction of the part, as is the case for robotic arms or prosthetic limbs, for example.
[0033] For the sake of clarity, the following definitions will be used herein. The collection of sensor readings obtained from the sensors 102 for a subject will be referred to herein as a "signal" or the "obtained signal". The sensor readings obtained from sensors 102 can be sampled and stored as sequences of finite-precision sampled data. A signal may include more than one sequence of sensor readings as a sensor 102 may take multiple readings (e.g. 3-axis accelerometer can measure 3 sequences of measurements to represent the signal, one sequence of measurements for the x, y and z axes of the accelerometer, respectively) and/or multiple sensors 102 can be used to monitor a subject (see Figure 2). Each sequence of measurements forming a signal will be referred to herein as a "signal sequence".
[0034] The collection of model sensor readings corresponding to a known state of a subject (or a representative subject to which the observed subject is to be evaluated against) will be referred to herein as a "model signal". The model signals for all the known states will be referred to as the "set of model signals".
[0035] A signal "class" will be referred herein as a set of signals that share underlying characteristics, which can be recognized as belonging to the same set. For example, if an accelerometer sensor is attached to the ankle, and the wearer of the sensor walks a certain distance, the signal generated during each step can be considered part of a class of "gait signals". In another example, for a stationary state, the posture during a stationary period can be considered part of a class of "posture signals".
[0036] A signal "feature" is a particular property of a class of signals that is common to all signals in that class. For example, gait signals have clearly identifiable features such as "heel strike" (the time at which the heel touches the ground) and "toe off" (the time when the toe leaves the ground). These features are consistent from stride to stride and can be identified by examining the signal. As another example, posture signals have a clear feature when the subject changes from one posture to another. Another feature of a posture signal is that the signal sequences are substantially constant.
[0037] Referring to Figure 3, an example configuration of a monitoring application 114 is provided. The monitoring application 114 can be a software application stored in the memory 112 of the server 104. The monitoring application 114 can request and receive sensor readings from one or more of the sensors 102, and provide information regarding such sensor readings to one or more receiver devices 106.
[0038] The monitoring application 114 preferably evaluates sensor readings obtained by sensors 102 (i.e. the obtained signal) against a set of model signals corresponding to a set of known states of the subject (or a representative subject to which the subject is to be evaluated against). In an example, the monitoring application 114 generates a model vector in a common vector space, for each of the model signals. The monitoring application 114 also generates a subject vector in the same vector space for the obtained signal so that the subject vector can be directly compared and analyzed against the set of model vectors. The difference between the subject vector and a model vector can be indicative of the amount of similarity in the underlying obtained signal and model signal. In an example, the monitoring application 114 selects the best "match" for the obtained signal among the set of model signals by matching the subject vector with a model vector. Once a match is determined, the obtained signal can be associated with information of the selected model signal, such as its class or other characteristics.
[0039] The monitoring application 114 includes a subject vector generator 304 for generating a subject vector using the obtained signal. The monitoring application 114 also includes a model vector selector 308 for selecting one of the model vectors based on the subject vector. The model vectors can correspond to a set of respective states, such as the orientation or position of the subject or a representative subject, or a motion performed by the subject or a representative subject. The monitoring application 114 also includes a state evaluator 312 for evaluating the respective state associated with the selected model vector, associating the state of the selected model vector to the obtained signal, and providing information regarding such state to one or more receiver devices 106.
[0040] The monitoring application 114 optionally includes an alignment module 300 for aligning the obtained signal with one or more model signals in the model readings storage 302, prior to generating the subject vector. The monitoring application 114 also optionally includes a vector space generator 306 for generating a vector space based on the set of model signals stored in the model readings storage 302.
[0041] As mentioned above, the subject vector generator 304 generates a subject vector using the obtained signal. As indicated in Figure 2, multiple sensors 102 can be used to monitor the subject, and thus, the obtained signal for a single motion performed by the subject can include sensor readings from multiple sensors 102. The subject vector generator 304 combines the sensor readings from multiple sensors 102 to create a single subject vector that can be compared and evaluated against model vectors.
[0042] The model vector selector 308 selects one of the model vectors from the set of model vectors stored in the model vector storage 310, based on the subject vector. The set of model vectors may correspond to different states of the subject that is being monitored, or different states of a representative sample of subjects that the monitored subject is to be evaluated against. For example, in Figure 2, the sensors 102 may be used to monitor the gait pattern of the user 200 such that a subject vector generated from the obtained signal represents a gait cycle of the user 200. In this example, the set of model vectors can correspond to the gait patterns associated with healthy persons, as well as persons with different degrees of paralysis, injuries and other medical conditions that have been identified as affecting gait patterns.
[0043] In an example, the model vector selector 308 selects the model vector that most closely matches the subject vector, such as the model vector at the shortest distance to the subject vector in the vector space. The selected model vector can be indicative that the monitored subject is in the same state associated with the selected model vector. For example, if the subject vector matches a model vector that corresponds to a gait pattern of a person having the condition of pes cavus (i.e. a high arch foot), then it can be determined that the user 200 also has the same condition. The set of model vectors may be further differentiated by the types of pes cavus, and/or the extent that pes cavus is present, thus enabling the selected model vector to provide further detailed information on the state of the user 200.
[0044] The state evaluator 312 can evaluate the state associated with the selected model vector, and provide information regarding the state to a receiver device 106. For example, the state evaluator 312 may determine that the state associated with the selected model vector is a gait pattern for a person having the condition of pes cavus, and thus email a diagnostic report to a medical professional and/or the user 200.
[0045] The monitoring application 114 can optionally include an alignment module 300 for aligning sensor readings obtained from the sensors 102 with one or more model sensor readings. The alignment module 300 may also extract a subset of the sequence of readings in the obtained signal. This may be necessary in cases such as where the sensor readings capture multiple instances of the same state, such as a repeated motion performed by the user 200 (e.g. multiple gait cycles), while the model sensor readings from which the model vectors are generated may correspond to a single instance of a state (e.g. single gait cycle). As discussed below, a number of techniques may be used to align and extract the required window of sensor readings so that a comparison of the subject vector and set of model vectors can be performed. It will be appreciated that if the sensor readings are already aligned with underlying model signals, the alignment module 300 may not be needed.
[0046] The monitoring application 114 can optionally include a vector space generator 306 for generating a vector space that can be used to define the model vectors, as well as the subject vector, so that the vectors have a common basis to be compared and analyzed. As discussed below, a number of techniques may be used to generate the vector space so that a comparison of the subject vector and set of model vectors can be performed. It will be appreciated that if the subject vector and the set of model vectors are already defined in a common vector space, the vector space generator 306 may not be needed.
[0047] It will be appreciated that any module, subsystem or component exemplified herein that executes instructions or operations may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data, except transitory propagating signals per se. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the system 100 or accessible or connectable thereto. Any application or module herein described may be implemented using computer readable/executable instructions or operations that may be stored or otherwise held by such computer readable media.
[0048] Referring to Figure 4, an example set of computer executable instructions is provided for monitoring a subject, such as user 200. At 400, a plurality of sensor readings is obtained (i.e. the obtained signal) from a plurality of sensors 102 for detecting motion of a subject. At 402, the plurality of readings is optionally aligned with at least one of a set of model plurality of readings (i.e. the set of model signals). This may improve the signal correlation that is performed subsequently as part of 406. At 404, a subject vector is generated using the plurality of readings. At 406, a model vector is selected based on the subject vector. The selected model vector belongs to a set of model vectors corresponding to a set of respective states. The set of respective states includes one or more positions and/or motions. The positions and motions can include previous positions and motions of the subject, and/or positions and motions from a representative sample that the subject is to be evaluated against. At 408, the plurality of readings is associated with the respective state of the selected model vector. At 410, information based on the respective state associated with the plurality of readings is provided, such as a notification alert.
[0049] As noted above, at 400, a plurality of sensor readings (i.e. the obtained signal) is obtained from a plurality of sensors 102 for detecting motion of a subject. In an example configuration of the monitoring application 114, the sensors 102, in conjunction with the communication subsystem 108 of the server 104, can implement 400 (Figure 3).
[0050] As noted above, at 402, the obtained signal is optionally aligned with at least one of the set of model signals. In an example configuration of the monitoring application 114, the alignment module 300 in conjunction with the model readings storage 302 can implement 402 (Figure 3).
[0051] The timing of a series of human movements is subject to natural time distortion, even when the human subject attempts to repeat a movement exactly. For example, in the walking motion, the interval between each footfall is slightly different and features within each step, such as the heel striking the ground or the toe leaving the ground, may vary in time within each step.
[0052] As a result, the peaks and other features of the obtained signal are never perfectly repeatable. Due to the inherent variation in human movement and variability in timing as to when the sensors 102 are configured to start and stop obtaining readings, the obtained signal may not always be aligned with the model signals, even when of the same class or performed by the same subject. If the signal sequences of the obtained signal and the model signals are not the same length, and/or the features between the model signals and the obtained signals are not aligned, alignment of the obtained signal to one or more of the set of model signals in the same class can be performed to reduce or remove the time distortion, and improve the quality of the subsequent comparison and matching that is to be performed at 406.
[0053] Signal alignment of two signals, such as the obtained signal and a model signal, can be performed by identifying features between the two signals. In an example, signal alignment may be done heuristically in accordance with a particular feature rule that can be identified, such as based on identifying local maxima/minima of the signals or abrupt transitions of the signals. This may be done manually by a user observing the signal, or automatically with signal processing techniques described herein or known in the art.
[0054] Referring to Figure 5, an example set of computer executable instructions is provided for performing signal alignment of an obtained signal to a model signal. At 500, a "key" signal sequence in the obtained signal is identified for use as the basis for the alignment. At 502, features from the signal sequence of the model signal corresponding to the key signal sequence of the obtained signal are identified. At 504, corresponding features in the key signal sequence of the obtained signal are identified. At 506, the key signal sequence in the obtained signal is resampled. At 508, the resampling operation of 506 is repeated for all other signal sequences in the obtained signal.
[0055] As noted above, at 500, a "key" signal sequence in the obtained signal is identified to extract features from, and use as the basis for the alignment. The key signal sequence can be the signal sequence with the strongest signal, highest signal-to-noise ratio, or satisfying some other predetermined criteria. For example, in a gait signal, the signal sequence along an axis of the accelerometer may be closely aligned with the direction of motion such that the signal sequence is suitable for use as the key signal sequence. In another example, the key signal sequence can be determined based on the type of sensor used to obtain the signal sequence. For example, in a gait signal that is obtained from accelerometer sensors as well as a heel strike switch or pressure sensor, the signal sequence obtained from the heel strike switch may be the least ambiguous and can be used as the key signal sequence. [0056] As noted above, at 502, features from the corresponding key signal sequence of the model signal are identified. In an example, the model signals may include additional information such as annotation of its features, and these annotated features may be stored in a database. Providing annotated feature information can reduce the signal processing operations performed during alignment. Features can include local maxima/minima, abrupt transitions, and other detectable characteristics.
[0057] As noted above, at 504, corresponding features in the key signal sequence of the obtained signal are identified. It will be appreciated that due to noise or individual variation, there may be a different number of features of different types between the key signal sequences of the model signal and the obtained signal. In an example, excess features can be discarded. The beginning and the end of a signal sequence can also be considered to be features. The features in the model signal are enumerated, and the number of samples between each pair of features is recorded.
[0058] As noted above, at 506, the key signal sequence in the obtained signal is resampled between corresponding pairs of features to ensure that the number of samples between feature pairs is the same as in the corresponding key signal sequence of the model signal.
[0059] As noted above, at 508, the resampling operation of 506 is repeated for all other signal sequences as performed for the key signal sequence such that all the signal sequences of the obtained signal are aligned in the same manner.
[0060] Once the alignment is performed, the model signals and the obtained signal have the same length and correspondence of samples, and the obtained signal can be used to generate a subject vector to be analyzed and evaluated against the set of model vectors.
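To make the resampling in 506 and 508 concrete, the following is a minimal sketch in Python with NumPy of piecewise linear resampling between corresponding feature indices. The function name, the use of np.interp, and the convention that the feature lists include the first and last sample of each sequence are illustrative assumptions rather than part of the disclosure.

    import numpy as np

    def resample_between_features(sequence, obtained_features, model_features):
        """Resample one signal sequence so that the number of samples between each
        pair of corresponding features matches the model signal.

        obtained_features / model_features: increasing sample indices of corresponding
        features, including the first and last sample of each sequence. The returned
        array has the same number of samples as the model sequence.
        """
        sequence = np.asarray(sequence, dtype=float)
        pieces = []
        for (a0, a1), (b0, b1) in zip(zip(obtained_features, obtained_features[1:]),
                                      zip(model_features, model_features[1:])):
            n = b1 - b0  # samples this segment occupies in the model signal
            # Linearly interpolate the obtained segment onto n evenly spaced points.
            x_new = np.linspace(a0, a1, n, endpoint=False)
            x_old = np.arange(a0, a1 + 1)
            pieces.append(np.interp(x_new, x_old, sequence[a0:a1 + 1]))
        pieces.append(sequence[-1:])  # keep the final feature sample so lengths match exactly
        return np.concatenate(pieces)

The same call is repeated for every other signal sequence of the obtained signal, using the feature indices found on the key signal sequence, so that all sequences are aligned in the same manner.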
[0061] In Figures 6-8, an example of the alignment process described above is illustrated with an example obtained signal 600 comprising posture signals 602, 604, 606. For posture signals, the transition between two postures is a recurring pattern. For other types of movement, the signals may be analyzed for recurring patterns.
[0062] A posture signal is characterized by a static signal (when the subject is stationary), and abrupt transitions (when the subject is moving into a new posture). Figure 6 illustrates an obtained signal 600 comprising three posture signals 602, 604, 606 from a sleeping subject who is wearing a three-axis accelerometer. The obtained signal 600 includes two transitions T1, T2 between posture signals 602 and 604, and between posture signals 604 and 606. Transitions from one posture to another are detectable by using known signal processing techniques.
[0063] At a change in posture, the signal sequences 610, 612, 614 of the obtained signal 600 corresponding to the x, y and z axes of the accelerometer, respectively, change values significantly. Moreover, the postures last over significantly different periods of time. The posture changes represent trigger events, separating from one another the posture signals 602, 604, 606 contained in the obtained signal 600. Using the trigger events, the obtained signal 600 can be separated into three portions representing different posture signals 602, 604, 606 (see Figure 7). The portions of the obtained signal 600 containing the trigger events have been removed from the extracted posture signals 602, 604, 606.
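One illustrative way to detect the trigger events is to flag samples where the sample-to-sample change on any axis exceeds a threshold, as in the Python/NumPy sketch below; the threshold and minimum segment length are arbitrary example values, not values taken from the disclosure.

    import numpy as np

    def split_on_transitions(signal, threshold=0.5, min_length=50):
        """Split a multi-axis signal (shape: samples x axes) into posture segments.

        A trigger event is declared wherever the absolute sample-to-sample change on
        any axis exceeds `threshold`; segments shorter than `min_length` samples are
        discarded, since they mostly contain the transition itself.
        """
        signal = np.asarray(signal, dtype=float)
        diffs = np.abs(np.diff(signal, axis=0)).max(axis=1)
        triggers = np.flatnonzero(diffs > threshold)
        boundaries = [0] + list(triggers + 1) + [len(signal)]
        segments = []
        for start, stop in zip(boundaries[:-1], boundaries[1:]):
            if stop - start >= min_length:
                segments.append(signal[start:stop])
        return segments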
[0064] Referring to Figure 8, an example model signal 800 is displayed. The model signal 800 includes the signal sequences 802, 804, 806 for the x, y and z axes of the accelerometer, respectively, corresponding to the state when the subject is lying perfectly still on his/her back. In this example, the duration of the model signal 800 in Figure 8 is 100 seconds, or the equivalent number of samples. However, the posture signals 602, 604, 606 have different lengths. In order to improve the quality of the matching process at 406, the number of samples must be exactly the same between the model signal 800 and each of the posture signals 602, 604, 606. Further, the samples between the signals must be in exact correspondence (e.g. the fifth sample in the model signal 800 must correspond to the same part and time of each of the fifth samples of the posture signals 602, 604, 606).
[0065] Therefore, posture signals 602, 604, 606 in the obtained signal 600 are resampled so that they have the correct number (i.e. same number of samples as the model signal 800) and correspondence in the samples. This can be achieved by performing linear interpolation (e.g. add new samples in between two posture signal samples by taking a linear combination of the two samples) or any other known signal processing technique. In the example of Figures 6-8, the result of alignment will be three posture signals, each corresponding to a different posture of the subject, that can be evaluated individually against the model signal 800.
[0066] As shown in Figures 6 and 7, the obtained signal 600, and resulting posture signals 602, 604, 606 may include small variations, even when the subject may be maintaining a static position. Variations can be caused by fine movements of the subject, such as breathing or fidgeting of the subject while sleeping. In an example, the obtained signal 600 can be pre-filtered to remove one or more of such variations, such as a variation relating to breathing, if such variation(s) is (are) not of interest. Removal of variations can be accomplished in a number of known manners, such as by using a low-pass filter. If, on the other hand, a variation such as breathing is to be monitored, in order to monitor the desired state of the subject, a high-pass filter or band-pass filter may be applied to the obtained signal 600 to remove the posture signal components.
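As a hedged illustration, a breathing-related variation could be suppressed with a low-pass filter such as the Butterworth design below (Python with SciPy); the sampling rate and cutoff frequency are placeholder values, and a high-pass or band-pass design would be used instead if the fine movements themselves were of interest.

    from scipy.signal import butter, filtfilt

    def remove_fine_movements(sequence, sample_rate_hz=50.0, cutoff_hz=0.2):
        # 2nd-order Butterworth low-pass; filtfilt applies it forwards and backwards
        # so the filtered sequence stays time-aligned with the original samples.
        b, a = butter(2, cutoff_hz / (sample_rate_hz / 2.0), btype="low")
        return filtfilt(b, a, sequence)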
[0067] In another example, the variations may be incorporated into a model signal corresponding to a subject sleeping on his/her back with an amount of fidgeting to indicate poor sleep quality. The extent of variation can be represented by different model signals to correspond to different degrees and types of fidgeting.
[0068] In Figures 9 and 10, the example alignment process described above is illustrated with an example gait signal. As mentioned above, for gait signals, the start of each step (e.g., heel strike) and features within each step (e.g. toe off) can serve as recurring patterns on which to base alignment.
[0069] The obtained signal can be partitioned into portions using trigger events in a similar manner as described in the example of extracting posture signals. However, within the gait signal there may be other key events that are used to align the features of the signals. Therefore, resampling may occur in two or more different ways within the same gait signal.
[0070] Figure 9 illustrates an obtained signal 900 comprising two and a half gait cycles. For clarity, in this example, the obtained signal 900 of the subject's gait is detected by only a single accelerometer placed on the lower leg, a heel contact sensor (to detect heel strike) and a toe contact sensor (to detect toe off).
[0071] The monitoring application 114 can, in this example, identify the initiation of a heel contact as the trigger event to separate the obtained signal 900 into individual gait signals 902, 904 of a single gait cycle in length (Figure 9). In an example, the monitoring application 114 can disregard portions of a gait signal that do not include a full gait cycle, such as partial gait signal 906.
[0072] The model signals (not shown) corresponding to gait signals also include measurements for an accelerometer, heel contact sensor, and toe contact sensor. The monitoring application 114 aligns the features of the gait signals 902, 904 to the model signals. For example, the features to be aligned in the gait signals 902, 904 can include the initial heel contact A to the subsequent heel contact A (i.e. the entire gait cycle). [0073] In another example, a feature to be aligned can also include the point of toe off B. In this example, the monitoring application 114 may perform resampling independently for two portions of the gait cycle: (i) initial heel contact A to toe off B; and (ii) toe off B to next heel contact A. The resulting resampled gait signal will have the same number of samples and alignment of the identified features.
[0074] In another example, further alignment using additional features can be performed by identifying four regions to align independently, such as (i) initial heel contact A to toe off B; (ii) toe off B to next heel contact A; (iii) initial heel contact A to toe contact C; and (iv) heel off D to toe off B.
[0075] In another example, only the accelerometer signal sequences of Figures 9 and 10 may be included in the obtained signal and model signals. It will be appreciated that the acceleration measurements alone may have distinguishing features that can be identified, such as the change in the measurements from heel contact and toe off, for example.
[0076] Therefore, it can be seen that the obtained signal or one or more portions thereof, can be aligned with one or more model signals to improve subsequent matching of the subject vector and model vectors that are derived from the obtained signal and model signals, respectively.
[0077] Referring back to Figure 4, at 404, a subject vector is generated using the obtained signal. In an example configuration of the monitoring application 114, the subject vector generator 304 and vector space generator 306 can implement 404 (Figure 3).
[0078] The subject vector is generated using the obtained signal by mapping the obtained signal to the same vector space as the set of model signals. In an example, the subject vector can be generated according to the following:
a. A vector space is generated using the set of model signals. If there are m model signals, we form m corresponding orthonormal basis vectors using the Gram-Schmidt procedure or other known orthonormalization technique.
b. Each model signal is then mapped into the vector space. Let v_j = [v_j[1], v_j[2], ..., v_j[k]] represent one of the m orthonormal basis vectors from the orthonormalization and let p = [p[1], p[2], ..., p[k]] represent a model signal. If a model signal includes more than one signal sequence, all the signal sequences of a model signal can be combined to form p, such as by concatenation according to a predetermined order. The j-th "coordinate" of the model vector in the vector space, c[j], can be computed by taking the vector dot product:
c[j] = v_j[1]·p[1] + v_j[2]·p[2] + ... + v_j[k]·p[k]
The resulting model vector has m coordinates in the vector space: c[1], c[2], ..., c[m]. The set of m coordinates is found for every known model signal and stored as a separate model vector.
c. The subject vector of m coordinates can be generated in the vector space by repeating step b using the obtained signal instead of the model signal. Let s[1], s[2], ..., s[m] represent the coordinates of the subject vector.
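A minimal sketch of the three steps above, written in Python with NumPy, is given below. It assumes the model signals and the obtained signal have already been aligned to a common length (for example by the procedure of Figure 5) and that the model signals are linearly independent; the function names are illustrative only.

    import numpy as np

    def gram_schmidt(model_signals):
        """Form one orthonormal basis vector per (linearly independent) model signal."""
        basis = []
        for p in model_signals:
            v = np.asarray(p, dtype=float)
            for b in basis:
                v = v - np.dot(v, b) * b  # remove the component along each earlier basis vector
            basis.append(v / np.linalg.norm(v))
        return basis

    def coordinates(signal, basis):
        """Map a signal into the vector space: c[j] is its dot product with basis vector j."""
        signal = np.asarray(signal, dtype=float)
        return np.array([np.dot(b, signal) for b in basis])

    # Model vectors (step b) and the subject vector (step c) share the same basis:
    # basis = gram_schmidt(model_signals)
    # model_vectors = [coordinates(p, basis) for p in model_signals]
    # subject_vector = coordinates(obtained_signal, basis)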
[0079] It will be appreciated that if the number of model signals is substantially less than the length of each model signal, the number of coordinates in the resulting model vectors and subject vector will be substantially less than the length of the corresponding model signals and obtained signal. As a result, orthonormalization can reduce the number of computations performed at 406 to select a model vector, by operating on data (i.e. the model vectors and subject vector) of shorter length than would be required to match the obtained signal directly against the model signals.
[0080] As noted above, at 406, a model vector is selected based on the subject vector. In an example configuration of the monitoring application 114, the model vector selector 308 can implement 406 (Figure 3). In an example, the Euclidean distance between the subject vector having coordinates s[1], s[2], ..., s[m] and each of the model vectors having respective coordinates c[1], c[2], ..., c[m] can be calculated by:
d = sqrt((s[1] − c[1])² + (s[2] − c[2])² + ... + (s[m] − c[m])²)
[0081] In an example, the model vector can be selected as the best match using one of the following rules:
a. Closest match: Select the model vector with the smallest distance d to the subject vector. If more than one model vector is at that smallest distance d, select one of these model vectors at random.
b. Closest match, with maximum: Define a maximum distance D. Find the model vector with the smallest distance d. If that smallest distance is less than D, then select the closest model vector. If the smallest d is greater than the maximum distance D, then no model vectors are selected.
[0082] If a model vector is selected, the subject is determined to have a state associated with the selected model vector, such as the subject having experienced a motion that resulted in the model signal. If no model vectors are selected, the state of the subject can be determined to be of unknown type.
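The two selection rules can be sketched as follows (Python/NumPy); max_distance stands for the maximum distance D, the function returns the index of the selected model vector, and returning None represents the "no model vector selected" outcome.

    import numpy as np

    def select_model_vector(subject_vector, model_vectors, max_distance=None):
        distances = [float(np.linalg.norm(np.asarray(subject_vector) - np.asarray(v)))
                     for v in model_vectors]
        d_min = min(distances)
        if max_distance is not None and d_min > max_distance:
            return None  # rule (b): the smallest distance exceeds the maximum D
        # rule (a): pick one of the closest model vectors (at random in case of a tie)
        closest = [i for i, d in enumerate(distances) if d == d_min]
        return int(np.random.choice(closest))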
[0083] The example matching process described above will be illustrated with the following example. For clarity, the example only considers two sensors and sensor readings of 4 samples in length. Sensor 1 and sensor 2 obtain the following signal sequences while monitoring a subject:
Sensor 1: [+1.0, +0.5, -0.1, -0.3]
Sensor 2: [+0.3, +0.8, -0.4, -0.7]
[0084] To generate the obtained signal, the signal sequence of sensor 2 is concatenated to the end of the signal sequence of sensor 1:
Obtained signal: [+1.0, +0.5, -0.1, -0.3, +0.3, +0.8, -0.4, -0.7]
[0085] In this example, the set of model signals includes two signals, and thus two basis vectors are obtained from orthonormalizing the set of model signals. The basis vectors are normalized to be unit vectors as:
Basis vector 1: [+1.0, +1.0, +1.0, +1.0, +1.0, +1.0, +1.0, +1.0]/sqrt(8)
Basis vector 2: [+1.0, +1.0, -1.0, -1.0, +1.0, +1.0, -1.0, -1.0]/sqrt(8)
[0086] The coordinates of the subject vector are then generated using the obtained signal by taking the dot product between it and the two basis vectors:
Coordinate 1 :
[+1.0, +0.5, -0.1, -0.3, +0.3, +0.8, -0.4, -0.7]
· [+1.0, +1.0, +1.0, +1.0, +1.0, +1.0, +1.0, +1.0]/sqrt(8)
= (+1.0 + 0.5 - 0.1 - 0.3 + 0.3 + 0.8 - 0.4 - 0.7)/sqrt(8)
= 1.1/sqrt(8)
= 0.39
Coordinate 2:
[+1.0, +0.5, -0.1, -0.3, +0.3, +0.8, -0.4, -0.7] · [+1.0, +1.0, -1.0, -1.0, +1.0, +1.0, -1.0, -1.0]/sqrt(8)
= (+1.0 + 0.5 + 0.1 + 0.3 + 0.3 + 0.8 + 0.4 + 0.7)/sqrt(8) = 4.1/sqrt(8)
= 1.45
[0087] The subject vector in the vector space defined by the basis vectors is (0.39, 1.45). In this example, the set of model vectors includes:
Model vector 1: (+1, +1)
Model vector 2: (+1, -1)
Typically, the number of model vectors and the number of basis vectors will be the same.
[0088] One of the set of model vectors is selected based on the subject vector. In this example, the subject vector is evaluated against the set of model vectors by computing the Euclidean distance:
d(subject vector, model vector 1) = sqrt((0.39 − 1)² + (1.45 − 1)²) = 0.76
d(subject vector, model vector 2) = sqrt((0.39 − 1)² + (1.45 + 1)²) = 2.52
[0089] Because the distance between the subject vector and the model vector 1 is the shortest, the model vector 1 is selected from the set of model vectors. However, if the matching process required that the distance be less than a maximum distance D = 0.6, then neither model vector would be selected.
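For verification, the worked example can be reproduced with a few lines of Python; after rounding, the printed values match the 0.39, 1.45, 0.76 and 2.52 given above.

    import numpy as np

    obtained = np.array([+1.0, +0.5, -0.1, -0.3, +0.3, +0.8, -0.4, -0.7])
    basis_1 = np.ones(8) / np.sqrt(8)
    basis_2 = np.array([+1, +1, -1, -1, +1, +1, -1, -1]) / np.sqrt(8)

    subject = np.array([obtained @ basis_1, obtained @ basis_2])  # subject vector coordinates
    model_1, model_2 = np.array([+1, +1]), np.array([+1, -1])

    print(np.round(subject, 2))                                 # [0.39 1.45]
    print(round(float(np.linalg.norm(subject - model_1)), 2))   # 0.76
    print(round(float(np.linalg.norm(subject - model_2)), 2))   # 2.52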
[0090] As noted above, at 408, the respective state associated with the selected model vector is also associated with the subject vector and the underlying obtained signal. In an example configuration of the monitoring application 114, the state evaluator 312 can implement 408 (Figure 3). For example, if the model vector belongs to a class of gait signals, the obtained signal can also be classified as a gait signal. In another example, the selected model vector can correspond to a particular body position or posture, in which case the obtained signal is also associated with such position, thus indicating that the monitored subject was in such position at the time the sensor readings were taken.
[0091] As noted above, at 410, information based on the respective state associated with the obtained signal can be provided. In an example configuration of the monitoring application 114, the state evaluator 312 can implement step 410 (Figure 3). The nature of the information can depend on the desired application of the system 100. [0092] In patient care applications, it may be desirable for the patient being monitored, a monitoring station and/or a caregiver or other related party, to receive a notification on the occurrence of certain movements of the patient or lack of movement thereof. As a result, the information provided can include a notification alert providing information on the motion, requesting prompt action, etc.
[0093] In an example, alerts may be provided immediately after associating a state with the obtained signal. For example, in a pressure ulcer prevention application of the invention, an obtained signal associated with a patient being in a given position for a prolonged period of time may require immediate intervention to prevent one or more pressure ulcers from forming. In such case, a notification alert may be generated and sent to a suitable recipient (as described above).
[0094] In another example, an alert may be desired upon determining that a series of obtained signals have been associated with a specific sequence of states. A certain sequence of matching signals may lead to an alert. For example, in pressure ulcer applications, if the same model vector corresponding to a posture state is matched many times in a row, this would indicate that the subject has maintained the same posture for a period of time. Thus, an alert may be sent to prompt caregivers to move the patient into a new position, thus preventing pressure ulcers.
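One simple way to implement such a rule, offered here as an illustrative sketch rather than the disclosed implementation, is to count consecutive matches of the same posture state and raise an alert once the run reaches a configurable limit; the max_consecutive parameter and the send_alert callback are assumptions.

    def check_reposition_needed(matched_states, max_consecutive, send_alert):
        """matched_states: chronological list of states associated with obtained signals."""
        run = 0
        for i, state in enumerate(matched_states):
            run = run + 1 if i > 0 and state == matched_states[i - 1] else 1
            if run >= max_consecutive:
                send_alert(f"Subject has remained in posture '{state}' for {run} readings; "
                           "consider repositioning.")
                run = 0  # avoid re-alerting on every subsequent reading

    # Example usage: check_reposition_needed(states, max_consecutive=6, send_alert=print)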
[0095] In an example activity detection application, the lack of a match to a model vector or model vectors over a long period of time can indicate that the subject is not performing prescribed exercises (i.e. motions) associated with the model vectors. The subject can then be prompted to perform the exercises through an alert or message. Subsequently, a correct sequence of exercises (i.e. a correct sequence of matches to model vectors associated with performing the exercise) can result in a "confirmation" alert being sent to the subject, indicating that the exercise was done properly. Otherwise, a "try again" alert can be sent if the correct sequence of model vectors was not matched. Therefore, it can be seen that an alert can be provided when a single model vector is matched, or when a predetermined sequence of model vectors is matched. Similarly, an alert can be provided when a single model vector is not matched, or when a predetermined sequence of model vectors is not matched.
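A sketch of the exercise-feedback logic follows; treating an unbroken, in-order occurrence of the prescribed model-vector identifiers as a correct sequence is one simple interpretation, and the function name is an assumption.

    def exercise_feedback(matched_ids, prescribed_ids):
        """Return 'confirmation' if the prescribed sequence of model vector identifiers
        appears, in order and without interruption, in the matched sequence; otherwise
        return 'try again'."""
        n = len(prescribed_ids)
        for start in range(len(matched_ids) - n + 1):
            if list(matched_ids[start:start + n]) == list(prescribed_ids):
                return "confirmation"
        return "try again"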
[0096] After an alert is sent to a party, such as a user of a receiver device 106, the user can cancel the alert. An alert may also be cancelled by the monitoring application 114 upon detecting that the condition that resulted in the alert has ended (i.e. the selected model vector or sequence of selected model vectors no longer satisfy the criteria that prompted the alert).
[0097] It will be appreciated that the signal sequences forming an obtained signal or a model signal can be filtered or pre-processed using signal processing techniques prior to the operations of 400-410 in Figure 4. The pre-processing may be applied to a subset of the signal sequences forming a signal, and/or to different segments of a signal sequence. Pre-processing techniques may include high-pass filtering to remove a constant offset from the signal, band-stop filtering to remove components related to vibration or breathing, band-pass filtering to enhance measurements of breathing, resampling or other pre-processing techniques that may be available or that can be incorporated into the system 100.
[0098] The number of model signals corresponding to different states may change over time, such as when new states are discovered, resulting in new model signals being added to the set of model signals. Whenever the set of model signals is modified, the operations at 402 to 406 of Figure 4 can be repeated to generate a new vector space used to define a new set of model vectors and subject vector.
[0099] The set of model signals can be generated in various ways. In an example, the set of model signals may be pre-loaded with signals representing a sample of subjects representative of the general population of subjects, while in a large number of different states (e.g. in different positions and/or performing a variety of motions). In another example, the set of model signals can be generated over time from monitoring a particular individual subject. As the individual subject may have particular characteristics in their movement, which can also change over time, a set of model signals customized to the particular individual subject can enable the set of model signals to adapt to individual variances, thus maintaining a library of data that captures developments in the subject's motion (e.g. an individual patient whose motion improves as a result of therapy).
[00100] In another example, the set of model signals can be generated with supervision (i.e. human intervention). In an example, an expert (such as a doctor or physiotherapist) can inspect a signal, as well as other available data (e.g. video of the subject performing a motion or having a certain posture), and designate portions of the signal to serve as a model signal, along with annotations describing the state (e.g. motion or posture) associated with the model signal. An interface can be provided to facilitate the input of the expert (e.g. a graphical editor allowing the expert to directly annotate the waveforms). The expert may also be prompted to identify a candidate model signal, as discussed below. In another example, the user being monitored can indicate what kind of activity he/she is performing.
[00101] In another example, the set of model signals can be updated using an automated process without supervision (i.e. human intervention is not required). For example, a system may update a set of model signals as follows:
a. A database is maintained of all sensor readings and signals taken from the sensors. The signals may be annotated with characteristics of a signal, such as features of its signal sequences. The signals can be analyzed to identify the subset of signals (referred to herein as the set of "non-matching signals") that do not match any of the current set of model signals. The matching procedure described with respect to Figure 4, or any other matching technique to identify similar signals within a given threshold can be used. b. The unique signals within the set of non-matching signals can then be
identified. Of the set of non-matching signals, let S represent a subset that all match with each other (e.g. their matching score is above the minimum threshold). Then S is a set of candidate model signals.
c. Once a subset S is identified, it may be incorporated directly into the set of model signals. In another example, an expert or the subject may be prompted to confirm that a candidate model signal may be added to the set of model signals.
[00102] It will be appreciated that the computational burden or complexity of calculating a matching score for every possible pair of non-matching signals may be prohibitive. In an example, only a subset of every possible pair of non-matching signals is considered. The complexity can be mitigated by using a number of techniques, such as by discarding all non-matching signals that do not match with other non-matching signals within a predetermined time (i.e. that do not form any subsets S within a given amount of time).
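A simple seed-based grouping, sketched below in Python with NumPy, approximates steps a and b above: it groups non-matching signal vectors that lie within a matching threshold of a seed vector and promotes sufficiently large groups to candidate model vectors. The disclosure describes subsets whose members all match one another, so this greedy variant, its thresholds and the use of the group mean are illustrative assumptions.

    import numpy as np

    def candidate_model_vectors(non_matching_vectors, match_threshold, min_group_size=3):
        """Group non-matching signal vectors lying within `match_threshold` of the first
        member of a group; groups with at least `min_group_size` members are returned as
        candidate model vectors (the group mean is used as the candidate)."""
        remaining = [np.asarray(v, dtype=float) for v in non_matching_vectors]
        candidates = []
        while remaining:
            seed, rest = remaining[0], remaining[1:]
            group = [seed] + [v for v in rest if np.linalg.norm(v - seed) <= match_threshold]
            remaining = [v for v in rest if np.linalg.norm(v - seed) > match_threshold]
            if len(group) >= min_group_size:
                candidates.append(np.mean(group, axis=0))
        return candidates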
[00103] It will be appreciated that supervised techniques to generate a set of model signals can be used in combination with unsupervised techniques. For example, an expert or user can choose to, or be prompted to, provide supplemental information (e.g. type of activity taking place) on a newly identified candidate model signal. The system 100 may prompt a user or expert to input supplemental information when the automated process is unable to detect or misidentifies the state to be associated with the model signal. [00104] Therefore, it can be seen that the system 100 can constantly track similar kinds of movement or positions recognized by the set of model signals. The set of model signals can be derived from a large number of subjects in the population. Model signals derived from monitoring a subject may be used as model signals to monitor another subject, or may be used to guide the selection of model signals in monitoring the other subject. Collaborative filtering may be used to identify members of the population with similar characteristics and demographics to the subject being monitored, to improve the likelihood that a set of model signals will be applicable to the monitored subject.
[00105] Obtaining the model signals from monitoring an actual subject increases the likelihood that the model signals represent plausible states. For example, the model signals should be plausible in terms of human motion that can be physically achieved, and any signal that is not compatible with human motion should be rejected. By obtaining model signals from monitoring an actual person, the non-compatible motions are automatically avoided. As a result, the monitoring application 114 may match motions consistent with human movement more quickly.
[00106] In the model signal space, there will be a region of coordinates where the plausible signals exist, and a complementary region of coordinates corresponding to impossible or improbable motion signals. For example, a signal in the impossible or improbable region may imply that connected limbs, such as hip and thigh, are moving as though they are unconnected, such as if one is accelerating sharply while the other is stationary. In an example embodiment, the monitoring application 114 may also check the plausibility of a motion detected in a signal by evaluating the signal against a mathematical model of the biomechanical system (e.g., Hill's muscle model or Pandy's limb model, among others). If any known physical limitations of the human anatomy are violated, the signal may be safely rejected.
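As an illustrative stand-in for a full biomechanical model such as Hill's muscle model, a plausibility check could be as crude as the sketch below, which rejects a pair of acceleration sequences from connected limb segments if either exceeds a magnitude bound or if one segment accelerates sharply while the other stays nearly stationary; the bounds are arbitrary example values.

    import numpy as np

    def is_plausible(hip_accel, thigh_accel, max_g=4.0, max_inconsistency_g=2.0):
        """Crude plausibility test for two acceleration sequences (in g) measured on
        connected limb segments. Returns False if either sequence exceeds `max_g`, or if
        the two segments differ by more than `max_inconsistency_g` at any sample."""
        hip = np.asarray(hip_accel, dtype=float)
        thigh = np.asarray(thigh_accel, dtype=float)
        if np.max(np.abs(hip)) > max_g or np.max(np.abs(thigh)) > max_g:
            return False
        return bool(np.max(np.abs(hip - thigh)) <= max_inconsistency_g)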
[00107] Although the examples herein have been generally described with reference to monitoring human subjects, it will be appreciated that the subject may include other living beings, such as animals, and non-living subjects, such as robotic subjects, artificial or prosthetic limbs, etc.
[00108] Although the invention has been described with reference to certain specific embodiments, various modifications thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention as outlined in the claims appended hereto. The entire disclosures of all references recited above are incorporated herein by reference.

Claims

What is claimed is:
1. A method of monitoring a subject, the method comprising:
obtaining a plurality of readings from a plurality of sensors for detecting motion of the subject;
generating a subject vector using the plurality of readings;
selecting one of a set of model vectors based on the subject vector, the set of model vectors corresponding to a set of respective states, the set of respective states comprising any one or more of a position and a motion; and
associating the plurality of readings with the respective state of the selected model vector.
2. The method of claim 1, wherein generating the subject vector using the plurality of readings comprises mapping the plurality of readings to a vector space, the vector space being generated using a set of model plurality of readings obtained from detecting the set of respective states.
3. The method of claim 2, wherein the vector space is generated by orthonormalizing the set of model plurality of readings.
4. The method of claim 2 or claim 3, wherein a model plurality of readings obtained from detecting a position comprises a plurality of static readings over a period of time.
5. The method of claim 2 or claim 3, wherein a model plurality of readings obtained from detecting a motion comprises at least one reading changing over a period of time.
6. The method of any one of claims 1 to 5, further comprising aligning the plurality of readings with at least one of the set of model plurality of readings prior to generating the subject vector.
7. The method of any one of claims 1 to 6, wherein selecting the one of the set of model vectors comprises selecting the model vector at a shortest distance to the subject vector.
8. The method of any one of claims 1 to 7 further comprising generating an alert based on the respective state associated with the plurality of readings.
9. The method of any one of claims 1 to 8, wherein the subject comprises any one of: at least a portion of a person, a prosthetic limb and a robotic arm.
10. A system for monitoring a subject, the system comprising:
a plurality of sensors for detecting motion of the subject; and
a processor coupled to memory and the plurality of sensors, the memory storing computer executable instructions that when executed by the processor cause the processor to perform the method of any one of claims 1 to 9.
11. A computer readable medium comprising computer executable instructions that when executed by a processor, cause the processor to perform the method of any one of claims 1 to 9.
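Purely as a non-normative illustration of the matching approach recited in claims 1 to 3 and 7 (orthonormalizing a set of model readings to form a vector space, mapping the subject readings into that space, and selecting the model vector at the shortest distance), the following sketch uses NumPy's QR decomposition as one possible orthonormalization; none of the numerical values or names are taken from the disclosure:

```python
import numpy as np

def build_vector_space(model_readings: np.ndarray) -> np.ndarray:
    """Orthonormalize the set of model readings (rows) to form a basis for the
    vector space, here via QR decomposition (a Gram-Schmidt-style procedure)."""
    q, _ = np.linalg.qr(model_readings.T)   # columns of q are orthonormal basis vectors
    return q

def to_vector(readings: np.ndarray, basis: np.ndarray) -> np.ndarray:
    """Map a plurality of readings to coordinates in the model vector space."""
    return basis.T @ readings

def classify(readings: np.ndarray, model_readings: np.ndarray, states: list) -> str:
    """Select the model vector at the shortest distance to the subject vector and
    return its associated state (a position or a motion)."""
    basis = build_vector_space(model_readings)
    model_vectors = [to_vector(m, basis) for m in model_readings]
    subject_vector = to_vector(readings, basis)
    distances = [np.linalg.norm(subject_vector - mv) for mv in model_vectors]
    return states[int(np.argmin(distances))]

# Example with three model readings (flattened sensor traces) and three states
models = np.array([[1.0, 0.0, 0.2, 0.1],
                   [0.0, 1.0, 0.3, 0.0],
                   [0.2, 0.1, 1.0, 0.4]])
print(classify(np.array([0.9, 0.1, 0.25, 0.1]), models, ["lying", "sitting", "walking"]))
```

In this sketch, associating the plurality of readings with the respective state of the selected model vector reduces to returning the state whose model vector minimizes the Euclidean distance to the subject vector.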
PCT/CA2014/050316 2013-03-29 2014-03-28 System and method for monitoring a subject WO2014153665A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361806754P 2013-03-29 2013-03-29
US61/806,754 2013-03-29

Publications (2)

Publication Number Publication Date
WO2014153665A1 true WO2014153665A1 (en) 2014-10-02
WO2014153665A8 WO2014153665A8 (en) 2014-11-06

Family

ID=51622319

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2014/050316 WO2014153665A1 (en) 2013-03-29 2014-03-28 System and method for monitoring a subject

Country Status (1)

Country Link
WO (1) WO2014153665A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008018810A2 (en) * 2006-08-07 2008-02-14 Universidade Do Minho Body kinetics monitoring system
US20110131005A1 (en) * 2007-12-18 2011-06-02 Hiromu Ueshima Mobile recording apparatus, body movement measuring apparatus, information processing apparatus, movement pattern determining apparatus, activity amount calculating apparatus, recording method, body movement measuring method, information processing method, movement pattern determining method, activity amount calculating met
EP2108393A1 (en) * 2008-04-11 2009-10-14 F.Hoffmann-La Roche Ag Administration device having patient state monitor
US20110066383A1 (en) * 2009-09-15 2011-03-17 Wellcore Corporation Indentifying One or More Activities of an Animate or Inanimate Object
US20120083705A1 (en) * 2010-09-30 2012-04-05 Shelten Gee Jao Yuen Activity Monitoring Systems and Methods of Operating Same
EP2650807A1 (en) * 2012-04-13 2013-10-16 Adidas AG Athletic activity monitoring methods and systems
EP2666406A2 (en) * 2012-05-22 2013-11-27 Hill-Rom Services, Inc. Occupant egress prediction systems, methods and devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HADLEY ET AL.: "Power Saving in a Biomechanical Sensor Network Using Activity Detection", IEEE 2010 25TH BIENNIAL SYMPOSIUM ON COMMUNICATIONS (QBSC 2010), 12 May 2010 (2010-05-12), pages 173-176, XP031681276 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3155996B1 (en) * 2015-10-16 2019-02-20 MAKO Surgical Corp. Tool and method for controlling the same
US10373715B2 (en) 2015-10-16 2019-08-06 Mako Surgical Corp. Tool and method for controlling the same
US20210174308A1 (en) * 2019-03-27 2021-06-10 On Time Staffing Inc. Behavioral data analysis and scoring system
US11863858B2 (en) 2019-03-27 2024-01-02 On Time Staffing Inc. Automatic camera angle switching in response to low noise audio to create combined audiovisual file
US11961044B2 (en) * 2019-03-27 2024-04-16 On Time Staffing, Inc. Behavioral data analysis and scoring system
US11783645B2 (en) 2019-11-26 2023-10-10 On Time Staffing Inc. Multi-camera, multi-sensor panel data extraction system and method
CN110916984A (en) * 2019-12-03 2020-03-27 上海交通大学医学院附属第九人民医院 Wearable device for preventing frozen gait and implementation method thereof
US11636678B2 (en) 2020-04-02 2023-04-25 On Time Staffing Inc. Audio and video recording and streaming in a three-computer booth
US11861904B2 (en) 2020-04-02 2024-01-02 On Time Staffing, Inc. Automatic versioning of video presentations
US11720859B2 (en) 2020-09-18 2023-08-08 On Time Staffing Inc. Systems and methods for evaluating actions over a computer network and establishing live network connections

Also Published As

Publication number Publication date
WO2014153665A8 (en) 2014-11-06

Similar Documents

Publication Publication Date Title
Pandey Machine learning and IoT for prediction and detection of stress
US10750977B2 (en) Medical evaluation system and method using sensors in mobile devices
WO2014153665A1 (en) System and method for monitoring a subject
CN103596493B (en) Pressure measuring device and method
US10335080B2 (en) Biomechanical activity monitoring
EP2432392B1 (en) Sensing device for detecting a wearing position
US8529448B2 (en) Computerized systems and methods for stability—theoretic prediction and prevention of falls
CN108882892A (en) The system and method for tracking patient motion
US20110246123A1 (en) Personal status monitoring
CN105051799A (en) Method for detecting falls and a fall detector.
Hemmatpour et al. A review on fall prediction and prevention system for personal devices: evaluation and experimental results
Creagh et al. Smartphone-and smartwatch-based remote characterisation of ambulation in multiple sclerosis during the two-minute walk test
Majumder et al. A multi-sensor approach for fall risk prediction and prevention in elderly
US20190320944A1 (en) Biomechanical activity monitoring
US20230298760A1 (en) Systems, devices, and methods for determining movement variability, illness and injury prediction and recovery readiness
Pires et al. Limitations of energy expenditure calculation based on a mobile phone accelerometer
Jovanov et al. A mobile system for assessment of physiological response to posture transitions
Perez et al. A smartphone-based system for clinical gait assessment
JP2016045816A (en) Deglutition analysis system, device, method, and program
Ma et al. Toward robust and platform-agnostic gait analysis
Zulj et al. Supporting diabetic patients with a remote patient monitoring systems
EP3847961A1 (en) Walking state determination program, walking state determination method, and information processing device
Boutaayamou et al. Extraction of Temporal Gait Parameters using a Reduced Number of Wearable Accelerometers.
Qin et al. A smart phone based gait monitor system
AU2021107373A4 (en) Develop an autonomous design system to recognize the subject through the posture of the walk using iot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14774317

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14774317

Country of ref document: EP

Kind code of ref document: A1