WO2014118767A1 - Classifying types of locomotion - Google Patents
- Publication number
- WO2014118767A1 (PCT/IL2013/051004)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- locomotion
- sensor
- subject
- motion
- signal
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1123—Discriminating type of movement, e.g. walking or running
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7242—Details of waveform analysis using integration
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
Definitions
- the invention relates to the field of motion analysis.
- Gait analysis is often defined as the systematic study of human locomotion, using aids such as instrumentation for measuring body movements, body mechanics and muscles activity. Gait analysis is commonly used to assess, plan, and treat individuals with conditions affecting their ability to walk. It is also used in sports biomechanics to help athletes run more efficiently and to identify posture-related or movement-related problems in people with injuries.
- gait analysis requires a subject to walk in a straight line in a laboratory setting, and to be monitored by a variety of instruments.
- a video camera is positioned to point directly along the straight line, and a trained professional analyzes the video.
- Such analysis is labor-intensive and typically requires considerable time from a trained professional.
- gait analysis involves walking or running on a treadmill.
- the professional simply watches the way that the subject moves, looking in particular at the subject's feet, ankles, knees and hips.
- a video recorder will often be set up behind the treadmill, which will record a video of the subject's walking or running cycle. This is then relayed to a computing device, where slow motion and freeze frames are used to carefully assess the subject's running or walking style.
- This form of gait analysis usually focuses on the feet and ankles.
- Gait analysis is commonly performed by a professional, such as a podiatrist or physiotherapist, although it is now becoming more widespread and readily available, with many specialist running and sports shops owning equipment and employing staff trained in gait analysis.
- U.S. Patent No. 7,421,369 to Clarkson which describes an activity recognition apparatus for detecting an activity of a subject.
- the apparatus includes: a sensor unit including a plurality of linear motion sensors configured to detect linear motions and a plurality of rotational motion sensors configured to detect rotational motions, the linear motions being orthogonal to each other, the rotational motions being orthogonal to each other; and a computational unit configured to receive and process signals from the sensors included in the sensor unit so as to detect an activity of the subject.
- the sensor unit is directly or indirectly supported by the subject with an arbitrary orientation with respect to the subject.
- the computational unit performs a calculation that uses the signals from both linear motion sensors and rotational motion sensors to determine the activity of the subject independent of the orientation of the sensor unit;
- U.S. Patent No. 7,689,378 to Kolen, which describes a highly miniaturized electronic data acquisition system including MEMS sensors that can be embedded onto a moving device without affecting the static/dynamic motion characteristics of the device.
- the basic inertial magnetic motion capture (IMMCAP) module consists of a 3D printed circuit board having MEMS sensors configured to provide a tri-axial accelerometer, a tri-axial gyroscope, and a tri-axial magnetometer, all in communication with analog-to-digital converters that convert the analog motion data to digital data. The digital data are used for classic inertial measurement and determination of change in spatial orientation (rho, theta, phi) and linear translation (x, y, z) relative to a fixed external coordinate system, as well as the initial spatial orientation relative to the known relationship of the Earth's magnetic and gravitational fields.
- the data stream from the IMMCAP modules will allow the reconstruction of the time series of the 6 degrees of freedom for each rigid axis associated with each independent IMMCAP module;
- U.S. Patent Application Publication No. 2008/0146968 to Hanawaka et al., which describes a gait analysis system which has: a gait sensor which is to be attached to one foot or both feet of a walking person, and which wirelessly outputs detection data of at least one of an acceleration and an angular velocity; a portable terminal which receives the detection data, and which stores the data for a predetermined time period; and a gait analyzing apparatus which, based on the detection data obtained from the portable terminal, calculates two- or three-dimensional position information and status information of the foot or feet at an arbitrary time;
- the present invention, in some embodiments thereof, teaches automatic and/or semiautomatic methods for classifying a patient's locomotion into one of several types of locomotion, such as, by way of some non-limiting examples, walking straight (WS), standing (S), turning left or right (TL/TR), running (R) or climbing up stairs (C).
- WS walking straight
- S standing
- TL/TR turning left or right
- R running
- C climbing up stairs
- a method for automatic identification of which leg, right or left, a motion sensor is attached to is described.
- a method for automatic detection of types of locomotion including using at least one hardware processor for acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion, extracting one or more features from the motion signal, inputting the one or more features to a locomotion recognition unit, and having the locomotion recognition unit produce an output indicating a likelihood that the subject was moving using a specific type of locomotion during the period of time.
- the specific type of locomotion is walking straight (WS).
- the locomotion recognition unit includes a trained machine learning unit.
- the sensor is attached to the subject's leg, and the locomotion recognition unit produces an output indicating to which of the subject's legs the sensor is attached.
- the locomotion recognition which produces the output indicating to which of the subject's legs the sensor is attached is based on identifying a direction of a twist of the leg following a toe-off event.
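The twist-after-toe-off rule above might be sketched as follows, assuming a gyroscope signal about the vertical axis and an already-detected toe-off sample index; the function name, window length, and sign convention are illustrative (the patent does not fix them):

```python
def identify_leg_side(yaw_rate, toe_off_idx, window=25):
    """Guess which leg a sensor is attached to from the direction of
    the leg's twist following a toe-off event.

    yaw_rate    -- angular velocity about the vertical axis, one
                   sample per time step
    toe_off_idx -- sample index of the detected toe-off event
    window      -- number of samples to integrate after toe-off

    Sign convention (an assumption for this sketch): a positive net
    twist is taken to mean the right leg, a negative one the left.
    """
    segment = yaw_rate[toe_off_idx:toe_off_idx + window]
    net_twist = sum(segment)  # discrete integral of angular velocity
    return "right" if net_twist > 0 else "left"
```

In practice the sign convention would be calibrated once against sensors whose leg assignment is known.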
- the specific type of locomotion is one of a group consisting of standing (S), turning left (TL), turning right (TR), running (R), and climbing (C).
- the locomotion recognition unit produces an output indicating a likelihood that the subject was turning during the period of time.
- the acquiring of the motion signal includes preprocessing the motion signal according to at least one method selected from the group including analog-to-digital conversion and de-noising of the signal.
- the motion signal includes a plurality of motion signals, including a linear motion signal and a rotational motion signal.
- the extracting one or more features from the motion signal includes using Wavelet Packet Decomposition (WPD) to extract at least one of the one or more features.
- WPD Wavelet Packet Decomposition
- the locomotion recognition unit includes an Artificial Neural Network (ANN).
- ANN Artificial Neural Network
- the locomotion recognition unit includes a plurality of locomotion recognition units.
- the locomotion recognition unit includes a Support Vector Machine (SVM).
- SVM Support Vector Machine
- output of the plurality of locomotion recognition units is provided to a decision unit, and it is the decision unit which produces the output indicating the likelihood that the subject was moving using the specific type of locomotion during the period of time.
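A minimal sketch of this arrangement, with each recognition unit reduced to a callable returning a likelihood and the decision unit reduced to a simple max rule (an assumption; the patent allows an expert system or other decision unit):

```python
def classify(features, recognition_units):
    """Run one feature vector through a plurality of per-class
    locomotion recognition units, each mapping features to a
    likelihood of one locomotion type, then let a simple decision
    rule pick the most likely type.

    recognition_units -- dict mapping class name to a callable
                         returning a likelihood in [0, 1]
    """
    likelihoods = {name: unit(features)
                   for name, unit in recognition_units.items()}
    best = max(likelihoods, key=likelihoods.get)
    return best, likelihoods
```

For example, `classify(features, {"WS": ws_net, "R": run_net})` would return the winning class name together with all per-class likelihoods, so downstream logic can also inspect the margins.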
- the decision unit includes an expert system.
- the acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion includes attaching the sensor to the subject, allowing the subject to walk in an unconstrained environment for the period of time, and downloading a recording of the motion signal to a computer for performing the extracting, the inputting to a locomotion recognition unit, and the producing an output.
- the extracting includes processing chunks of data from discrete windows of time. According to some embodiments of the invention, the chunks of data from discrete windows of time partially overlap.
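Splitting the signal into partially overlapping windows, as described above, can be sketched as follows (window size and overlap are illustrative parameters):

```python
def sliding_windows(signal, size, overlap):
    """Split a signal into chunks of data from discrete, partially
    overlapping windows of time.

    size    -- samples per window
    overlap -- samples shared between consecutive windows
    """
    step = size - overlap
    windows = []
    for start in range(0, len(signal) - size + 1, step):
        windows.append(signal[start:start + size])
    return windows
```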
- Wavelet Packet Decomposition is applied to the chunks of data, WPD terminal node values are calculated for the chunks of data, filter coefficients are computed for each one of the terminal node values, energy is calculated for each filter, and a Discrete Cosine Transform (DCT) is applied to a vector of logarithms of the filter energies.
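One possible reading of this feature pipeline, sketched with a Haar wavelet packet and with the per-node filter-bank step simplified to the node energy itself (an assumption; the patent computes filter coefficients per terminal node before the energy step):

```python
import math

def haar_split(x):
    """One level of a Haar wavelet packet split: half-band low-pass
    (approximation) and high-pass (detail) outputs."""
    lo = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    hi = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    return lo, hi

def wpd_terminal_nodes(x, levels):
    """Full Wavelet Packet Decomposition: split every node at every
    level, returning the 2**levels terminal-node signals."""
    nodes = [x]
    for _ in range(levels):
        nxt = []
        for n in nodes:
            lo, hi = haar_split(n)
            nxt.extend([lo, hi])
        nodes = nxt
    return nodes

def dct2(v):
    """Type-II Discrete Cosine Transform, direct formula."""
    n = len(v)
    return [sum(v[j] * math.cos(math.pi * k * (2 * j + 1) / (2 * n))
                for j in range(n)) for k in range(n)]

def wpd_features(chunk, levels=3):
    """Feature vector for one chunk: terminal-node energies, log,
    then DCT, loosely analogous to cepstral feature computation."""
    energies = [sum(s * s for s in node) or 1e-12  # guard log(0)
                for node in wpd_terminal_nodes(chunk, levels)]
    return dct2([math.log(e) for e in energies])
```

The wavelet family, decomposition depth, and energy floor are all assumptions of this sketch; a production pipeline would more likely use a wavelet library than a hand-rolled Haar split.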
- the specific type of locomotion includes Walking Straight (WS) locomotion.
- two sensors are attached, each one of the two sensors to a different leg of the subject.
- a method for training an automatic locomotion classification system which includes a machine learning component, the method including using at least one hardware processor for obtaining a motion sensor signal from a motion sensor attached to a walking subject, extracting one or more features from the motion sensor signal, identifying a type of locomotion to which the motion sensor signal belongs, and feeding the one or more features to the machine learning component as a training example of the type of locomotion.
- the machine learning component includes an Artificial Neural Network.
- the extracting one or more features from the motion sensor signal includes splitting the motion sensor signal into chunks of data from discrete windows of time, applying Wavelet Packet Decomposition (WPD) to the chunks of data, calculating WPD terminal node values for the chunks of data, computing filter coefficients for each one of the terminal node values, calculating energy for each filter coefficient, and applying DCT to a vector of logarithms of the energies.
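The training step of feeding feature vectors to the machine learning component as labeled examples might look like the following sketch, with a single online-trained logistic unit standing in for the patent's feed-forward ANN (label 1 marks the target locomotion type, 0 any other type; all names and the learning rate are illustrative):

```python
import math
import random

class TinyBinaryANN:
    """A minimal stand-in for the machine learning component: one
    logistic unit trained online, example by example."""

    def __init__(self, n_features, lr=0.5, seed=0):
        rnd = random.Random(seed)
        self.w = [rnd.uniform(-0.1, 0.1) for _ in range(n_features)]
        self.b = 0.0
        self.lr = lr

    def predict(self, features):
        """Likelihood that the features belong to the target type."""
        z = self.b + sum(w * f for w, f in zip(self.w, features))
        return 1.0 / (1.0 + math.exp(-z))

    def train_example(self, features, label):
        """Feed one (features, label) pair as a training example of
        the type of locomotion (label 1) or of any other type (0)."""
        err = self.predict(features) - label
        self.w = [w - self.lr * err * f for w, f in zip(self.w, features)]
        self.b -= self.lr * err
```

One such unit would be trained per locomotion type, matching the one-classifier-per-class arrangement described elsewhere in this document.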
- WPD Wavelet Packet Decomposition
- the machine learning component includes a plurality of Feed Forward Artificial Neural Networks.
- a system for automatic classification of different types of locomotion including a locomotion classification module being configured, when executed by at least one hardware processor, to accept input of a motion signal from a motion sensor and to produce output including an indication of a locomotion classification based, at least in part, on the motion signal.
- the locomotion classification module includes at least one computerized machine learning component.
- a sensor package including at least one motion sensor for producing the motion signal, and in which the locomotion classification module is configured to accept the motion signal from the sensor package.
- the sensor package is included in a mobile personal computing device which includes at least one acceleration sensor, collecting the motion signal is performed by an application residing on the mobile personal computing device, and the application is configured to send the motion signal to the locomotion classification module.
- the locomotion classification module is included in the mobile personal computing device, collecting and classifying the motion signal are performed by an application residing on the mobile personal computing device, and the application is configured to send only a portion of the motion signal including a specific classification of locomotion to another computer.
- the sensor package and the locomotion classification module are both included in one unit.
- the unit includes a smart phone.
- the sensor package includes a plurality of motion sensors, at least one of which is a linear motion sensor, and one of which is a rotational motion sensor.
- the computerized machine learning component includes an Artificial Neural Network (ANN).
- ANN Artificial Neural Network
- the computerized machine learning component includes a first plurality of computerized machine learning components.
- the first plurality of computerized machine learning components includes Feed Forward Neural Networks.
- each one of the first plurality of computerized machine learning components is configured to identify a different one from a set of types of locomotion.
- a second, additional, machine learning component configured to accept input from the first plurality of computerized machine learning components, and to provide an output indicating which of the set of types of locomotion is most likely present in the motion signal.
- the second machine learning component is a Probabilistic Neural Network.
- Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
- a data processor such as a computing platform for executing a plurality of instructions.
- the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data.
- a network connection is provided as well.
- a display and/or a user input device such as a keyboard or mouse are optionally provided as well.
- Figure 1A is an illustration of a subject wearing an example embodiment of a sensor package, constructed according to an example embodiment of the invention, attached to the subject's foot;
- Figure 1B is an illustration of a subject wearing two sensor packages, one per leg, constructed according to an example embodiment of the invention, attached to the subject's feet, walking in a non-clinic environment;
- Figure 1C is an illustration of a subject wearing two example embodiments of a sensor package, constructed according to an example embodiment of the invention, attached to the subject's feet, in which the sensors are transmitting data to a receptor in a clinic environment;
- Figure 1D is an illustration of a reference coordinate system including a set of three perpendicular axes, each of which may be used to measure linear acceleration and rotational velocity, according to an example embodiment of the invention;
- Figure 1E is an illustration of a leg of a subject with a FORWARD direction indicated, and a reference coordinate system of a set of three perpendicular axes, according to Figure 1D, indicated next to the leg, according to an example embodiment of the invention;
- Figure 1F is a simplified flowchart illustration of use of locomotion classification according to an example embodiment of the invention.
- Figure 2A is a simplified block diagram of a locomotion classification system constructed according to an example embodiment of the invention.
- Figure 2B is a simplified block diagram of a locomotion classification system constructed according to another example embodiment of the invention.
- Figure 2C is a simplified block diagram of a locomotion classification system constructed according to yet another example embodiment of the invention.
- Figure 3 is a graph showing two input signals, which are examples of input signals similar to the input signal of Figure 2C;
- Figure 4A is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a right turn, according to an example embodiment of the invention;
- Figure 4B is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a right turn, according to an example embodiment of the invention;
- Figure 4C is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a left turn, according to an example embodiment of the invention;
- Figure 4D is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a left turn, according to an example embodiment of the invention.
- Figure 5 is a simplified flowchart illustration of training of an automatic locomotion classification system which includes a machine learning component according to an example embodiment of the invention.
- the present invention in some embodiments thereof, relates to a method and a system for automatic classification of different types of locomotion and, more particularly, but not exclusively, to a method for machine learning for automatic differentiation between different types of locomotion and, even more particularly, but not exclusively, to a method for training a machine learning unit(s) for automatic differentiation between different types of locomotion.
- gait analysis is typically performed on a subject walking straight. It is useful to automatically detect motion signals produced by motion sensors attached to the subject when the subject is walking straight, whether as a precursor to automatic gait analysis, or even as a method of pointing out a walking straight segment of walking for semi-automatic or even manual gait analysis. Automatic locomotion classification is useful in the above role.
- a bonus result of a locomotion classification system as described herein is that the same sensors and classification components can serve to identify, based on motion signals received from the motion sensors, which motion sensor was attached to which leg. Such identification, again, is useful, whether as a precursor to automatic gait analysis, or as a method of eliminating potential errors for semi-automatic or even manual gait analysis.
- aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
- a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- LAN local area network
- WAN wide area network
- Internet Service Provider: for example, AT&T, MCI, Sprint, EarthLink, MSN, GTE, etc.
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- gait analysis professionals look at a subject walking straight in a lab, and analyze how the subject places the feet on the surface, as well as differences between how the subject places one foot on the surface versus the other.
- Automatic gait analysis using a computerized system can potentially provide a more rapid analysis, and/or at a lower cost, than manual gait analysis by a trained professional.
- Gait analysis can potentially gain by being freed from being performed with the subject in a laboratory setting.
- Some embodiments of the invention receive an input signal from motion sensors, and perform a pre-analysis, detecting when a subject used a specific type of locomotion, such as for example walking straight or running, and can provide a gait analysis system with the signal from just the specific type of locomotion.
- Various embodiments of the invention, described below, teach how such freedom is achieved for classifying types of locomotion, such as, by way of example, types of walking and/or types of running.
- collecting data on a subject's walking or running style is optionally done using a sensor package including one or more motion sensor(s) attached to the subject's feet.
- Figure 1A is a simplified image of a subject wearing an example embodiment of a sensor package 102, constructed according to an example embodiment of the invention, attached to the subject's foot 104.
- data about a subject's locomotion is collected by one or more motion sensors.
- the data is transferred to a system for classifying the locomotion.
- the transfer is immediate, for example if the subject is inside a laboratory.
- the subject is not constrained to walking or running in a laboratory setting, but rather may walk around a clinic wearing the motion sensor(s), walk in the environs of the clinic, and/or leave the clinic and walk elsewhere.
- the locomotion data is immediately transferred to the classifying system, for example by cellular communication, and/or by other wireless communication such as WiMax.
- the data is collected, and transferred to the classifying system later. For example, when the sensor package is brought to a locomotion or a gait analysis laboratory, and/or by a form of wireless transfer, as described above, to the locomotion or gait analysis laboratory.
- the motion sensors are embodied within a portable computerized device such as a smart phone, which collects locomotion data using smart phone sensors, such as accelerometers, and the data is transferred from the smart phone to an analysis unit, the analysis unit possibly being in a locomotion or gait analysis center.
- portable computerized device such as a smart phone
- smart phone sensors such as accelerometers
- both the motion sensors and a locomotion classification unit are embodied within a portable computerized device such as a smart phone, which collects locomotion data using smart phone sensors, such as accelerometers.
- the smart phone classifies the data according to locomotion classes, and optionally sends only data belonging to a specific type of locomotion from the smart phone to the analysis unit at a locomotion or gait analysis center. It is noted that the smart phone mentioned above should be taken to stand for a family of motion-sensor-equipped mobile personal computing devices in common use, such as tablets, smart phones, and possibly even a communication-enabled pedometer.
- output of a locomotion classification unit is sent on to an automatic locomotion analysis unit, whether embodied in the same computer/device as the locomotion classification unit or in a separate device.
- Figure 1B is a simplified image of a subject wearing two sensor packages 102, one per leg, constructed according to an example embodiment of the invention, attached to the subject's feet 104, walking in a non-clinic environment.
- locomotion data is collected over a period of time, only later to be uploaded into a computer for classification and/or analysis.
- the locomotion data is transmitted and uploaded in real time to a computer for classification and/or analysis.
- the uploading is done wirelessly.
- Figure 1C is a simplified image of a subject wearing two example embodiments of a sensor package 102, constructed according to an example embodiment of the invention, attached to the subject's feet 104, in which the sensors are transmitting 106 data to a receptor 108 in a clinic 110 environment.
- the sensors may be attached to the subject's feet by a person who is not a medical professional or a gait analysis professional.
- the sensors may optionally even be attached by the subject.
- locomotion data classification is optionally done automatically.
- locomotion data classification is optionally performed automatically, optionally including segmenting duration of a subject's locomotion into segments which belong to a single continuous type of locomotion, such as, for example, segmenting into a period of Walking Straight locomotion, followed by a period of Climbing, followed by another period of Walking Straight.
- locomotion classification is done semi-automatically, with a technician selecting which portions of the locomotion data should be analyzed by a computer, and which should be left out of the classification process.
- a derivation of walking straight (WS) segments of the subject's walk is performed automatically without human intervention.
- the introduction section generally described a process of collecting data.
- the locomotion data is optionally collected from one or more linear accelerometers.
- a linear accelerometer can sense walking, and various types of locomotion. For example, when walking straight forward, a subject generates a first kind of motion forward, a second kind of motion in the vertical direction (a periodic up and down motion), and a third kind of motion side-to-side.
- three linear accelerometers are used, measuring linear acceleration, and/or linear motion, in three perpendicular directions, termed x, y and z.
- the accelerometer axes may not be aligned with the direction of motion, and so the signals from the accelerometers, for example for "walking straight forward", may not be purely as described above, since the axes may not be aligned with the forward, up and down, and sideways directions.
- the signals from three perpendicular accelerometers are modified by a mathematical rotation of the axes so as to align with the forward, up and down, and sideways directions.
- the direction of the axes relative to the forward, up and down, and sideways directions is learned, and optionally, after the learning, the signals from three perpendicular accelerometers are modified by a mathematical rotation of the axes so as to align with the forward, up and down, and sideways directions.
- the signals from three not-all-in-the-same-plane accelerometers are modified by a mathematical rotation of the axes so as to produce signals corresponding to the forward, up and down, and sideways directions.
- the signals from three accelerometers are modified by a mathematical rotation of the axes so as to produce signals corresponding to polar coordinates.
- the rotation of the axes is performed so as to minimize amplitude of the acceleration signal in a first direction which is optionally defined as sideways, and/or minimize amplitude of the acceleration signal in a second direction which is optionally defined as forward, and/or maximize amplitude of the acceleration signal in a third direction which is optionally defined as up and down.
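The amplitude-based axis alignment described above might be sketched as a brute-force search over rotation angles about the vertical axis; the angle grid and the energy criterion are illustrative choices, not taken from the patent:

```python
import math

def best_alignment_angle(ax, ay, steps=360):
    """Search for the rotation angle about the vertical axis that
    minimizes signal amplitude on one horizontal axis, which is then
    defined as the sideways direction.

    ax, ay -- acceleration samples on the two horizontal sensor axes
    steps  -- angular resolution of the brute-force search
    """
    def sideways_energy(theta):
        c, s = math.cos(theta), math.sin(theta)
        # energy of the rotated "sideways" component
        return sum((-s * x + c * y) ** 2 for x, y in zip(ax, ay))
    return min((2 * math.pi * k / steps for k in range(steps)),
               key=sideways_energy)
```

The same search could instead maximize the vertical-axis amplitude, matching the alternative criterion mentioned above; a closed-form principal-axis solution would also work.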
- the locomotion data is optionally collected from one or more gyroscopic measurement units, measuring angular velocity.
- three gyroscopic measurement units are used, measuring angular velocity in three perpendicular directions, also termed x, y and z.
- the locomotion data is optionally collected from one or more gyroscopic measurement units, measuring angular acceleration.
- three gyroscopic measurement units are used, measuring angular acceleration in three perpendicular directions, also termed x, y and z.
- the locomotion data is optionally collected from one or more gyroscopic measurement units.
- a gyroscopic measurement unit can sense walking and various types of locomotion. For example, when walking straight forward, a subject may generate little rotation about the forward direction, some rotation about the vertical direction, and little rotation side-to-side.
- gyroscopic measurement units are used, measuring angular velocity, and/or angular motion, in three perpendicular directions, termed x, y and z.
- gyroscopic measurement units are used, measuring angular acceleration, and/or angular motion, in three perpendicular directions, termed x, y and z.
- the gyroscopic axes may not be aligned with the direction of motion, and so the signals from the gyroscopic measurement units, for example for "walking straight forward", may not be purely as described above.
- the signals from three perpendicular gyroscopic measurement units are modified by a mathematical rotation of the axes so as to align with the forward, up and down, and sideways directions.
- both linear accelerometers and gyroscopic measurement units are used to collect locomotion data.
- the sensor packages 102 described above with reference to Figures 1A, 1B and 1C include one or more accelerometers and/or one or more gyroscopes.
- Figure 1D is a simplified image of a reference coordinate system 140 including a set of three perpendicular axes 142 145 148, each of which may be used to measure linear acceleration and rotational velocity, according to an example embodiment of the invention.
- sensors are used to measure linear acceleration and rotational velocity in three perpendicular directions, providing complete movement information about a sensor attached to a moving subject (not shown).
- Figure 1E is a simplified image of a leg 160 of a subject with a FORWARD direction 162 indicated, and reference coordinate system 164 of a set of three perpendicular axes, according to Figure 1D, indicated next to the leg 160, according to an example embodiment of the invention.
- one of the axes 142 145 148 of the coordinate system is preferably aligned in the FORWARD direction 162.
- one of the axes 142 145 148 of the coordinate system is preferably aligned in an up-down direction (not shown) perpendicular to a floor, and/or a sole of the subject's shoe.
- Figure 1F is a simplified flowchart illustration of use of locomotion classification according to an example embodiment of the invention.
- the flowchart of Figure 1F illustrates a method which includes the following: acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion (122);
- inputting one or more features, derived from the motion signal, to a trained machine learning unit (126); and having the trained machine learning unit produce an output indicating a likelihood that the subject was moving using a specific type of locomotion during the period of time (128).
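The flow of Figure 1F can be sketched as a short pipeline. This is only an illustrative skeleton: the band-log-energy `extract_features` and the logistic `trained_model` used in the usage note are hypothetical stand-ins, not the networks described later in this document.

```python
import numpy as np

def extract_features(signal, n_bands=4):
    # Hypothetical stand-in: log energy in equal-width frequency bands.
    spectrum = np.abs(np.fft.rfft(np.asarray(signal, dtype=float))) ** 2
    energies = np.array([b.sum() for b in np.array_split(spectrum, n_bands)])
    return np.log(energies + 1e-12)

def classify_locomotion(signal, trained_model):
    # Figure 1F: acquire a motion signal (122), derive features from it,
    # feed a trained machine-learning unit (126), output a likelihood (128).
    return trained_model(extract_features(signal))
```

Here `trained_model` is any callable mapping a feature vector to a confidence, for example `lambda f: 1.0 / (1.0 + np.exp(-f.mean()))`.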
- FIG. 2A is a simplified block diagram of a locomotion classification system 200 constructed according to an example embodiment of the invention.
- the locomotion classification system 200 of Figure 2A accepts an input signal 202 from one or more sensors such as packaged in the sensor package 102 of Figures 1A, 1B and 1C.
- the input signal 202 is provided to a machine learning component, in this example an Artificial Neural Network (ANN) 204.
- the input signal 202 includes several input signals, for example input signals from several accelerometers, and/or several gyroscopic measurement units.
- the ANN 204 receives input, optionally from all the sensors packaged within the sensor packages 102 of Figures 1A, 1B and 1C.
- the ANN 204 produces, based on its input, a respective output 206.
- the output 206 is a confidence level that the input signal 202 is an input signal describing a specific type of locomotion, for example walking straight (WS).
- the output signal 206 indicates whether or not the input signal 202 corresponds to the specific type of locomotion, such as, for example, walking straight (WS).
- the output signal 206 indicates to which specific type of locomotion the input signal 202 corresponds, for example a value of "1" for a first specific locomotion class, and a value of "2" for a second specific locomotion class, and so on.
- FIG. 2B is a simplified block diagram of a locomotion classification system 210 constructed according to another example embodiment of the invention.
- the locomotion classification system 210 of Figure 2B accepts input signals 212 213 214 216 from one or more sensors, such as packaged in the sensor package 102 of Figures 1A, 1B and 1C.
- the input signals 212 213 214 216 are provided to a machine learning component, in this example an Artificial Neural Network (ANN) 224.
- the input signals 212 213 214 216 include several input signals, for example input signals from several accelerometers, and/or several gyroscopic measurement units.
- the ANN 224 produces, based on its input signals 212 213 214 216, outputs 226 227 228 230.
- Each one of the outputs 226 227 228 230 is optionally a confidence level that the input signals 212 213 214 216 are input signals describing a different specific type of locomotion.
- the output signal 226 may provide a confidence level that the input signals 212 213 214 216 correspond to walking straight (WS), and the output signal 227 may provide a confidence level that the input signals 212 213 214 216 correspond to a different specific type of locomotion.
- the outputs 226 227 228 230 are optionally collected by a decision unit 232.
- the decision unit 232 optionally outputs an output signal 234.
- the output signal 234 indicates whether or not the input signals 212 213 214 216 correspond to a specific type of locomotion, such as, for example, walking straight (WS).
- the output signal 234 indicates to which specific type of locomotion the input signals 212 213 214 216 correspond and/or whether the input signals 212 213 214 216 correspond to one of a set of specific types of locomotion.
- the decision unit 232 is an expert system.
- instead of the ANN 224 acting as a machine learning unit, a Support Vector Machine (SVM) is used.
- In machine learning, Support Vector Machines (SVMs, sometimes also termed support vector networks) are supervised learning models with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis.
- An example basic SVM takes a set of input data and predicts, for each given input, which of two possible classes forms the output, making it a non-probabilistic binary linear classifier.
- Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples into one category or the other.
- An SVM model is a representation of the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall on.
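As an illustration of the binary linear classifier just described, the sketch below trains a tiny linear SVM by sub-gradient descent on the regularized hinge loss. This is a generic textbook formulation under assumed hyper-parameters, not the embodiment's training procedure.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """X: (N, d) feature vectors; y: labels in {-1, +1}.
    Minimizes lam/2 * ||w||^2 + mean hinge loss by sub-gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1.0                 # samples inside the margin
        if viol.any():
            gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(y)
            gb = -y[viol].sum() / len(y)
        else:
            gw, gb = lam * w, 0.0
        w -= lr * gw
        b -= lr * gb
    return w, b

def svm_predict(X, w, b):
    # Which side of the separating hyperplane a new example falls on.
    return np.sign(X @ w + b)
```

Maximizing the gap between the two categories corresponds here to the `lam`-weighted norm penalty pushing the margin wide while the hinge term keeps the training examples on their correct sides.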
- wherever use of an ANN is described, use of an SVM should also be understood to apply. A person ordinarily skilled in the art is able to discern when an SVM may be used in place of an ANN.
- training the SVM is optionally done in a similar manner to training the ANN.
- similar feature extraction is performed.
- feature vectors that correspond to locomotion classes such as Walking Straight, Left Turn, Right Turn, and so on, are optionally produced by training one or more SVM instances, where the input is optionally similar to input vectors which are fed to the ANN.
- each instance of the SVM is trained to detect a specific locomotion class.
- a detection phase is optionally similar in an SVM embodiment to that in an ANN embodiment, in that after the training phase is done, data captured by the sensor is processed and fed to a set of SVM instances, with the data processing optionally including phases similar to data processing for the ANN - optional preprocessing, followed by a feature extraction phase.
- each SVM instance optionally classifies signal segments
- an expert system optionally combines results from the SVM classifications into a final result.
- FIG. 2C is a simplified block diagram of a locomotion classification system 250 constructed according to yet another example embodiment of the invention.
- the locomotion classification system 250 of Figure 2C accepts an input signal 252 from the sensors 102 of Figures 1A, 1B and 1C.
- the input signal 252 is provided to several machine learning components, in this example several Artificial Neural Networks (ANNs) 256 257 258 259.
- the input signal 252 includes several input signals, for example input signals from several accelerometers, and/or several gyroscopic measurement units.
- Each one of the ANNs 256 257 258 259 receives input, optionally from all the sensors packaged within the sensor packages 102 of Figures 1A, 1B and 1C.
- each one of the ANNs 256 257 258 259 receives input, optionally from only some of the sensors packaged within the sensor packages 102 of Figures 1A, 1B and 1C. For example, input from only two acceleration sensors - a front and back acceleration sensor, and a sideways acceleration sensor.
- the input signals are preprocessed, optionally within a preprocessing unit (not shown), transforming signals picked up by three perpendicular directions which do not necessarily correspond to front- and-back, sideways and up- and-down, to three perpendicular directions which do correspond to front-and-back, sideways and up-and-down.
- the transformation is optionally performed by detecting three perpendicular directions: a direction of an up-and-down motion, a direction of mostly forward motion, and a direction of little sideways motion, which can be achieved by a rotation of axes of the three perpendicular directions of the actual sensors.
- the preprocessing optionally includes a denoising of the input signals.
- the denoising may include a smoothing of the input signals, such as, by way of some non-limiting examples, low-pass filtering (LPF), and wavelet denoising.
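As one concrete (assumed) realization of the low-pass option, a windowed-sinc FIR filter can remove frequency content above a cutoff while preserving the slow components that characterize walking; wavelet denoising would be an alternative, as noted above. The tap count and cutoff below are illustrative.

```python
import numpy as np

def lowpass(signal, cutoff_hz, fs, numtaps=101):
    """Windowed-sinc FIR low-pass filter (one possible LPF denoiser).

    Removes frequency content above cutoff_hz, presumed unrelated to the
    walking motion; fs is the sensor sampling rate in Hz."""
    t = np.arange(numtaps) - (numtaps - 1) / 2
    h = np.sinc(2 * cutoff_hz / fs * t)      # ideal low-pass impulse response
    h *= np.hamming(numtaps)                 # taper to reduce ripple
    h /= h.sum()                             # unity gain at DC
    return np.convolve(signal, h, mode="same")
```

Because the kernel is symmetric and `mode="same"` centers it, the filtered signal stays time-aligned with the raw one, which matters when segment boundaries are later compared across channels.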
- Each one of the ANNs 256 257 258 259 produces, based on its input, a respective output 266 267 268 269 of a confidence level that the input signal 252 is an input signal describing a specific type of locomotion, for example walking straight (WS).
- the outputs 266 267 268 269 are optionally collected by a decision unit 262.
- the decision unit 262 optionally outputs an output signal 265.
- the output signal 265 indicates whether or not the input signal 252 corresponds to a specific type of locomotion, such as, for example, walking straight (WS).
- the output signal 265 indicates to which specific type of locomotion the input signal 252 corresponds and/or whether the input signal 252 corresponds to one of a set of specific types of locomotion.
- the decision unit 262 is an expert system, as will be further described below.
- Figure 3 is a graph 300 showing two input signals 310 312, which are examples of input signals similar to the input signal 252 of Figure 2C.
- the graph 300 of Figure 3 includes an x-axis 302 of time, and a y-axis 304 which is a qualitative indication of a signal's amplitude.
- the graph 300 depicts a first line 310 which corresponds to a signal from a gyroscopic sensor in a direction termed "x", which in this example is a sideways direction, and a second line 312 which corresponds to a signal from a linear acceleration sensor, which in this example is the sideways direction termed "y".
- a section 314 of the input signals marks one of a number of sections of the input signals which are known to represent a class of locomotion of a subject walking straight (WS). Section 314 is suitable for use in training one or more ANNs to recognize the WS locomotion.
- input signals such as depicted in section 314 of Figure 3 are used as a positive example to train a machine learning unit to identify the Walking Straight type of locomotion.
- Figure 4A is a graph 405 showing two input signals 410 412, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a right turn, according to an example embodiment of the invention.
- the graph 405 of Figure 4A includes an x-axis 407 of time, and a y-axis 409 which is a qualitative indication of a signal's amplitude.
- the graph 405 depicts a first line 410 which corresponds to a signal from the gyroscopic sensor in a direction termed "x", which in this example is a sideways direction, and a second line 412 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y".
- a section 414 of the input signals marks one of a number of sections of the input signals which are known to represent a right turn of a subject. Section 414 is suitable for use in training one or more ANNs to recognize the right turn.
- input signals such as depicted in section 414 of Figure 4A are used as a positive example to train a machine learning unit to identify a "Turning Right" type of locomotion.
- Figure 4B is a graph 415 showing two input signals 420 422, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a right turn, according to an example embodiment of the invention.
- the graph 415 of Figure 4B includes an x-axis 417 of time, and a y-axis 419 which is a qualitative indication of a signal's amplitude.
- the graph 415 depicts a first line 420 which corresponds to a signal from the gyroscopic angular motion sensor in a direction termed "x", which in this example is a sideways direction, and a second line 422 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y".
- a section 424 of the input signals marks one of a number of sections of the input signals which are known to represent a right turn of a subject. Section 424 is suitable for use in training one or more ANNs to recognize the right turn.
- input signals such as depicted in section 424 of Figure 4B are used as a positive example to train a machine learning unit to identify a "Turning Right" type of locomotion.
- Y-axis gyro readings may be used for determining whether a sensor is attached to the right leg of a subject or to the left leg of the subject.
- Such differentiation between a left-worn sensor and a right-worn sensor may exploit a certain kinematic property associated with a TO (toe-off) event; in humans, immediately after a leg is lifted off the ground, it tends to make a slight twist in the direction of the body's sagittal plane. Namely, the right leg twists to the left and the left leg twists to the right. This twist is barely noticeable with the naked eye, but can be discerned when using a sensor with a high enough sampling rate (e.g. in the range of tens or hundreds of samples per second).
- gyro Y readings which immediately follow a TO event are analyzed, to identify the twist.
- the identification is performed by observing the gyro Y readings starting at about 50 milliseconds (±50%) after the TO event, and lasting about 200 milliseconds (±50%).
- These timings may also be defined by the sampling rate of the sensor: the observation may start X samples after the TO event, where X is equal to 5% (±50%) of the sampling rate, and last Y samples, where Y is equal to 20% (±50%) of the sampling rate.
- Negative gyro Y readings indicate a left twist - meaning that the sensor is worn on the right leg. See Fig. 4E, which shows a graph 450 of a gyro Y signal 452 and a linear motion sensor signal 454 - both as a function of time. As exhibited in a section 456, the gyro Y value becomes noticeably negative over a time window of about 100 samples, which starts about 25 samples following a TO event (marked with a triangle 458). Conversely, positive gyro Y readings indicate a right twist - meaning that the sensor is worn on the left leg. See Fig. 4F, which shows a graph 460 of a gyro Y signal 462 and a linear motion sensor signal 464 - both as a function of time.
- the gyro Y value becomes noticeably positive over a time window of about 100 samples, which starts about 25 samples following a TO event (marked with a triangle 458).
- the time window after a TO event, in which the twist is identified, may be automatically and dynamically adapted as the subject walks. That is, the length of the time window may be adapted based on the walking speed of the subject and/or a step size of the subject, detected using one or more of the sensors.
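A minimal sketch of the side-detection rule above, with the observation window derived from the sampling rate as described; the ±50% tolerances are dropped for brevity, and the function and variable names are illustrative.

```python
import numpy as np

def worn_side(gyro_y, to_index, fs):
    """Guess which leg the sensor is worn on from the post-toe-off twist.

    Observes gyro Y starting 5% of the sampling rate in samples after the
    TO event (~50 ms) and lasting 20% of the sampling rate (~200 ms).
    Returns 'right' for a net negative twist, 'left' for a positive one."""
    start = to_index + round(0.05 * fs)   # ~50 ms after toe-off
    length = round(0.20 * fs)             # ~200 ms observation window
    window = gyro_y[start:start + length]
    return "right" if window.mean() < 0 else "left"
```

The mean over the window is one assumed way to decide the sign robustly; a peak or integral test over the same window would serve equally well.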
- Figure 4C is a graph 425 showing two input signals 430 432, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a left turn, according to an example embodiment of the invention.
- the graph 425 of Figure 4C includes an x-axis 427 of time, and a y-axis 429 which is a qualitative indication of a signal's amplitude.
- the graph 425 depicts a first line 430 which corresponds to the signal from a gyroscopic angular motion sensor in a direction termed "x", which in this example is a sideways direction, and a second line 432 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y".
- a section 434 of the input signals marks a section of the input signals which is known to represent a left turn of a subject, and is suitable for use in training one or more ANNs to recognize the left turn.
- input signals such as depicted in section 434 of Figure 4C are used as a positive example to train a machine learning unit to identify a "Turning Left" type of locomotion.
- Figure 4D is a graph 435 showing two input signals 440 442, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a left turn, according to an example embodiment of the invention.
- the graph 435 of Figure 4D includes an x-axis 437 of time, and a y-axis 439 which is a qualitative indication of a signal's amplitude.
- the graph 435 depicts a first line 440 which corresponds to a signal from the gyroscopic angular motion sensor in a direction termed "x", which in this example is a sideways direction, and a second line 442 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y".
- a section 444 of the input signals marks one of a number of sections of the input signals which are known to represent a left turn of a subject. Section 444 is suitable for use in training one or more ANNs to recognize the left turn.
- input signals such as depicted in section 444 of Figure 4D are used as a positive example to train a machine learning unit to identify a "Turning Left" type of locomotion.
- input signals such as depicted in sections 414 and 424 of Figures 4A and 4B are used as examples to train a machine learning unit to classify between a sensor attached to a right leg of a subject and a sensor attached to a left leg of a subject while the subject is turning right.
- input signals such as depicted in sections 434 and 444 of Figures 4C and 4D are used as examples to train a machine learning unit to classify between a sensor attached to a right leg of a subject and a sensor attached to a left leg of a subject while the subject is turning left.
- In some embodiments only one linear acceleration sensor is used. In some embodiments two linear acceleration sensors are used. In some embodiments three linear acceleration sensors are used. In some embodiments even more linear acceleration sensors are used.
- the linear acceleration sensors are mounted within a sensor package in perpendicular directions.
- a potential benefit of mounting the linear acceleration sensors in perpendicular directions is that of capturing any movement of the subject, in any direction.
- only one gyroscopic sensor is used. In some embodiments two gyroscopic sensors are used. In some embodiments three gyroscopic sensors are used. In some embodiments even more gyroscopic sensors are used.
- the gyroscopic sensors are mounted within a sensor package in perpendicular directions.
- a potential benefit of mounting the gyroscopic sensors in perpendicular directions is that of capturing any movement of the subject, in any direction.
- the number of sensors to be used is preferably as many as needed to capture the subject's movement and enable automatic locomotion classification.
- linear acceleration sensors and gyroscopic sensors are inexpensive, so using more than one sensor, and even three perpendicular sensors of each type, is not prohibitively expensive, and potentially adds to the accuracy of locomotion segmentation and classification.
- the Walking Segmentation Method (WSM)
- the WSM optionally uses one or more Artificial Neural Networks (ANNs) which have optionally been trained in a supervised training fashion, that is, the ANNs are first trained, and then used for classification.
- sensor data is optionally modeled as features.
- the features are a compact representation of the data.
- the features are results of signal processing the sensor data, optionally using wavelet-packet-decomposition (WPD) and spectral analysis.
- the WSM uses four ANNs.
- Three of the ANNs are fully-connected Feed Forward Networks (FFNs), and the additional ANN is a Probabilistic Neural Network (PNN). All four networks are trained and used in classifying input signals. The use of 4 networks potentially increases the classification accuracy.
- even one ANN or PNN may be trained and used.
- the ANN(s) may be implemented as software ANN(s) or as hardware ANN circuit(s) with appropriate surrounding support circuits.
- the walking segmentation method uses signal processing and machine learning techniques which are further described with reference to an example embodiment below.
- 3 linear accelerometer channels in directions termed x, y and z axes
- 3 gyroscopic sensor channels in directions termed x, y and z axes.
- input data comprises the x channel of the gyroscopic sensor and the y channel of the linear accelerometer.
- the gyro x measurement is optionally used to identify and optionally extract the WS segments of the y linear accelerometer signal.
- wherever the term signal is used in the present patent application and claims, it stands for either an analog signal or a digital signal. A person ordinarily skilled in the art is able to discern when an operation which is described as performed on a signal is inappropriate for use on either the analog signal or the digital signal, and should then understand that the operation is used on an appropriate form (analog/digital) of the signal, or that the signal is transformed into the appropriate form at a stage prior to performing the operation.
- Figure 3 depicts typical measurements of both x gyro and y accelerometer.
- the input data that is used by the ANNs of the example embodiment is that of the y accelerometer.
- the X gyro in the example embodiment is used for segmenting the input signal from the y accelerometer into one or more WS segments. Each y accelerometer segment is a potential input data entry in the learning data set.
- the analog y accelerometer segment is transformed into a vector of digital values, which is a time series of digital values of the analog input signal.
- the vector of digital values is a compact representation of a WS segment.
- a compact representation of an entry which can store most of the information in the input signal is desired.
- the representation is optionally invariant to possible transformations of input signal.
- the input signal is a y accelerometer signal recording.
- An example procedure for processing the input signal to extract its features is now described.
- An optional pre-processing stage where the input signal(s) may be improved, for example by de-noising, smoothing, expanding to a predetermined window size, and so on.
- the denoising is performed by using a filter which removes high frequencies from an input signal which are not associated with the walking.
- a feature extraction phase including one or more of:
- a vector of features is produced by optionally concatenating low frequency values of the DCT to the vector of normalized log energies.
- each Hamming window produces its own vector of features.
- Using several Hamming windows results in a matrix of features, where a row represents Hamming window features.
- the feature matrix rows are optionally concatenated as a single vector, which represents a final feature vector, corresponding to a WS segment.
- the above procedure is optionally repeated to produce several feature vectors, corresponding to several WS segments.
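The per-window feature construction above can be sketched as follows. The window length, hop, band count and DCT order are illustrative assumptions; in particular, the text does not state what the DCT is applied to, so here, MFCC-style, it is applied to the normalized log-energy vector.

```python
import numpy as np

def dct2(x):
    """Orthonormal DCT-II, written out so the sketch stays dependency-free."""
    n = len(x)
    k = np.arange(n)[:, None]
    basis = np.cos(np.pi * (2 * np.arange(n) + 1) * k / (2 * n))
    out = basis @ x * np.sqrt(2.0 / n)
    out[0] /= np.sqrt(2.0)
    return out

def segment_features(segment, win=64, hop=32, n_bands=8, n_dct=6):
    """Per-Hamming-window features: low-frequency DCT values concatenated
    with normalized log band energies; the window rows are then stacked
    into one final feature vector for the WS segment."""
    rows = []
    for start in range(0, len(segment) - win + 1, hop):
        frame = segment[start:start + win] * np.hamming(win)
        spectrum = np.abs(np.fft.rfft(frame)) ** 2
        energies = np.array([b.sum() for b in np.array_split(spectrum, n_bands)])
        log_e = np.log(energies + 1e-12)
        log_e = log_e - log_e.mean()          # normalize the log energies
        rows.append(np.concatenate([dct2(log_e)[:n_dct], log_e]))
    return np.concatenate(rows)               # final feature vector
```

Repeating this over many labeled segments yields the feature matrix whose rows, concatenated as above, form the training examples.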
- once a training set is built, it is optionally used to train the three ANNs and the one PNN of the example embodiment described above. After training, a classification system is ready to be fed with new measured data for classification using the trained ANNs.
- the walking segmentation method uses a signal processing technique which is further described below.
- This technique is aimed at discerning, from sensor readings, segments of WS. This may be beneficial, as one example, in knee osteoarthritis analysis, which usually requires the patient to walk straight.
- in knee osteoarthritis analysis, single limb support (SLS) and double limb support (DLS) may be used to assess the functional status of the patient. These measurements should usually be done while the patient is walking straight. Therefore, according to the technique, sensor readings are used for detecting turns, thereby classifying the walking segments between turns as WS segments - during which gait analysis can be made.
- the technique for WS segmentation may be based on observing kinematic data represented by an integrated Y angular velocity (GY).
- the integration serves as an estimation of the Y direction angular change.
- during walking straight, the sensor should ideally measure zero acceleration (AY) and zero angular velocity (GY) in the Y direction (which is perpendicular to the X (forward) and Z (up) directions).
- during turns, the sensor should measure non-zero Y values of acceleration and angular velocity.
- the sensor does not commonly measure zero values during WS segments. There are various reasons for the non-zero measurements, such as mechanical noise and/or miscalibration of the sensor. Still, it is possible to differentiate turn segments from WS segments by observing high amplitudes in GY and AY, whereas in WS segments these amplitudes are lower. Since the gyro measurements inherently produce drift over time, naive integration over gyro measurements may not suffice for amplitude classification. Gyro integration during a turn may provide an angle which may be misinterpreted as a noisy WS segment. Therefore, before integrating GY, we may use AY as a weight function for the GY values. During turns, AY measurements get higher values.
- VY is a smooth function. It slowly fluctuates around zero in WS segments and slowly fluctuates around a non-zero value during a period of time when the patient turns. Therefore, VY is an advantageous choice for a weight function.
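The weighted integration can be sketched as below. The `weight` array stands for VY (or AY) sampled on the same time base as GY, and the rectangle-rule sum is an assumed discretization of the definite integral.

```python
import numpy as np

def direction_feature(gy, weight, start, end, fs):
    """Weighted definite integral of the Y angular velocity (GY) over
    samples [start, end), e.g. from just after toe-off to heel-strike.

    gy, weight: equal-length 1-D arrays; fs: sampling rate in Hz."""
    idx = np.arange(start, end)
    return float(np.sum(weight[idx] * gy[idx]) / fs)   # rectangle rule
```

With a constant unit weight this reduces to the naive GY integral; a VY-like weight suppresses the drift-prone WS portions, per the reasoning above.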
- the integration may be performed, for example, in the time range between approximately 5 milliseconds (±50%) after a TO and until an HS (±50%), where TO is a toe-off event (indicating the point in time when the patient has fully lifted the foot off the ground) and HS is a heel-strike event (indicating the point in time when the patient's foot re-touches the ground).
- the constant factor may be different from 25, such as between 5-10, 10-15, 15-20, 20-25, 25-30, 30-35 or higher.
- these scalars will be converted to match the other sampling rate, as known in the art.
- the result of the definite integral is the direction feature (f), a number:
- the f feature may be thresholded, though the appropriate threshold depends on the integration range, or, in other words, on the length of the stride. Therefore, the threshold for each segment is a normalized factor which represents the percentage of the swing out of the entire step length.
- thresh = (HS2 - TO) / (HS2 - HS1) (2)
- HS1 represents the first heel-strike event and HS2 represents the next, consecutive, heel-strike (HS) event.
- the absolute value of f is compared to thresh. If it is lower than thresh, we classify the corresponding segment as WS. In case f is higher than thresh, we classify the corresponding segment as a turn. Negative and positive signs suggest left and right turns, respectively.
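The comparison just described amounts to a three-way rule; the string labels below follow the segment labels used elsewhere in this document (WS, TL, TR) and are otherwise illustrative.

```python
def classify_segment(f, thresh):
    # |f| below the stride-normalized threshold: walking straight;
    # otherwise the sign of f gives the turn direction (negative = left).
    if abs(f) < thresh:
        return "WS"
    return "TR" if f > 0 else "TL"
```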
- Fig. 6A shows a graph 635 of an integral 640 of GY and of that integral multiplied by a gyroscope reading 642 - both as a function of time.
- the behavior of graph 635 is indicative of a right turn.
- Fig. 6B shows a graph 645 of an integral 650 of GY and of that integral multiplied by a gyroscope reading 652 - both as a function of time.
- the behavior of graph 645 is indicative of a left turn.
- Fig. 6C shows a graph 655 of an integral 660 of GY and of that integral multiplied by a gyroscope reading 662 - both as a function of time.
- the behavior of graph 655 is indicative of WS.
- training the ANNs is done in a supervised fashion.
- a training data set is produced by extracting WS segments from one or more recordings.
- the WSM performs multiple classifications, that is, its training data set includes WS segments and various non-WS segments.
- the ANNs classify input signals, whereas a decision unit which includes an expert system accepts or denies the classification.
- the expert system optionally classifies non-WS segments when an input example is classified as one of the non-WS segments, and/or when classification confidence is poor. This is further described below.
- the WSM trains a number of FFN (Feed-Forward Network) networks, and saves for classification several of the FFNs which provided best locomotion classification performance during training.
- an additional trained PNN (Probabilistic Neural Network) is used to classify locomotion.
- by way of a non-limiting example, a classification system as depicted in Figure 2C is fed with extracted segments.
- Each of the segments is manually labeled according to its walking nature (i.e. WS, S, TL, TR, and C).
- the labeled segments are a training dataset.
- features are extracted and used to train one or more ANNs.
- the trained networks are optionally tested against a test set of signals.
- a test set is optionally the same as, or similar to, the training dataset though the test set is optionally not used in the training stage.
- the test set is optionally used to assess accuracy of neural network performance.
- Figure 5 is a simplified flowchart illustration of training of an automatic locomotion classification system which includes a machine learning component according to an example embodiment of the invention.
- the method depicted in Figure 5 includes:
- 4 neural networks are optionally trained for each given type of locomotion; three neural networks of type FFN (Feed-Forward Network) and one neural network of type PNN (Probabilistic Neural Network).
- the neural networks are assigned with classification weights according to successful classifications produced in the learning stage.
- the classification weights are optionally used to compute a total expectation confidence for each output classification.
- training was terminated when the following WS recognition values were obtained on the training set:
- the values correspond qualitatively, but not necessarily quantitatively, to a success rate of recognition of WS over the training data.
- the above trained networks are considered to have the following confidence values for detection of a Walking Straight type of locomotion:
- an overall confidence of detecting WS for an example input signal is taken to be the maximal value provided by the neural networks.
- an overall confidence of detecting WS for an example input signal is taken to be an average of the values provided by the neural networks.
- it is a decision unit, such as the decision units 232, 262 of Figures 2B and 2C, which selects the maximal value or calculates the average value.
- a confidence level lower than a specific value, by way of a non-limiting example lower than 0.5, optionally indicates a negative classification, and may be interpreted as (1 - confidence) for negating a specific locomotion classification.
- an input signal is optionally fed to networks which are each trained to identify a specific class of locomotion.
- each type of locomotion has 4 neural networks as described above. For each type of locomotion, an expected confidence is computed based on the outputs of all 4 networks specific to that locomotion. This produces a confidence level for each type of locomotion. A final classification is optionally based on which type of locomotion has the highest confidence.
- if the highest confidence corresponds, for example, to WS, and if the confidence is above a predefined threshold, the input signal is classified as WS. Otherwise, the input signal is classified as a non-WS segment.
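The decision stage (four network outputs per locomotion type, maximum or average aggregation, a confidence threshold, and reading a low confidence as negation) can be sketched as below. The specific outputs, weights, and the 0.5 threshold are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch of a decision unit (cf. units 232, 262 of Figures 2B/2C):
# each locomotion type has 4 network outputs in [0, 1]; the per-type
# confidence is their maximum or (optionally weighted) average, and the
# final label is the type with the highest confidence, if it clears a
# predefined threshold.
def type_confidence(outputs, weights=None, mode="max"):
    """Aggregate the 4 per-network outputs into one confidence value."""
    if mode == "max":
        return max(outputs)
    # weights could reflect each network's training success rate
    w = weights or [1.0] * len(outputs)
    return sum(o * x for o, x in zip(outputs, w)) / sum(w)

def negation_confidence(confidence):
    """A low confidence c may be read as (1 - c) confidence in 'not this type'."""
    return 1.0 - confidence

def decide(per_type_outputs, threshold=0.5, mode="max"):
    """Pick the highest-confidence type; below threshold, negate it."""
    conf = {t: type_confidence(o, mode=mode) for t, o in per_type_outputs.items()}
    best = max(conf, key=conf.get)
    return best if conf[best] >= threshold else "non-" + best
```

For example, outputs of (0.8, 0.9, 0.7, 0.85) for WS against low outputs for the other types would yield a WS classification; uniformly low outputs would yield a non-WS result.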
- recorded data of a subject's locomotion includes various walking segments, for example segments of walking straight (WS), standing (S), turning left or right (TL/TR), or climbing up stairs (C).
- locomotion classification is performed over WS segments only.
- the WS segments are optionally extracted from, or identified in, the recorded data.
- one or more WS segments are optionally marked by a human operator, and locomotion is then automatically analyzed in the WS segments.
- one or more WS segments are automatically identified as a first process, and a second process is used to automatically analyze the subject's locomotion within the WS segments.
- specific portions of walking are extracted for purposes other than locomotion classification.
- Example purposes include: tracking how straight a patient walks, which potentially identifies some medical problems; tracking straight-walking speed and variations in that speed; and tracking stride size and its variations, for instance to predict the risk of falling in an elderly population.
- walking patterns of a subject may be tracked, both inside and outside a laboratory setting. In some cases it is desired to track gait of a subject under natural conditions, and/or as the gait develops over a day, as the subject gets tired.
- the analysis itself is labor intensive and takes many hours of a trained professional's time.
- Some embodiments of the invention perform automatic locomotion classification, which potentially saves time and money.
- the time saving potentially includes both a professional's time, for analyzing, and a subject's time, for staying at the laboratory and/or sport facility.
- the money saving potentially includes the cost of employing a professional to perform the classification.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- compositions, methods or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
- a unit or “at least one unit” may include a plurality of units, including combinations thereof.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Physiology (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Psychiatry (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361760098P | 2013-02-03 | 2013-02-03 | |
US61/760,098 | 2013-02-03 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014118767A1 true WO2014118767A1 (en) | 2014-08-07 |
Family
ID=51261546
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2013/051004 WO2014118767A1 (en) | 2013-02-03 | 2013-12-05 | Classifying types of locomotion |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014118767A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080146968A1 (en) * | 2006-12-14 | 2008-06-19 | Masuo Hanawaka | Gait analysis system |
US20110054833A1 (en) * | 2009-09-02 | 2011-03-03 | Apple Inc. | Processing motion sensor data using accessible templates |
US20110054359A1 (en) * | 2009-02-20 | 2011-03-03 | The Regents of the University of Colorado , a body corporate | Footwear-based body weight monitor and postural allocation, physical activity classification, and energy expenditure calculator |
WO2011033799A1 (en) * | 2009-09-18 | 2011-03-24 | 株式会社日立製作所 | Management method of computer system, computer system, and program for same |
- 2013-12-05: WO PCT/IL2013/051004 patent WO2014118767A1/en, active, Application Filing
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016081946A1 (en) * | 2014-11-21 | 2016-05-26 | The Regents Of The University Of California | Fast behavior and abnormality detection |
US10503967B2 (en) | 2014-11-21 | 2019-12-10 | The Regents Of The University Of California | Fast behavior and abnormality detection |
US11215457B2 (en) | 2015-12-01 | 2022-01-04 | Amer Sports Digital Services Oy | Thematic map based route optimization |
US11210299B2 (en) | 2015-12-01 | 2021-12-28 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11144107B2 (en) | 2015-12-01 | 2021-10-12 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11137820B2 (en) | 2015-12-01 | 2021-10-05 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US10856776B2 (en) | 2015-12-21 | 2020-12-08 | Amer Sports Digital Services Oy | Activity intensity level determination |
US11587484B2 (en) | 2015-12-21 | 2023-02-21 | Suunto Oy | Method for controlling a display |
US11838990B2 (en) | 2015-12-21 | 2023-12-05 | Suunto Oy | Communicating sensor data in wireless communication systems |
US11607144B2 (en) | 2015-12-21 | 2023-03-21 | Suunto Oy | Sensor based context management |
US11541280B2 (en) | 2015-12-21 | 2023-01-03 | Suunto Oy | Apparatus and exercising device |
US11284807B2 (en) | 2015-12-21 | 2022-03-29 | Amer Sports Digital Services Oy | Engaging exercising devices with a mobile device |
EP3396319A4 (en) * | 2015-12-24 | 2018-12-26 | Fujitsu Limited | Information processing system, information processing program, and information processing method |
US11145272B2 (en) | 2016-10-17 | 2021-10-12 | Amer Sports Digital Services Oy | Embedded computing device |
US11703938B2 (en) | 2016-10-17 | 2023-07-18 | Suunto Oy | Embedded computing device |
WO2019173321A1 (en) * | 2018-03-06 | 2019-09-12 | Anki, Inc. | Robot transportation mode classification |
CN111351524A (en) * | 2018-12-21 | 2020-06-30 | 亚玛芬体育数字服务公司 | Sensor data management |
TWI729596B (en) * | 2018-12-21 | 2021-06-01 | 芬蘭商亞瑪芬體育數字服務公司 | Sensor data management |
IT201900014631A1 (en) * | 2019-08-12 | 2021-02-12 | Webbdone Srl | HANDLING METHOD FOR VIRTUAL REALITY |
CN111694829A (en) * | 2020-06-10 | 2020-09-22 | 北京卡路里信息技术有限公司 | Motion trail processing method and device and motion trail processing system |
CN111694829B (en) * | 2020-06-10 | 2023-08-15 | 北京卡路里信息技术有限公司 | Motion trail processing method and device and motion trail processing system |
FR3118235A1 (en) * | 2020-12-17 | 2022-06-24 | Orange | Movement mode recognition by motion sensor |
CN113303789B (en) * | 2021-04-30 | 2023-01-10 | 武汉齐物科技有限公司 | Gait event detection method and device based on acceleration |
CN113303789A (en) * | 2021-04-30 | 2021-08-27 | 武汉齐物科技有限公司 | Gait event detection method and device based on acceleration |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014118767A1 (en) | Classifying types of locomotion | |
US10918312B2 (en) | Wearable and connected gait analytics system | |
CN101394788B (en) | Gait analysis | |
Antwi-Afari et al. | Deep learning-based networks for automated recognition and classification of awkward working postures in construction using wearable insole sensor data | |
US9307932B2 (en) | System and method for 3D gait assessment | |
US20190150793A1 (en) | Method and System for Analyzing Human Gait | |
Mannini et al. | Walking speed estimation using foot-mounted inertial sensors: Comparing machine learning and strap-down integration methods | |
KR20160031246A (en) | Method and apparatus for gait task recognition | |
Santhiranayagam et al. | A machine learning approach to estimate minimum toe clearance using inertial measurement units | |
AU2010286471A1 (en) | Characterizing a physical capability by motion analysis | |
Khandelwal et al. | Identification of gait events using expert knowledge and continuous wavelet transform analysis | |
CN108958482B (en) | Similarity action recognition device and method based on convolutional neural network | |
Sama et al. | Analyzing human gait and posture by combining feature selection and kernel methods | |
Iervolino et al. | A wearable device for sport performance analysis and monitoring | |
EP3808268B1 (en) | System and method for shoulder proprioceptive analysis | |
WO2021028641A4 (en) | Method and system for analysing biomechanical activity and exposure to a biomechanical risk factor on a human subject in a context of physical activity | |
KR20210046121A (en) | Apparatus and method for identify patients with parkinson's disease and patients with podarthritis by performing neural network analysis by various detection information | |
US11497452B2 (en) | Predictive knee joint loading system | |
Kour et al. | Sensor technology with gait as a diagnostic tool for assessment of Parkinson’s disease: a survey | |
KR102128268B1 (en) | Method and system for walking ability prediction using foot characteristics information | |
KR102194313B1 (en) | Apparatus and method for identifying individuals by performing neural network analysis for various detection information | |
Ma et al. | Toward robust and platform-agnostic gait analysis | |
McCalmont et al. | eZiGait: toward an AI gait analysis and assistant system | |
JP2021030049A (en) | Sarcopenia evaluation method, sarcopenia evaluation device, and sarcopenia evaluation program | |
JP2021030050A (en) | Cognitive function evaluation method, cognitive function evaluation device, and cognitive function evaluation program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13874174 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13874174 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 12.02.2016) |
|