WO2010128423A1 - Method and system for analyzing motion data - Google Patents

Method and system for analyzing motion data

Info

Publication number
WO2010128423A1
Authority
WO
WIPO (PCT)
Prior art keywords
sound
motion data
movements
events
processor
Prior art date
2009-05-04
Application number
PCT/IB2010/051825
Other languages
English (en)
Inventor
Gerd Lanfermann
Juergen Te Vrugt
Stefan Winter
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2010-04-27
Publication date
2010-11-11
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Publication of WO2010128423A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1123 - Discriminating type of movement, e.g. walking or running
    • A61B 5/1036 - Measuring load distribution, e.g. podologic studies
    • A61B 5/1038 - Measuring plantar pressure during gait
    • A61B 7/00 - Instruments for auscultation
    • A61B 7/006 - Detecting skeletal, cartilage or muscle noise

Definitions

  • The invention relates to motion analysis, and in particular to a method and system for analyzing motion data.
  • The segmentation of motion data is an essential task in motion analysis.
  • The segmentation of motion data refers to the decomposition of a continuous data stream of motion data into segments that each relate to a basic or simple movement in an exercise (such as the movement of a limb, or an individual step when the motion data relates to walking) or to repetitions of an exercise.
  • Motion data can be matched against templates in order to identify particular movements within the motion data.
  • However, a particular movement or motion is always performed with slight variations, even if the motion is repetitive and performed by the same person.
  • In Fig. 1, the lines represent the repeated execution of the same drinking movement (i.e. lifting a cup to the mouth and replacing the cup on a table).
  • Furthermore, the person may have a disability (for example caused by a stroke), which means that they are unable to perform the movement in the "normal" way (as defined in stored templates).
  • In Fig. 2, the graph in the top left shows the movements of a healthy person lifting their arm repetitively, whereas the other three graphs show the movements recorded from three patients who have suffered strokes attempting the same movement.
  • Each graph shows three lines, with each line representing the respective angles of the upper arm, lower arm and torso of the person performing the exercise.
  • The person might also not execute the exercises properly (particularly if they are inexperienced with the exercise), which means, again, that the recorded movements will not match the stored templates.
  • In Fig. 3, the left part of the graph represents the movements of three people performing an exercise in which they (correctly) keep their trunk stable (in terms of the angle of their torso), whereas the right part of the graph shows the movements recorded during the same exercise when the trunks of the three people are unstable.
  • In addition, the obtained motion data can be affected if the sensors are not calibrated correctly or if they are not attached to the person in the right way. In each of these cases, the motion data can be completely different from the expected or desired form, and it is difficult, if not impossible, to use a pattern or template matching process to identify the proper segments.
  • According to a first aspect of the invention, there is provided a method of analyzing motion data representing movements of a user, comprising the steps of identifying sound events occurring during the movements, and dividing the motion data into a plurality of segments using the identified sound events.
  • According to a second aspect, there is provided a system for analyzing motion data representing movements of a user, comprising a processor for identifying sound events from measurements of sound occurring during the movements and for dividing the motion data into a plurality of segments using the identified sound events.
  • A third aspect of the invention relates to a computer program product comprising computer program code that, when executed on a suitable computer or processor, causes the computer or processor to perform the method defined above.
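  • To make the claimed steps concrete, the following is a minimal sketch in Python, assuming uniformly sampled motion data and a list of already-detected sound event timestamps; the function name and the synthetic signal are illustrative, not taken from the patent.

```python
import numpy as np

def segment_motion(motion, fs_motion, event_times):
    """Divide a motion stream into segments at the detected sound events."""
    cuts = sorted(int(round(t * fs_motion)) for t in event_times)
    bounds = [0] + cuts + [len(motion)]
    return [motion[a:b] for a, b in zip(bounds[:-1], bounds[1:]) if b > a]

# Example: a 10 s motion stream sampled at 50 Hz, with cup-on-table sounds
# detected at 2.5 s, 5.1 s and 7.6 s, is divided into four segments.
motion = np.sin(np.linspace(0, 6 * np.pi, 500))
segments = segment_motion(motion, fs_motion=50, event_times=[2.5, 5.1, 7.6])
print([len(s) for s in segments])  # [125, 130, 125, 120]
```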
  • Fig. 1 is a graph illustrating the variation in motion data across a repeated movement;
  • Fig. 2 is a set of graphs illustrating the variation in motion data between healthy and impaired people;
  • Fig. 3 is a graph illustrating the variation in motion data when an exercise is executed with a stable trunk and an unstable trunk;
  • Fig. 4 is an illustration of a system for segmenting motion data in accordance with the invention;
  • Fig. 5 is a more detailed block diagram of the system according to the invention;
  • Fig. 6 is a flow chart illustrating a method of segmenting motion data in accordance with the invention;
  • Fig. 7 is a graph illustrating the use of sound events to segment motion data from a healthy person; and
  • Fig. 8 is a graph illustrating the use of sound events to segment motion data from an impaired person.
  • The system 2, illustrated in Fig. 4, comprises a processing unit 4 that receives motion data relating to the movements, or characteristics of the movement, of a user 6.
  • The motion data can be captured using any conventional technique, such as by placing movement sensors 8a, 8b on a part or parts of the user 6 and connecting the movement sensors 8a, 8b wirelessly or through a wired connection to the processing unit 4.
  • Alternatively, the movement of the user 6 can be captured using one or more cameras (not shown) that track the movement of markers attached to different parts of the user 6.
  • The system 2 also comprises a sound sensor 10, such as a microphone, that collects sound information while the motion data is being collected.
  • In the illustrated system, the sound sensor 10 is connected to the processing unit 4 through a wired connection, but it could alternatively be connected to the processing unit 4 wirelessly.
  • The sound sensor 10 detects sounds that occur while the user 6 is exercising or performing some movement, and the processing unit 4 uses this sound information to assist in segmenting the collected motion data into particular basic or distinct movements.
  • Fig. 4 shows a table 12 and a cup 14; the user 6 is to repeatedly pick the cup 14 up and put it down on the table 12. Each time that the cup 14 contacts the table 12, there will be an audible sound that is detected by the sound sensor 10.
  • The processing unit 4 can analyze the output of the sound sensor 10 to identify the sound of the cup 14 contacting the table 12, and can use the detected sound (and specifically the time at which the sound event occurred) to assist in segmenting the collected motion data into separate repetitions of the exercise.
  • Fig. 5 shows the system 2 in accordance with the invention in more detail.
  • The processing unit 4 comprises a processor 16 that is connected to the movement sensors 8a, 8b and the sound sensor 10.
  • The processing unit 4 also comprises a database 18 that stores data for use by the processor 16 in segmenting the motion data and identifying particular basic or distinct movements.
  • For example, the database 18 can store templates for particular sound events (for example a cup being placed onto a table, or a footstep) and templates for particular basic movements (such as an arm being raised and lowered).
  • The processing unit 4 also comprises a memory 20, connected to the processor 16, that temporarily stores data collected from the sound sensor 10 and the movement sensors 8a, 8b.
  • The sound event templates can be stored in a format that is suitable for comparison to the output from the sound sensor 10.
  • For example, the sound sensor 10 can provide measurements of the detected sound amplitude, and the templates can comprise the corresponding amplitude measurements expected during that type of event.
  • Alternatively, the sound templates can be provided in terms of the "shape" of the sound event, as represented by a plot of the measured sound amplitude over time, and the processor 16 can use appropriate shape recognition techniques to compare a measured sound event with the stored templates in order to identify the measured sound.
  • Likewise, the movement templates can be stored in a format that is suitable for comparison to the output from the movement sensors 8a, 8b.
  • These templates can comprise amplitude or angular measurements over time, or an expected "shape" of the movement.
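  • As an illustration of how such templates might be organized in the database 18, the sketch below defines one plausible pair of containers; the field names are assumptions made for exposition, not the patent's storage format.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class SoundTemplate:
    name: str              # e.g. "cup-on-table" or "footstep"
    waveform: np.ndarray   # expected amplitude over time (the "shape" of the event)
    fs: float              # sampling rate of the template in Hz

@dataclass
class MovementTemplate:
    name: str                          # e.g. "arm-raise-and-lower"
    angles: np.ndarray                 # expected joint angles over time
    fs: float                          # sampling rate of the template in Hz
    event_index: Optional[int] = None  # sample at which a sound event is expected
```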
  • In use, the sound sensor 10 collects sound measurements, and the processor 16 identifies a particular sound event from the sound measurements using templates stored in the database 18; the processor 16 then uses a timestamp associated with a particular identified sound event or events to assist in segmenting the stream of motion data that is being collected by the movement sensors 8a, 8b on the user 6.
  • In segmenting the motion data, the processor 16 may use the movement templates stored in the database 18.
  • The sound event may identify a specific phase in the movement performed by the user 6, or, when multiple different sound events are identified, they can identify multiple phases in the movement.
  • For example, the movement templates can indicate a position, or a range of positions, in which a specific sound event (such as a cup contacting a table or a footstep) is expected to be found, and the processor 16 can use the timing of the identified sound event to match the motion data to the movement template.
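  • One plausible realization of this matching step is sketched below, under the assumption that a movement template (as in the earlier sketch) records the sample index at which the sound event is expected; the normalized-correlation score is an illustrative choice, not something the patent prescribes.

```python
import numpy as np

def align_to_template(motion, event_sample, template):
    """Place the template so that its expected event index coincides with the
    detected sound event, then score the overlap with a normalized correlation
    (a value near 1.0 indicates a close match)."""
    start = event_sample - template.event_index
    window = motion[max(start, 0):start + len(template.angles)]
    n = min(len(window), len(template.angles))
    if n == 0:
        return 0.0
    a = window[:n] - np.mean(window[:n])
    b = template.angles[:n] - np.mean(template.angles[:n])
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else 0.0
```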
  • Alternatively, the motion data can be segmented using techniques that do not use movement templates.
  • Fig. 6 illustrates a method of segmenting motion data in more detail.
  • In step 101, motion data is collected.
  • The motion data can be collected using the movement sensors 8a, 8b located on particular parts of the user's body, or can be collected using any other suitable arrangement, such as a camera-based tracking system.
  • The motion data is stored in the memory 20 together with timing information that indicates the time at which each motion data sample, or set of motion data samples, was measured.
  • The format of the motion data will depend on the sensors used to collect it.
  • For example, the movement sensors 8a, 8b can provide angular information for the relevant limbs of the user 6 over time, whereas a camera-based system can provide positional information for the user 6 over time.
  • In step 103, sound data is collected. The sound sensor 10 can be a microphone, such as a piezoelectric microphone, and can be placed close to, or on, a surface that makes contact with parts of the body of the user 6, or with objects, during the training or exercise; the sounds of these contacts can then be used to segment the motion data collected during the training or exercise.
  • For example, the sound sensor 10 can be placed on a table (as shown in Fig. 4) to detect a cup or other object being placed on the table, or on the ground to detect footsteps.
  • In step 105, continuing from step 103, the processor 16 retrieves one or more sound profiles or templates from the database 18.
  • The profiles or templates may comprise the shape of the relevant sound event or its typical amplitude.
  • In step 107, the sound profiles or templates retrieved from the database 18 in step 105 are compared to the sound data collected in step 103, in order to identify sound events in the sound data that correspond to the profiles or templates.
  • The processor 16 can use pattern recognition algorithms or speech processing algorithms to identify particular sound events using the profiles or templates.
  • A sound event can comprise a single signal or sample whose amplitude is above a threshold, or a more complex sound pattern. Complex sound patterns may be generated by, for example, an item (such as a shoe or a hand) rubbing over a certain surface.
  • The processor 16 can distinguish and identify complex sounds such as these by using frequency analysis (for example using Fourier transforms) and/or pattern recognition algorithms (for example discriminative time warping). If a sound event or events corresponding to the sound profiles or templates is identified in the sound data, a signal identifying the sound event is generated and the time at which the sound event occurred is determined from the sound data (step 109).
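  • The two detection strategies just described might look as follows; the threshold detector handles the simple single-peak case, while the spectral comparison is an illustrative stand-in for the frequency-analysis step (the patent names Fourier transforms and time-warping methods but does not fix the exact computation).

```python
import numpy as np

def threshold_events(sound, fs, threshold):
    """Simple case: return the times (in seconds) at which the absolute
    amplitude first rises above the threshold (one onset per event)."""
    above = np.abs(sound) > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return onsets / fs

def spectral_distance(frame, template_frame):
    """Complex case: compare the magnitude spectrum of a candidate frame
    with that of a stored template; smaller values mean closer matches."""
    f1 = np.abs(np.fft.rfft(frame))
    f2 = np.abs(np.fft.rfft(template_frame))
    f1 /= np.linalg.norm(f1) + 1e-12  # make the comparison level-independent
    f2 /= np.linalg.norm(f2) + 1e-12
    n = min(len(f1), len(f2))
    return float(np.linalg.norm(f1[:n] - f2[:n]))
```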
  • In step 111, the processor 16 combines the output of step 109 with the motion data from step 101.
  • In particular, the processor 16 applies the signals identifying the type and timing of the sound events to the stream of motion data, and uses the locations (in terms of time) of the sound events to determine or suggest how the motion data should be segmented into particular basic movements or exercises.
  • For example, the processor 16 can use motion data templates that include information on when a specific sound event is expected to be heard, or other information regarding the exercise or movement being performed by the user 6, to determine how the motion data should be segmented.
  • Subsequent analysis of the motion data segments can be performed using any conventional technique, including by comparing each segment to a movement template.
  • Figs. 7 and 8 illustrate the application of the invention to motion data obtained from a user 6 performing the lifting exercise shown in Fig. 4.
  • The exercise involves the user 6 picking up a cup 14 from a table 12 and placing it back down on the table 12.
  • The sound sensor 10 is placed on the table 12 to detect sound events corresponding to the cup 14 contacting the table 12.
  • The database 18 in the processing unit 4 includes a sound template for a cup 14 contacting the table 12; typically, this sound template will show a relatively sharp peak.
  • Fig. 7 shows the motion data (in terms of the angle of the upper arm of the user 6) obtained from a healthy user 6.
  • The motion data is marked with reference numeral 22 and represents the movement of the arm of the user 6.
  • As the user 6 is healthy, the motion data 22 is almost periodic, and there are only relatively small variations between the motion data 22 for each repetition of the exercise.
  • The detected sound events that correspond to the sound template for a cup 14 contacting the table 12 are superimposed on the motion data graph 22.
  • Each of these sound events is marked with reference numeral 24.
  • It can be seen that each of the sound events 24 occurs at approximately the same point in the exercise cycle, when the arm is at its lowest point.
  • Using the sound events 24, the motion data can be segmented into separate repetitions of the exercise. In this case, each segment starts at the time that the arm is at its highest point before the cup 14 contacts the table 12. The start of each segment is marked by a line 26.
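  • A hedged sketch of this boundary placement: for each detected cup-on-table sound, walk back through the arm-angle signal to the nearest preceding peak and start the segment there. The look-back window is an assumed tuning parameter, not a value given in the patent.

```python
import numpy as np

def segment_starts(angle, fs, event_times, max_lookback_s=2.0):
    """For each sound event, find the highest arm position reached shortly
    before the cup met the table (the lines 26 in Figs. 7 and 8)."""
    starts = []
    for t in event_times:
        end = min(int(round(t * fs)), len(angle))
        begin = max(end - int(max_lookback_s * fs), 0)
        if end > begin:
            starts.append(begin + int(np.argmax(angle[begin:end])))
    return starts
```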
  • Fig. 8 shows the motion data (again in terms of the angle of the upper arm of the user 6) obtained from a user 6 that has an impairment that prevents them from performing the particular exercise properly.
  • The motion data is marked with reference numeral 22 and represents the movement of the arm of the user 6. Due to the impairment, the user 6 is unable to perform a steady arm movement, and this is reflected in the motion data shown in the graph. In particular, there is no discernible periodic pattern, and there are significant variations between each repetition of the exercise.
  • The sound data collected from the sound sensor 10 is shown in the lower graph in Fig. 8. It can be seen that four sound events were detected (each corresponding to the sound template for a cup 14 contacting a table 12), and these have been superimposed on the motion data graph. Again, each sound event is marked by a line 24. Using this sound event information, it is possible to segment the motion data: each segment starts at the time that the arm is at its highest point before the cup 14 contacts the table 12, and is marked by a respective line 26.
  • From the sound event data, it can be seen that three of the repetitions (starting at lines 26a, 26c and 26d) are relatively similar (in terms of their duration and highest and lowest arm positions), and that the user 6 had some trouble in performing the repetition starting at line 26b.
  • Thus, sound event data can be used to assist in segmenting motion data.
  • A computer program may be stored and distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Abstract

The invention relates to a method of analyzing motion data representing the movements of a user. The method comprises the steps of: identifying sound events occurring during the movements; and dividing the motion data into a plurality of segments using the identified sound events.
PCT/IB2010/051825 2009-05-04 2010-04-27 Method and system for analyzing motion data WO2010128423A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP09159320.2 2009-05-04
EP09159320 2009-05-04

Publications (1)

Publication Number Publication Date
WO2010128423A1 (fr) 2010-11-11

Family

ID=42238697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/051825 WO2010128423A1 (fr) 2009-05-04 2010-04-27 Method and system for analyzing motion data

Country Status (1)

Country Link
WO (1) WO2010128423A1 (fr)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0062459A2 (fr) * 1981-04-03 1982-10-13 National Research Development Corporation Orthopaedic diagnostic apparatus
US4836218A (en) * 1984-04-09 1989-06-06 Arthrotek, Inc. Method and apparatus for the acoustic detection and analysis of joint disorders
US4836218B1 * 1984-04-09 1991-12-17 Arthrotek Inc
US4823807A (en) * 1988-02-11 1989-04-25 Board Of Regents, Univ. Of Texas System Device for non-invasive diagnosis and monitoring of articular and periarticular pathology
US20020107649A1 (en) * 2000-12-27 2002-08-08 Kiyoaki Takiguchi Gait detection system, gait detection apparatus, device, and gait detection method
WO2002067449A2 (fr) * 2001-02-20 2002-08-29 Ellis Michael D Modular personal network systems and methods
WO2003065891A2 (fr) * 2002-02-07 2003-08-14 Ecole Polytechnique Federale De Lausanne (Epfl) Body movement monitoring device
US20080146968A1 (en) * 2006-12-14 2008-06-19 Masuo Hanawaka Gait analysis system
DE102007038392A1 (de) * 2007-07-11 2009-01-15 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device and method for predicting a loss of control over a muscle

Similar Documents

Publication Publication Date Title
CN102245102B (zh) Method and device for the analysis of ballistocardiogram signals
US20200319710A1 (en) Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location in at least one of a virtual and augmented reality system
US10327670B2 (en) Systems, methods and devices for exercise and activity metric computation
CN106175778B (zh) Method for establishing a gait data set and gait analysis method
CN107157450B (zh) Method and system for quantitatively assessing the hand motor ability of Parkinson's disease patients
US20190160339A1 (en) System and apparatus for immersive and interactive machine-based strength training using virtual reality
US20230085511A1 (en) Method and system for heterogeneous event detection
EP3589199A1 (fr) Deep learning algorithms for heartbeat detection
Altaf et al. Acoustic gaits: Gait analysis with footstep sounds
EP3241492B1 (fr) Heart rate detection method and device
WO2014089238A1 (fr) Gait analysis system and method
US20180014756A1 (en) Method for Determining Types of Human Motor Activity and Device for Implementing Thereof
CN103717124A (zh) Device and method for obtaining and processing measurement readings of a living being
WO2017161734A1 (fr) Correction of human body movements via a television and a motion-sensing accessory, and system
WO2014153665A1 (fr) System and method for monitoring a subject
JP2019508123A (ja) Device and method for extracting heart rate information
Gupta et al. A wearable multisensor posture detection system
CN109063661A (zh) Gait analysis method and device
CN108242260B (zh) Fitness monitoring method and device
Malawski et al. Real-time action detection and analysis in fencing footwork
KR101875472B1 (ko) Finger-spelling recognition system and method
WO2010128423A1 (fr) Method and system for analyzing motion data
Marques et al. Exploring the Stationary Wavelet Transform detail coefficients for detection and identification of the S1 and S2 heart sounds
JP2007121217A (ja) Body motion analysis apparatus
CN114869274B (zh) Motion posture detection method and device, and dot-matrix flexible pad

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10717853

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 10717853

Country of ref document: EP

Kind code of ref document: A1