WO2009053901A2 - Method and system for selecting the viewing configuration of a rendered figure - Google Patents

Info

Publication number
WO2009053901A2
Authority
WO
WIPO (PCT)
Prior art keywords
motion data
data
viewing configuration
measurements
captured
Prior art date
Application number
PCT/IB2008/054325
Other languages
French (fr)
Other versions
WO2009053901A3 (en)
WO2009053901A8 (en)
Inventor
Juergen Te Vrugt
Richard D. Willmann
Gerd Lanfermann
Stefan Winter
Privender K. Saini
Annick A. A. Timmermans
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Priority to JP2010530605A (JP5575652B2)
Priority to US12/738,659 (US9418470B2)
Priority to CN2008801133925A (CN101836237B)
Priority to EP08841127.7A (EP2203896B1)
Publication of WO2009053901A2
Publication of WO2009053901A3
Publication of WO2009053901A8

Classifications

    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/10: Geometric effects
    • G06T 15/20: Perspective computation
    • G06T 7/20: Analysis of motion
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/30004: Biomedical image processing
    • G09B 19/0038: Sports (repetitive work cycles; sequence of movements)
    • A63B 24/0003: Analysing the course of a movement or motion sequences during an exercise or training sequence, e.g. swing for golf or tennis
    • A63B 24/0006: Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B 2024/0012: Comparing movements or motion sequences with a registered reference
    • A63B 2071/0636: 3D visualisation
    • A63B 2071/0647: Visualisation of executed movements
    • A63B 2220/34: Angular speed
    • A63B 2220/40: Acceleration
    • A63B 2220/803: Motion sensors
    • A63B 2220/806: Video cameras
    • A63B 2225/50: Wireless data transmission, e.g. by radio transmitters or telemetry
    • A63B 69/00: Training appliances or apparatus for special sports

Abstract

A method for determining a viewing configuration of a rendered figure of a rehab-patient or a sports trainee, aiming to deliver a suitable view of the rendered figure, the method comprising the steps of capturing motion data in 3D space of one or more body parts of the rehab-patient or the sports trainee and providing it to a computer; performing on the computer measurements of the deviation of the captured motion data from a reference list of motion data and/or measurements of the main motion direction; and, based on the results of these measurements, determining the viewing configuration.

Description

Method and system for selecting the viewing configuration of a rendered figure
FIELD OF THE INVENTION
The present invention relates to a method and a system for determining a viewing configuration of a rendered figure, for example of a person whose posture or movements are to be analyzed. In particular, the present invention relates to a method for automatically selecting one or more suitable viewing configurations of rendered images made of a person following rehabilitation therapy, a so-called rehab-patient, for giving visual feedback to a therapist or to the rehab-patient.
BACKGROUND OF THE INVENTION
Stroke is the most prominent cause of permanent disability in industrialized countries. One of the most prominent disabilities stroke survivors suffer from is half-sided paralysis of the upper limbs. Rehabilitation exercises have proven effective in regaining motor control, provided the training is intense and the patient is guided through the therapy.
Technical solutions for unsupervised home stroke rehabilitation require the use of appropriate feedback mechanisms to ensure proper exercising.
Motor skill acquisition in healthy persons as well as in stroke victims is facilitated by so-called 'augmented' or 'external' feedback. This type of feedback stands in contrast to internal feedback, where the moving person uses his or her own senses such as vision or proprioception.
External feedback can, for example, be given verbally by a coach. Such external feedback is known from sports teaching situations, e.g. when a person is taught how to make a golf stroke, and from physiotherapy, e.g. when stroke victims learn to reach out for an object again.
Another popular method, especially for motor skill acquisition in sports, is video analysis, as described for example in US 2003/0054327, where the learner and/or a supervisor views the learner after a prescribed motion has been executed. As video analysis captures only a single movement plane, inertial sensor systems are becoming increasingly popular.
Inertial sensors capture linear acceleration, angular velocity, and magnetic fields, and can be used for 3-dimensional motion capture of all limbs they are attached to. The motion data is displayed to the learner in the form of a rendered, animated figure, a so-called avatar. A coach provides cues to the learners to draw their attention to mistakes in the motion execution when reviewing the avatar motion with them.
An unsupervised home stroke rehabilitation system equipped with inertial sensors is able to track the movements of a patient in 3D space. The resulting data provides the basis for rendering an avatar that mimics the movements of the patient. Both the patient and the therapist can watch the avatar to analyze the patient's movements. Since the sensor system provides 3D data, the system enables the reviewer to watch the movements from different angles by rotating the avatar on the screen.
A problem experienced with the existing external feedback systems is that the viewing configuration, i.e. the rotation, tilt, zoom, and possibly other parameters, still has to be determined by the patient or the therapist or, in a sports teaching situation, by the trainee or the coach.
Current research prototypes of home stroke rehabilitation systems using inertial sensors show the recorded movements from a pre-selected angle. This viewpoint is pre-selected to allow for the 'best' evaluation of the recorded movement.
However, the 3-dimensional recorded data allows the viewer to view the movements from different angles. A known system allows the viewer to rotate the avatar or zoom into the figure while watching the recordings, as shown in Figs. 1 and 2. However, in this known system the viewer still needs to be aware of the best viewport, while in other systems the viewer is restricted to certain pre-defined viewing setups that can be selected on demand. Thus, the viewer has to select the optimal configuration for reviewing the recorded data.
Since patients usually lack the expertise and are additionally cognitively impaired, they are in general not able to select the optimal viewing configuration. The optimal viewing configuration assists patients in analyzing their own movements and recognizing wrong movement patterns.
For therapists, selecting the optimal viewing configuration might require watching the exercises repeatedly. Thus, starting with a viewing setup targeting the problematic elements of the movement would increase the efficiency of the therapist's review process.
This also shows the benefit of measuring 3D motion data compared to 2D recordings, as delivered for example by a video camera: the 3D data allows the viewer to 'walk around' the virtual representation of the patient and focus on the region of interest.
Existing systems only allow the user to choose the viewing direction manually, in steps of 90 degrees.
The present invention describes a method for determining a viewing configuration of a rendered figure of a rehab-patient, aiming to deliver a suitable view of the rendered figure.
OBJECT OF THE INVENTION
It is an object of the present invention to provide a method for determining a viewing configuration of a rendered figure of a rehab-patient in an automatic manner. It is also an object of the invention to provide a suitable system for performing such a method.
SUMMARY OF THE INVENTION
The above object is achieved by providing a method for determining a viewing configuration of a rendered figure of a rehab-patient or a sports trainee, aiming to deliver a suitable view of the rendered figure, the method comprising the steps of capturing motion data in 3D space of one or more body parts of the rehab-patient or the sports trainee and providing it to a computer; performing on the computer measurements of the deviation of the captured motion data from a reference list of motion data and/or measurements of the main motion direction; and, based on the results of these measurements, determining the viewing configuration.
It is extremely advantageous for a rehab-patient or a sports trainee that the feedback is given as an automatically chosen viewing configuration, without any need for input from the patient. Optionally, a region of interest is first selected based on the measurements, and the viewing configuration is thereafter automatically selected based on that region of interest.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be elucidated with reference to the drawings, which show a number of non-limiting embodiments, and in which:
Figs. 1 and 2 represent two viewing configurations of a rendered figure; Fig. 3 represents schematically a preferred method according to the present invention;
Figs. 4 to 6 represent examples of motion parameters set out as a function of time.
DETAILED DESCRIPTION OF EXAMPLES
In one embodiment, the invention consists of a computer system with a screen attached to it. In addition, three inertial sensors are wirelessly connected to the computer system. The three sensors are attached to the upper and lower arm of the patient's affected side and to the torso. The sensors deliver orientations of these body segments in 3D space; the data are stored on the computer system while the patient executes an exercise.
Based on that stored data, the viewer, i.e. either the patient or the therapist, reviews the data using a computer system equipped with a screen. The program on the computer system enables the viewer to review the recorded data by showing a rendered, animated figure, often called an avatar, on the screen. The figure is presented in a 3D virtual environment and can thus be viewed from different perspectives. To change this viewing configuration, e.g. a mouse or dedicated buttons next to the rendering space of the avatar might be used.
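Purely for illustration, such a viewing configuration, i.e. the rotation, tilt and zoom mentioned above, could be held in a small data structure like the following Python sketch; the class name, fields and default values are hypothetical and not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class ViewingConfiguration:
    """Hypothetical container for the view parameters named in the text:
    rotation and tilt of the virtual camera and a zoom factor."""
    rotation: float = 0.0  # horizontal angle around the avatar, in degrees
    tilt: float = 0.0      # camera elevation angle, in degrees
    zoom: float = 1.0      # > 1.0 zooms in on the region of interest

# Example: a frontal, slightly zoomed-in view of the avatar
frontal_view = ViewingConfiguration(rotation=0.0, tilt=0.0, zoom=1.5)
```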
The method depicted in Fig. 3 describes how to automatically adjust the viewing configuration during a review of 3D posture data. The viewing configuration for watching the avatar can be adjusted according to data recorded during the execution of an exercise: the data received from the sensors during exercise execution have been stored; to review the exercise, the viewer selects the appropriate data, and the system loads the data and starts processing it.
Next, motion parameters are derived from that data. Canonical motion parameters are angles in terms of the therapists' language, e.g. the shoulder flexion in the sagittal plane. The patient's data and some reference data, e.g. templates generated from a collection of previous recordings, are compared. The comparison includes the raw sensor data as received from the sensors and/or the motion parameters derived from the raw sensor data. The motion parameters of the reference data could either be computed on-line or be pre-computed and stored in the database of exercise references.
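As an illustration of such a canonical motion parameter, the sketch below derives a sagittal-plane shoulder flexion angle from the direction of the upper-arm segment. The torso-fixed coordinate convention (x forward, z up) is an assumption; the patent does not fix one.

```python
import numpy as np

def shoulder_flexion_sagittal(upper_arm_dir: np.ndarray) -> float:
    """Sketch of one canonical motion parameter: shoulder flexion in the
    sagittal plane, derived from the upper-arm segment direction.

    Assumes a torso-fixed unit vector in a frame with x pointing forward
    and z pointing up (a hypothetical convention). The flexion angle is
    measured between the arm's projection onto the sagittal (x-z) plane
    and the downward vertical.
    """
    forward = upper_arm_dir[0]   # component pointing forward
    down = -upper_arm_dir[2]     # component pointing down
    return float(np.degrees(np.arctan2(forward, down)))

# 0 degrees: arm hanging down; 90 degrees: arm raised horizontally forward
print(shoulder_flexion_sagittal(np.array([1.0, 0.0, 0.0])))  # -> 90.0
```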
Among the various options to compare the reference data and the exercise recordings, one will be outlined next. Using dynamic time-warping, the reference data and the recorded data are aligned with respect to their time-scales. Then the motion parameters that have been identified as relevant for the comparison are evaluated: in a time-window of a certain length (e.g. ½ second), a distance measure is computed for each motion parameter, e.g. the Euclidean distance between the data points of reference and measurement, or the distance between the mean values of these data points. Based on the comparison, the region of interest is identified. One possibility to obtain the region of interest is to identify the motion parameter whose comparison values indicate the largest deviation from the 'normal' given by the reference: at a certain point in time t, the motion parameter that contains the largest accumulated deviation in a certain time-window around t, compared to the other motion parameters, is selected. Figs. 4 to 6 represent some curves of measured data and a reference for movements of a right shoulder. The horizontal axis represents the time axis, while the vertical axis represents the angle in the sagittal plane in Fig. 4, the angle in the frontal plane in Fig. 5, and the angle in the horizontal plane in Fig. 6. These planes are well known to therapists. The most relevant motion parameter, in this case the flexion angle in a plane, changes over time; in this example it shifts from the sagittal plane to the frontal plane, to the horizontal plane, to the frontal plane, and back to the horizontal plane.
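A minimal sketch of this selection rule, assuming the measured and reference motion parameters have already been time-aligned (e.g. by dynamic time-warping) and sampled on a common grid; the function names, the 50 Hz example rate and the use of per-sample absolute differences are illustrative choices, not prescribed by the text.

```python
import numpy as np

def accumulated_deviation(measured: np.ndarray, reference: np.ndarray,
                          window: int) -> np.ndarray:
    """Deviation of each motion parameter from its reference, accumulated
    in a sliding time-window (e.g. window=25 samples at 50 Hz is roughly
    the half-second window mentioned in the text).

    measured, reference: shape (T, P): T time steps, P motion parameters
    (e.g. the sagittal, frontal and horizontal shoulder angles).
    Returns shape (T, P): deviation accumulated in a window centred on t.
    """
    pointwise = np.abs(measured - reference)  # distance per scalar parameter
    half = window // 2
    T, _ = pointwise.shape
    acc = np.empty_like(pointwise)
    for t in range(T):
        lo, hi = max(0, t - half), min(T, t + half + 1)
        acc[t] = pointwise[lo:hi].sum(axis=0)
    return acc

def region_of_interest(measured, reference, window=25):
    """For each instant t, the index of the motion parameter with the
    largest accumulated deviation around t, i.e. the selection rule above."""
    acc = accumulated_deviation(np.asarray(measured, dtype=float),
                                np.asarray(reference, dtype=float), window)
    return acc.argmax(axis=1)  # shape (T,): selected parameter per instant
```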
Once the most relevant motion parameter for a certain time-span is known, the viewing configuration is adapted to enable the 'best' view on this motion parameter, i.e. the rotation, tilt, zoom, etc. of the virtual 3D space are adjusted to deliver the optimal view onto the area of interest.
The determination of the optimal viewing configuration, given a certain selected motion parameter or motion parameter set, could be done using a predefined mapping table, as sketched below. To provide such a mapping table, professionals in the field, especially therapists, can give insight into the regions of interest when the execution of an exercise deviates from a certain reference. From that, the table translating the identified motion parameters into the optimal viewing configuration can be provided. In another embodiment, not represented here, the main movement direction, instead of deviations from a certain reference, determines the viewing configuration. For example, if the patient mainly moves her arm to the side, the avatar is shown in a frontal view, since that conveys most of the movement information. The identification of the main movement directions can be done by summing the velocity vectors and using the direction of the sum vector. Projections may be used as well.
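A predefined mapping table of the kind described above might look as follows, reusing the hypothetical ViewingConfiguration sketch from earlier; the parameter names and view angles are illustrative stand-ins for entries a therapist would provide.

```python
# Hypothetical mapping from the selected motion parameter (the region of
# interest) to a viewing configuration, as a therapist might specify it.
VIEW_TABLE = {
    "shoulder_angle_sagittal":   ViewingConfiguration(rotation=90.0, tilt=0.0,  zoom=1.5),  # side view
    "shoulder_angle_frontal":    ViewingConfiguration(rotation=0.0,  tilt=0.0,  zoom=1.5),  # frontal view
    "shoulder_angle_horizontal": ViewingConfiguration(rotation=0.0,  tilt=90.0, zoom=1.5),  # top-down view
}

def view_for(parameter: str) -> ViewingConfiguration:
    # Fall back to a default frontal view for parameters without an entry.
    return VIEW_TABLE.get(parameter, ViewingConfiguration())
```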
For velocity- or projection-based determination of the main direction, no reference pattern is needed at all, which makes a reference database obsolete.
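A sketch of this velocity-summation variant, which indeed needs no reference data; the finite-difference velocity estimate and the remark on cyclic movements are assumptions, since the text leaves these details open.

```python
import numpy as np

def main_movement_direction(positions: np.ndarray, dt: float) -> np.ndarray:
    """Main movement direction as the direction of the summed velocity
    vectors, as described in the text.

    positions: shape (T, 3), 3D positions of a tracked body part.
    dt: sampling interval in seconds.
    """
    velocities = np.diff(positions, axis=0) / dt  # finite-difference velocities
    total = velocities.sum(axis=0)                # sum vector of all velocities
    # Note: for cyclic exercises the signed sum may nearly cancel; summing
    # absolute components, np.abs(velocities).sum(axis=0), is one possible
    # workaround (an assumption; the text does not cover this case).
    n = np.linalg.norm(total)
    return total / n if n > 0 else total
```

If the dominant component of the resulting direction is lateral, a frontal view would be chosen, in line with the perpendicular-view rule of claim 15.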
In a further embodiment, the reference patterns are averaged recordings of certain exercises performed by the patient herself, which take her handicaps into account, possibly recorded with the therapist's support during a rehab session.
In a fourth embodiment, the identification of motion parameters is not limited to a single motion parameter but extends to a collection of motion parameters. For example, if four motion parameters in total were considered relevant, but the top candidate stands apart while the other three motion parameters are related, the three related motion parameters might be selected as the relevant ones.
In a fifth embodiment, the identification of the most relevant motion parameters is not limited to a small time-window but takes the context into account. For example, if the sagittal plane is mostly considered relevant and, within short fragments, the frontal plane is identified, the context could be used to stick to a viewing configuration that allows the optimal analysis of movements in the sagittal plane. This smooths the display of the avatar and prevents too many changes in the view of the figure.
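One way to realize such context-dependent stabilization is a majority vote over neighbouring time-windows, sketched below; the vote radius is an arbitrary illustrative choice, and other smoothing schemes (e.g. hysteresis) would serve equally well.

```python
from collections import Counter

def smooth_selection(choices: list, context: int = 8) -> list:
    """Stabilise the per-window parameter choice by majority vote over a
    neighbourhood of windows, so brief fragments (e.g. a short frontal-plane
    excursion during a mostly sagittal movement) do not flip the view."""
    smoothed = []
    for i in range(len(choices)):
        lo, hi = max(0, i - context), min(len(choices), i + context + 1)
        smoothed.append(Counter(choices[lo:hi]).most_common(1)[0][0])
    return smoothed
```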
In a sixth embodiment, multiple avatars might be used to deliver an optimal view on the patient's movements. Depending on the identified motion parameters, two or more avatars replaying the patient's movements and shown from different perspectives may provide more insight into the region of interest than a single avatar. The optimal number of avatars may depend on the identified motion parameters.

Claims

CLAIMS:
1. A method for determining a viewing configuration of a rendered figure of a rehab-patient or a sports trainee, aiming to deliver a suitable view of the rendered figure, characterized in that the method comprises the steps of: capturing motion data in a 3D space of one or more body parts of the rehab-patient or sports trainee and providing the data to a computer; performing on the computer measurements of deviation of the captured motion data from a reference list of motion data and/or measurements of main motion direction; and, based on the results of the above measurements, determining the viewing configuration.
2. A method according to claim 1, characterized in that the method is performed within time frames.
3. A method according to claim 2, characterized in that the determination of the viewing configuration within a certain time frame takes the said measurements during previous and further time frames into account, for the purpose of providing stable or complementary viewing configurations.
4. A method according to any of the preceding claims, characterized in that capturing motion data of the rehab-patient or a sports trainee in the 3D space is performed by providing inertial sensors wired or wirelessly connected to a computer.
5. A method according to any of the preceding claims, characterized in that capturing motion data of the rehab-patient or a sports trainee in the 3D space is performed by means of stereo camera systems.
6. A method according to any of the preceding claims, characterized in that motion parameters are derived from the captured motion data, which may allow a therapist or a coach to interpret the captured data more easily.
7. A method according to any of the preceding claims, characterized in that the measurement of the deviation of the captured motion data from a reference list of motion data comprises the steps of providing reference values, for example by capturing the same motion data from an ideally performed exercise or by calculating a mean value of previously captured motion data, and calculating a measurement of distance between the reference values and the captured values.
8. A method according to claim 7, characterized in that the measurement of distance is the Euclidean distance.
9. A method according to any of the preceding claims, characterized in that the main motion direction is determined using the summation of velocity vectors and using the direction of the sum vector.
10. A method according to claim 1 or 9, characterized in that the main motion direction is determined using projections of velocity vectors.
11. A method according to any of the preceding claims, characterized in that an area of interest is selected after, and based on, the mentioned measurements.
12. A method according to claim 11, characterized in that the selected area of interest directly determines the viewing configuration on the basis of a mapping table defining a relation between any area of interest and the parameters of a viewing configuration.
13. A method according to claim 11 or 12, characterized in that the selected area of interest comprises captured motion data which at least to a certain extent, in absolute or relative terms, deviate from the reference.
14. A method according to claim 11, characterized in that the selected area of interest shows the greatest main motion direction.
15. A method according to claim 14, characterized in that the viewing configuration is such that a view is provided perpendicular to the plane wherein the main motion direction is measured.
16. A method according to claim 1, characterized in that more than one viewing configuration is represented simultaneously.
17. A system for performing a method according to claim 1, characterized in that it comprises: means for capturing motion data in the 3D space of one or more body parts of the rehab-patient or a sports trainee, and a computer on which the captured data can be provided and processed; a program on the computer for performing measurements of deviation of the captured motion data from a reference list of motion data and/or measurements of main motion direction, and for determining the viewing configuration based on the results of the above measurements; and visualization means for displaying the rendered figure according to the selected viewing configuration.
18. A system according to claim 17, characterized in that the means for capturing motion data are inertial sensors wired or wirelessly connected to a computer.
19. A system according to claim 17, characterized in that the means for capturing motion data comprise a stereo camera system.
PCT/IB2008/054325 2007-10-26 2008-10-21 Method and system for selecting the viewing configuration of a rendered figure WO2009053901A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2010530605A JP5575652B2 (en) 2007-10-26 2008-10-21 Method and system for selecting display settings for rendered images
US12/738,659 US9418470B2 (en) 2007-10-26 2008-10-21 Method and system for selecting the viewing configuration of a rendered figure
CN2008801133925A CN101836237B (en) 2007-10-26 2008-10-21 Method and system for selecting the viewing configuration of a rendered figure
EP08841127.7A EP2203896B1 (en) 2007-10-26 2008-10-21 Method and system for selecting the viewing configuration of a rendered figure

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP07119415 2007-10-26
EP07119415.3 2007-10-26

Publications (3)

Publication Number Publication Date
WO2009053901A2 (en) 2009-04-30
WO2009053901A3 (en) 2009-06-11
WO2009053901A8 (en) 2010-05-06

Family

Family ID: 40456497

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2008/054325 WO2009053901A2 (en) 2007-10-26 2008-10-21 Method and system for selecting the viewing configuration of a rendered figure

Country Status (5)

Country Link
US (1) US9418470B2 (en)
EP (1) EP2203896B1 (en)
JP (1) JP5575652B2 (en)
CN (1) CN101836237B (en)
WO (1) WO2009053901A2 (en)

Cited By (2)

Publication number Priority date Publication date Assignee Title
WO2011026001A2 (en) 2009-08-28 2011-03-03 Allen Joseph Selner Characterizing a physical capability by motion analysis
JP2013533999A (en) * 2010-06-10 2013-08-29 コーニンクレッカ フィリップス エヌ ヴェ Method and apparatus for presenting options

Families Citing this family (10)

Publication number Priority date Publication date Assignee Title
US9011293B2 (en) 2011-01-26 2015-04-21 Flow-Motion Research And Development Ltd. Method and system for monitoring and feed-backing on execution of physical exercise routines
CN102198003B (en) * 2011-06-07 2014-08-13 嘉兴恒怡科技有限公司 Limb movement detection and evaluation network system and method
US20140002266A1 (en) * 2012-07-02 2014-01-02 David Hayner Methods and Apparatus for Muscle Memory Training
US9063578B2 (en) 2013-07-31 2015-06-23 Microsoft Technology Licensing, Llc Ergonomic physical interaction zone cursor mapping
CN105301771B (en) * 2014-06-06 2020-06-09 精工爱普生株式会社 Head-mounted display device, detection device, control method, and computer program
US9958946B2 (en) 2014-06-06 2018-05-01 Microsoft Technology Licensing, Llc Switching input rails without a release command in a natural user interface
US9478064B2 (en) * 2015-01-29 2016-10-25 Harris Corporation Automatic control of avatar perspective view in a graphical user interface
US11511156B2 (en) 2016-03-12 2022-11-29 Arie Shavit Training system and methods for designing, monitoring and providing feedback of training
EP3588471A1 (en) * 2018-06-28 2020-01-01 West & Berg Holding AB Real time sports motion training aid
US20220296963A1 (en) * 2021-03-17 2022-09-22 Tonal Systems, Inc. Form feedback

Family Cites Families (29)

Publication number Priority date Publication date Assignee Title
US4828257A (en) * 1986-05-20 1989-05-09 Powercise International Corporation Electronically controlled exercise system
US5625577A (en) * 1990-12-25 1997-04-29 Shukyohojin, Kongo Zen Sohonzan Shorinji Computer-implemented motion analysis method using dynamics
KR100276549B1 (en) 1995-12-07 2000-12-15 이리마지리 쇼우이치로 Image generation apparatus, image generation method, game machine using the method
JP2001504605A (en) * 1996-08-14 2001-04-03 ラティポフ,ヌラフメド,ヌリスラモビチ Method for tracking and displaying a user's location and orientation in space, method for presenting a virtual environment to a user, and systems for implementing these methods
JP3930935B2 (en) * 1997-02-28 2007-06-13 三菱電機株式会社 Image processing device
JPH10334270A (en) 1997-05-28 1998-12-18 Mitsubishi Electric Corp Operation recognition device and recorded medium recording operation recognition program
JP3145059B2 (en) * 1997-06-13 2001-03-12 株式会社ナムコ Information storage medium and image generation device
US6292575B1 (en) * 1998-07-20 2001-09-18 Lau Technologies Real-time facial recognition and verification system
JP3985369B2 (en) * 1998-08-21 2007-10-03 株式会社セガ Game screen display control method, character movement control method, game machine, and recording medium recording program
AU6503300A (en) 1999-07-30 2001-02-19 Douglas K. Fulcher Method and system for interactive motion training and a user interface therefor
US6734834B1 (en) * 2000-02-11 2004-05-11 Yoram Baram Closed-loop augmented reality apparatus
ES2467154T3 (en) 2000-05-18 2014-06-12 Commwell, Inc. Method for remote medical monitoring that incorporates video processing
US7117136B1 (en) * 2000-08-18 2006-10-03 Linden Research, Inc. Input and feedback system
US6679812B2 (en) * 2000-12-07 2004-01-20 Vert Inc. Momentum-free running exercise machine for both agonist and antagonist muscle groups using controllably variable bi-directional resistance
US20030054327A1 (en) 2001-09-20 2003-03-20 Evensen Mark H. Repetitive motion feedback system and method of practicing a repetitive motion
JP2003136465A (en) 2001-11-06 2003-05-14 Yaskawa Electric Corp Three-dimensional position and posture decision method of detection target object and visual sensor of robot
AU2003214910A1 (en) 2002-01-25 2003-10-13 Silicon Graphics, Inc. Three dimensional volumetric display input and output configurations
JP2004113411A (en) * 2002-09-26 2004-04-15 Fuji Photo Film Co Ltd Operation diagnostic device
KR100507780B1 (en) 2002-12-20 2005-08-17 한국전자통신연구원 Apparatus and method for high-speed marker-free motion capture
JP3735672B2 (en) * 2003-07-22 2006-01-18 国立大学法人岐阜大学 Rehabilitation training technology education equipment
JP2005198818A (en) * 2004-01-15 2005-07-28 Ritsumeikan Learning supporting system for bodily motion and learning supporting method
WO2006086504A2 (en) * 2005-02-09 2006-08-17 Alfred E. Mann Institute For Biomedical Engineering At The University Of Southern California Method and system for training adaptive control of limb movement
US7308349B2 (en) * 2005-06-06 2007-12-11 Delphi Technologies, Inc. Method of operation for a vision-based occupant classification system
US7478009B2 (en) * 2005-07-29 2009-01-13 Wake Forest University Health Sciences Apparatus and method for evaluating a hypertonic condition
US7869646B2 (en) * 2005-12-01 2011-01-11 Electronics And Telecommunications Research Institute Method for estimating three-dimensional position of human joint using sphere projecting technique
KR100727034B1 (en) 2005-12-09 2007-06-12 한국전자통신연구원 Method for representing and animating 2d humanoid character in 3d space
US20070135225A1 (en) * 2005-12-12 2007-06-14 Nieminen Heikki V Sport movement analyzer and training device
US20080018792A1 (en) * 2006-07-19 2008-01-24 Kiran Bhat Systems and Methods for Interactive Surround Visual Field
WO2008085553A1 (en) * 2006-08-25 2008-07-17 Eliezer Jacob Improved digital camera with non-uniform image resolution


Cited By (5)

Publication number Priority date Publication date Assignee Title
WO2011026001A2 (en) 2009-08-28 2011-03-03 Allen Joseph Selner Characterizing a physical capability by motion analysis
EP2470076A2 (en) * 2009-08-28 2012-07-04 Allen Joseph Selner Characterizing a physical capability by motion analysis
EP2470076A4 (en) * 2009-08-28 2012-10-10 Allen Joseph Selner Characterizing a physical capability by motion analysis
JP2013533999A (en) * 2010-06-10 2013-08-29 コーニンクレッカ フィリップス エヌ ヴェ Method and apparatus for presenting options
US9639151B2 (en) 2010-06-10 2017-05-02 Koninklijke Philips N.V. Method and apparatus for presenting an option

Also Published As

Publication number Publication date
JP5575652B2 (en) 2014-08-20
CN101836237B (en) 2012-10-10
US9418470B2 (en) 2016-08-16
CN101836237A (en) 2010-09-15
EP2203896A2 (en) 2010-07-07
WO2009053901A3 (en) 2009-06-11
US20100208945A1 (en) 2010-08-19
EP2203896B1 (en) 2019-04-24
JP2011503685A (en) 2011-01-27
WO2009053901A8 (en) 2010-05-06

Similar Documents

Publication Publication Date Title
US9418470B2 (en) Method and system for selecting the viewing configuration of a rendered figure
US11132533B2 (en) Systems and methods for creating target motion, capturing motion, analyzing motion, and improving motion
US10755466B2 (en) Method and apparatus for comparing two motions
KR100772497B1 (en) Golf clinic system and application method thereof
US11069144B2 (en) Systems and methods for augmented reality body movement guidance and measurement
US4891748A (en) System and method for teaching physical skills
US5184295A (en) System and method for teaching physical skills
CN103127691B (en) Video-generating device and method
US8094090B2 (en) Real-time self-visualization system
KR102125748B1 (en) Apparatus and method for motion guide using 4d avatar
KR20130098770A (en) Expanded 3d space based virtual sports simulation system
KR101399655B1 (en) Somatotype swing analysis apparatus, system and method for managing golf lesson thereof
KR102320960B1 (en) Personalized home training behavior guidance and correction system
CN109045651B (en) Golf swing correcting system
JP2005198818A (en) Learning supporting system for bodily motion and learning supporting method
US20220277506A1 (en) Motion-based online interactive platform
Chun et al. A sensor-aided self coaching model for uncocking improvement in golf swing
CN114022512A (en) Exercise assisting method, apparatus and medium
Jan et al. Augmented tai-chi chuan practice tool with pose evaluation
WO2021039857A1 (en) Video generation device
KR102438488B1 (en) 3d avatar creation apparatus and method based on 3d markerless motion capture
JP7419969B2 (en) Generation method, generation program, and information processing device
JP2021099666A (en) Method for generating learning model
KR102669062B1 (en) Exercise assistance service providing robot and exercise assistance service providing method
WO2023026529A1 (en) Information processing device, information processing method, and program

Legal Events

Code: Description
WWE: WIPO information: entry into national phase (ref document number: 200880113392.5; country: CN)
121: EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 08841127; country: EP; kind code: A2)
WWE: WIPO information: entry into national phase (ref document number: 2008841127; country: EP)
WWE: WIPO information: entry into national phase (ref document number: 2010530605; country: JP)
WWE: WIPO information: entry into national phase (ref document number: 12738659; country: US)
NENP: Non-entry into the national phase (ref country code: DE)