WO2013084031A1 - Motion tracking and comparison system - Google Patents


Info

Publication number
WO2013084031A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
center
user
gravity
tag
Prior art date
Application number
PCT/IB2011/055567
Other languages
English (en)
Inventor
Siu Fung KU
Original Assignee
Ku Siu Fung
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ku Siu Fung filed Critical Ku Siu Fung
Priority to PCT/IB2011/055567 priority Critical patent/WO2013084031A1/fr
Publication of WO2013084031A1 publication Critical patent/WO2013084031A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/1036 Measuring load distribution, e.g. podologic studies
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1116 Determining posture transitions
    • A61B 5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/1127 Measuring movement of the entire body or parts thereof using a particular sensing technique using markers
    • A61B 5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4005 Detecting, measuring or recording for evaluating the nervous system for evaluating the sensory system
    • A61B 5/4023 Evaluating sense of balance
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Definitions

  • This invention relates to a system for teaching motion or posture, and in particular a system for motion and posture simulation.
  • Conventional motion simulators such as golf simulators track the movement of the subject using a camera. The movement of the user is captured and played back to the user.
  • a library of professional golfers is provided and can be played side-by-side with the image of a learner hitting the ball, allowing the learner to visualize the "accuracy" of his putt compared to a professional that he wants to learn from.
  • the user may only see the movement itself but is unable to understand the concepts behind the movement, hindering the learning of the subject. An improved motion tracking system is hence desired.
  • the present invention in one aspect, is a system comprising a first system for determining the center of gravity of a subject, and a second system for determining the center of balance of the subject.
  • the system also comprises a presentation module to display the posture information, the center of gravity and the center of balance to a user.
  • the system comprises a matrix of pressure sensors disposed under the subject in the form of a mat for measuring the weight information of the subject.
  • the system comprises reflective tags that are attached to pre-determined positions of said subject.
  • a plurality of cameras are disposed at different locations for capturing the image of the subject from different angles.
  • the 3D coordinates of the tags can be computed based on the pixel information captured by the cameras and are sent to the relevant modules of the system for further processing and storage.
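The patent does not fix a particular reconstruction algorithm. As a minimal sketch, the computation can be shown for a simplified, rectified two-camera setup, where a tag visible in both images is located from its pixel disparity; the function name and camera model here are illustrative assumptions, not part of the disclosure:

```python
def tag_position(uv_left, uv_right, f, baseline):
    """Recover a tag's 3D coordinates from a rectified stereo camera pair.

    uv_left, uv_right: pixel coordinates of the tag relative to each camera's
    principal point; f: focal length in pixels; baseline: camera separation.
    Assumes both cameras face the same direction (a simplification of the
    multi-camera arrangement described in the patent).
    """
    disparity = uv_left[0] - uv_right[0]   # horizontal pixel shift between views
    z = f * baseline / disparity           # depth from similar triangles
    x = uv_left[0] * z / f
    y = uv_left[1] * z / f
    return (x, y, z)
```

With more than two cameras, each extra view adds redundant constraints that can be combined by least squares, improving robustness when a tag is occluded in one view.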
  • the presentation module comprises a display screen for a user to select at least one motion sequence to display.
  • the motion sequence may be trajectories of the subject's body posture, center of gravity and said center of balance.
  • the display screen can be further partitioned into quadrants; and a time indicator or signature can also be displayed.
  • the motion sequence of a reference template can also be displayed for comparison with that of the subject.
  • a method of analyzing movement of a subject is disclosed. The method comprises the steps of collecting data on the body movement of the subject; determining at least one balancing parameter of the subject; and presenting that parameter to a user.
  • the center of gravity and the center of balance provide very valuable insight into the quality of a motion or posture of the subject.
  • performance fields such as dancing, sports, or medical fields such as chiropractic applications
  • the relative position or the speed of movement of the center of gravity and the center of balance are essential to determine the effectiveness or visual effect of a move, or the condition of the spine, or any stress on some parts of the subject etc.
  • the user such as a dance teacher or a chiropractor, can more conveniently determine problems associated with the subject and devise potential solutions.
  • with a time signature output along with the centers, the timing and rhythm of the motion can be determined on a clearer scale than simply referring to the time of the video, for example.
  • the optional three dimensional coordinate system allows a user to more conveniently refer to a location of the subject in great detail, thus identifying any subtle differences that may have a big effect overall. For example, in a dance move, the arm of the subject may be a few inches lower than intended, and that alone may result in a big difference in the posture, the center of gravity and the stability of the motion.
  • FIG. 1 is a block diagram of a system according to an implementation of the present invention.
  • Fig. 2 is an illustration of sensors and tags arrangement, according to an embodiment of the present invention.
  • Fig. 3 is a top view diagram of a subject standing on a weight mat, according to an embodiment of the present invention.
  • Fig. 4 is a conceptual diagram showing the toe, ball, and heel locations of the subject's feet when the subject stands on the weight mat.
  • Fig. 5a and Fig. 5b are the side-view and the top-view illustrative diagrams showing the placement of tags around the torso of the subject for COB measurement.
  • the tags are arranged in a horizontal plane when the subject stands upright.
  • the heights of the tags relative to the feet are approximately the height of the COB of the subject.
  • Fig. 6 is a side view illustrative diagram whereby the arm is represented by an upper arm segment and a lower arm segment. Tags are disposed around both arm segments.
  • Fig. 7 is a human-stick representation of a human body, according to an embodiment of the present invention.
  • Fig. 8 illustrates the transition of a human stick model from a first time frame to a second time frame, according to an embodiment of the present invention.
  • Fig. 9 is the top view of an exemplary dance studio, showing the placements of cameras and weight mats, according to one implementation of the present invention.
  • Fig. 10 is a screen layout for the presentation module, according to one implementation of the present invention.
  • “comprising” means including the following elements but not excluding others.
  • “Motion sequence”, “motion trajectory”, and “body movement” all refer to the time series data of human body movement captured by a sensor system, and they are used interchangeably. It should be noted that the disclosure below also covers the cases where the subject is not moving, or is holding a static posture. In this case, the time series trajectory is a constant or a slowly varying curve over time.
  • determining means an estimation based on the user's own understanding.
  • the present invention teaches how to use various approaches to estimate the COB and COG.
  • the exact positioning of the sensors vary from user to user and the appended claims are not limited to such exact positioning.
  • the COG is closely related to the center of mass of a human body, and is a point in the 3D coordinate where all the body weight is concentrated.
  • the typical COG for a male adult is estimated to be a point at around 56% of his height and that of a female subject is around 54%, although body weight and shape also affect the COG.
  • the COG position will change when the posture of the subject changes (e.g. extending his arm or leg outward or carrying an external load). It is possible that the COG position is a point outside of his body area; for example: when he bends forward with both of his arms stretched downward, the COG is at a point outside and below his belly.
  • the COB is a point inside the torso of a human body.
  • the head rests directly on the spine, which is supported by the pelvis, which in turn is supported by the ankles.
  • the muscles need not exert forces to align the bones in order to maintain balance. This means that minimum energy is needed and muscles are not strained.
  • when the body leans in one direction, the COG shifts in that direction and muscle forces must be applied for the body to maintain balance.
  • the left leg needs to exert force upward to counter the left-shift of the COG in order to maintain balance.
  • the COG and COB are inter-related.
  • the COB is generally estimated to be located at around 6 inches above the navel of a male subject, and at around 4 inches above the navel of a female subject, although the height of the subject would affect this general estimate, in which case an estimation may be made based on the proportional length of the torso. Unlike the COG, its position relative to the torso does not change even when the subject bends his body in any direction or carries an external load.
  • a system 20 comprising a sensor data processing module 21 that is connected to an apparatus that comprises at least one sensor (not shown).
  • the sensor captures the positions and/or motions of the subject 24 as well as other related parameters.
  • the sensor data processing module 21 performs data conditioning operations on the raw sensor data and sends the processed data to a first system 22 for determining the center of gravity (COG) of a subject 24, a second system 26 for determining the center of balance (COB) of the subject 24, and a presentation module 28.
  • the presentation module 28 receives data from modules 22 and 26 and allows a user (not shown) to selectively display them on a monitor screen (not shown). The user may be the subject himself or may be another person.
  • the system 20 also comprises a database system 27 that is inter-connected to the aforementioned modules. It further comprises a plurality of libraries whose purposes are further explained in the following paragraph.
  • system 20 captures and analyses the movements of a subject 24 when he is performing certain exercises. Examples of such exercises include, but are not limited to, dancing, playing golf, doing therapeutic posture exercises such as yoga, or practicing martial arts. At least one sensor is placed within the area where the subject is performing the exercise to capture his body movement or posture. A detailed description of the sensor apparatus will be given in the following section.
  • the raw sensor data is fed to the sensor data processing module 21 for data cleaning, conditioning and other operations.
  • This module may retrieve a human body model from a library in database 27 to facilitate its computation.
  • the processed motion data is sent to modules 22, 26 and 28, as well as to the database system 27.
  • the presentation module 28 displays the body parameters and data of the subject 24 on a screen.
  • a user can interactively select a portion of the data for display.
  • the presentation module 28 also allows the user to display not only the data of the subject, but also the data from a reference template (i.e. a teacher's data) for comparison.
  • the reference template can be retrieved from a library stored in the database system 27.
  • the sensor apparatus captures the human posture data and motion activities and sends them to system 20.
  • a plurality of sensors 36 are placed at the peripheral area where the subject 24 is performing.
  • a set of tags 37 are optionally either sewn onto the clothes that the subject is wearing, or otherwise attached to various parts of the subject's body through any conventional means as shown in that Figure.
  • the sensors are digital cameras and the tags are markers coated with a reflective material. The cameras are mounted at pre-determined locations whose 3D coordinates are known. Each marker, if not blocked, will appear as one or more pixels on the 2D image plane of the camera.
  • a calibration process can be performed to determine the correspondence between a given 3D position of a marker and the 2D pixel position of each camera.
  • the markers' 3D coordinates can be computed using computer vision techniques operating on the 2D pixel coordinates of the cameras.
  • three cameras are used in a room. One is positioned at the front ceiling of the room, the other at the side ceiling while the third is at the middle of the ceiling.
  • additional cameras can also be installed at other areas of the room to improve robustness and precision.
  • cameras can be placed on both sides of the room, or a back camera can be added to improve the chance of the tags being directly visible from the camera, therefore improving the accuracy of coordinate determination.
  • the markers also emit lights so that the camera can easily identify the markers.
  • the markers comprise one or more light emitting diodes (LEDs).
  • the sensors are infrared emitters while the tags are infrared receivers.
  • the former further emit coded signals that encode the space.
  • the tags are attached to the clothes that the subject wears and are programmed to decode the signals from the infrared emitters to determine each tag's position.
  • the latter can be transmitted to the sensor data processing module 21 via wireless means.
  • Other sensor technologies such as inertial sensors that are capable of capturing the full six degrees of freedom body motion, or active or passive radio frequency identification (RFID) technology can also be used in this invention.
  • a weight mat 30 is disposed on the floor.
  • the weight mat 30 comprises a two-dimensional array or matrix of pressure sensors 32 to capture the body weight of the subject 24.
  • Each pressure sensor 32 in the weight mat 30 senses a pressure or weight applied on the particular pressure sensor 32 to produce a two-dimensional pressure or weight mapping of the subject 24.
  • the distance between adjacent pressure sensors 32 of the weight mat 30 is less than 5cm.
  • the resolution is chosen to effectively distinguish the pressure sensed at the toe 24a, ball 24b or heel 24c of the subject 24 as shown in Fig. 4.
  • the distance between adjacent pressure sensors 32 is less than 1cm in order to more clearly show the change of weight distribution from one part to another.
  • the pressure sensors 32 are disposed near the two feet of the subject 24 instead of being provided in the weight mat 30.
  • pressure sensors 32 are disposed near or at the bottom of the shoes that the subject 24 wears. Separate sensors are disposed to measure the toe, ball and heel pressures. The readings are sent to the sensor data processing module 21 via wireless means.
  • An example of such a device is a piezoelectric sensor coupled to a wireless module.
  • the sensor data processing module 21 processes the raw data that it receives from the plurality of sensors to a format that can be used by the COG module 22 and the COB module 26.
  • the processed data is also recorded in the database system 27 for later retrieval and display.
  • the pre-processing steps may be different for different sensor technologies. If the cameras-and-reflective-markers technology is adopted, then it is possible that a marker may not be visible to a camera because its line-of-sight is obstructed by an obstacle. Possible obstacles include the clothes that the subject wears, or a limb of the subject himself. In this case, the missing data needs to be dealt with. In one embodiment, an extrapolation technique based on past data points can be used to recover the missing data. With multiple pixel data from different cameras on the same marker, the 3D coordinate of the marker can be computed based on computer vision and regression analysis techniques.
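A minimal sketch of the extrapolation step, assuming occluded samples are flagged as None and each coordinate axis is extrapolated linearly from the two preceding samples (the patent leaves the exact recovery technique open):

```python
def fill_missing(track):
    """Fill occluded (None) samples in a marker track by linear extrapolation
    from the two preceding samples, applied per coordinate axis."""
    out = list(track)
    for t, sample in enumerate(out):
        if sample is None and t >= 2 and out[t - 1] is not None and out[t - 2] is not None:
            # Continue the last observed velocity: p[t] = p[t-1] + (p[t-1] - p[t-2])
            out[t] = tuple(2 * a - b for a, b in zip(out[t - 1], out[t - 2]))
    return out
```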
  • Another function that this module can optionally perform is to eliminate positional offset.
  • the system is capable of comparing two or more body motion sequences simultaneously. As the subject (or the reference template used for comparison) is unlikely to start his motion sequence from exactly the same starting 3D coordinates, a user may want to eliminate the positional offsets between different motion sequences when viewing them in overlaying mode on the display screen (discussed in detail later).
  • the differences of the motion coordinate against the initial 3D coordinate are computed.
  • the difference of motion coordinate between the current sample and the previous sample is computed and recorded.
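The two offset-elimination variants described above can be sketched as follows (function names are illustrative):

```python
def remove_initial_offset(seq):
    """Subtract the first sample from every sample, so that all motion
    sequences start from the origin and can be overlaid on screen."""
    x0 = seq[0]
    return [tuple(a - b for a, b in zip(p, x0)) for p in seq]

def frame_deltas(seq):
    """Alternative: record the difference between each sample and the
    previous sample instead of absolute coordinates."""
    return [tuple(a - b for a, b in zip(p, q)) for q, p in zip(seq, seq[1:])]
```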
  • the raw sensor data also needs to be normalized before comparison is made.
  • each of the motion sequences needs to be normalized by these parameters before comparison.
  • the size of the clothes can be used as a factor to normalize the sensor data.
  • the size of the shoes can also be used to normalize the toe, ball and heel positional data.
  • if the tags or sensors are directly attached to the subject's body, then physical measurements of the subject's height and width may be needed, and they are used for data normalization.
  • the normalization factors, as well as the location of the tags are stored in a library of the database system 27.
  • the size of the clothes and the locations of the tags are stored. When a user specifies the sizes of the clothes and shoe the subject is to wear, this information is recalled and used by the sensor data processing module 21 to perform data normalization.
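As a sketch, normalization by a single body-size factor could look like the following; the patent mentions clothes size, shoe size, and physical height/width measurements as possible factors, and the choice of a single scalar divisor here is an illustrative simplification:

```python
def normalize_by_size(seq, size_factor):
    """Scale every coordinate by a body-size factor (e.g. the subject's
    height) so that sequences from differently sized subjects and reference
    templates are directly comparable."""
    return [tuple(c / size_factor for c in p) for p in seq]
```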
  • the processed data can then be used by the modules 22 and 26 to compute the COB and COG.
  • since the COB is a point inside the torso of the body, its position can be computed based on tag coordinates surrounding the torso.
  • at least one tag 37 is preferably attached to the torso of the subject 24 at the same height where the COB 42 is located.
  • the tags are attached to the clothes that the subject wears. The clothes should preferably be as tight to the body as possible to avoid any slacks. Any sensor technology mentioned above can be used for this purpose.
  • An embodiment of computing the COB is shown in Fig. 5b. Here a plurality of tags 37 is attached to the subject 24. Their 3D coordinates are sent to the COB module 26 after being processed by the sensor data processing module 21.
  • For each position Pi of a tag 37, the COB is estimated as a vector offset di from Pi. Hence for a plurality of tags as shown in Fig. 5b, there is a plurality of COB estimates. In one embodiment, the final COB value is the average of these individual estimates.
  • the vector offset di, the height where the sensors are placed on the body, and the number of sensors attached, are all predetermined and sent to the COB module 26 prior to the COB computation. In one embodiment, these parameters are retrieved from a library of the database system 27.
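The COB estimate described above (one estimate per tag, then averaged) can be sketched as:

```python
def estimate_cob(tag_positions, offsets):
    """Each tag position Pi plus its predetermined vector offset di yields one
    COB estimate; the final COB is the average of the individual estimates."""
    estimates = [tuple(p + d for p, d in zip(P, D))
                 for P, D in zip(tag_positions, offsets)]
    n = len(estimates)
    return tuple(sum(e[k] for e in estimates) / n for k in range(3))
```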
  • the COG of the subject can be computed by, for example but not limited to, the following formula where multiple sensors are employed to measure the body weight: COG = Σi (Wi × Pi) / Σi Wi (Eq. 1), where Wi and Pi are the weight and the position measurement of each sensor.
  • the weight measurement Wi of each pressure sensor, as well as the position Pi of that sensor, are fed to Eq. 1 to compute the COG of the subject.
  • the weights of the toe, ball and heel, as well as their corresponding positional data are recorded and used in computing the COG.
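Eq. 1 is a weighted average of sensor positions; applied to the pressure-mat readings it reduces to:

```python
def center_of_gravity(weights, positions):
    """Eq. 1: COG = sum(Wi * Pi) / sum(Wi), where Wi is the reading of each
    pressure sensor and Pi its (x, y) position on the mat. The same formula
    works unchanged for 3D segmental positions."""
    total = sum(weights)
    dims = len(positions[0])
    return tuple(sum(w * p[k] for w, p in zip(weights, positions)) / total
                 for k in range(dims))
```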
  • the center of gravity 40 as shown in Fig. 4 should be within the boundary of the feet of the subject 24 or between the feet, which are the bases of support. If the center of gravity 40 moves beyond the boundary, the subject 24 is unable to stay in the static posture and will fall down. However, when the subject 24 is in motion, the center of gravity 40 so calculated may move outside the boundary of the bases of support. Moreover, when only one foot makes contact with the weight mat, as in walking or running, the center of gravity 40 calculated will be on the supporting foot. It will shift all of a sudden once the other foot comes down and the original foot is raised. In such cases, the center of gravity 40 calculated by Eq. 1 will record a periodic step-wise displacement.
  • this periodic step-wise displacement is filtered and smoothed out.
  • Many signal processing and smoothing techniques can be applied to obtain a smoothed COG trajectory.
  • the center of gravity 40 during a step may be calculated based on the center of gravity 40 before the step starts and after the step is completed.
  • the estimation can be a simple interpolation between these two points or can involve a more sophisticated calculation.
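The simple-interpolation option can be sketched as a straight-line COG path between the pre-step and post-step values:

```python
def interpolate_step(cog_before, cog_after, n_frames):
    """Replace the step-wise COG jump with n_frames linearly interpolated
    in-between values, producing a smooth trajectory across the step."""
    return [tuple(b + (a - b) * t / (n_frames + 1)
                  for b, a in zip(cog_before, cog_after))
            for t in range(1, n_frames + 1)]
```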
  • the weight mat 30 is only able to determine the x-y coordinate of the center of gravity 40 of the subject 24, since the gravitational force always acts downwards. If the third coordinate, i.e. the height (z-axis) of the center of gravity 40 is desired, a more rigorous calculation will need to be involved.
  • the subject is considered to be composed of different body segments (i.e. limbs and torso), and the center of gravity of each segment is first determined. Eq. 1 can then be applied when the position Pi and weight Wi corresponding to the center of gravity of each segment are known.
  • the center of gravity of each segment is also referred to as the segmental COG so as to distinguish it from the COG of the subject.
  • the position Pi can be measured by a technique similar to that which measures the position of the COB.
  • the overall weight of the subject can either be obtained by the weight mat as a sum of all pressure sensor measurements, or obtained by a separate weight scale.
  • An exemplary embodiment on determining the COGs of an arm segment is shown in Fig. 6.
  • the arm segment is modeled to comprise two sub-segments, the upper arm segment 60 and lower arm segment 62 that further comprises the forearm, wrist and palm.
  • the relative COG position of the upper arm segment is predetermined at a certain offset distance Dl 64 from the top of the upper arm segment, while the relative COG position of the forearm is predetermined at offset distance D2 65.
  • the weight Wi of each segment is the product of a pre-specified fraction and the measured overall body weight of the subject.
  • the same methodology of estimating the COB coordinates mentioned above may be used to determine the 3D COG coordinates 61 and 63 of the upper and lower arm segments 60 and 62. Coordinates 61 and 63 are also referred to as the segmental COGs so as to distinguish them from the COG of the subject.
  • 3D motion capturing tags are attached to the body or the clothes that the subject wears, around the relative positions of the respective COGs as shown in Fig. 6.
  • the 3D coordinates of COG of the upper arm 61 and that of the lower arm 63 can be estimated as predetermined offsets from these tag coordinates.
  • the COG coordinates of other body segments can be similarly determined. Eq. 1 can then be applied to determine the 3D COG coordinate of the subject 24.
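Putting the segmental pieces together, the full-body COG computation can be sketched as follows. The segment names and weight fractions below are illustrative placeholders, not patent values; in the system they would be retrieved from the human-body library in the database system 27:

```python
# Illustrative segment weight fractions (placeholders, not patent values).
SEGMENT_FRACTIONS = {"torso": 0.50, "left_arm": 0.05, "right_arm": 0.05,
                     "left_leg": 0.20, "right_leg": 0.20}

def body_cog(segment_cogs, body_weight):
    """Apply Eq. 1 over segmental COGs: each segment's weight Wi is its
    fraction of the measured overall body weight, and Pi is its 3D
    segmental COG coordinate."""
    total = 0.0
    acc = [0.0, 0.0, 0.0]
    for name, pos in segment_cogs.items():
        w = SEGMENT_FRACTIONS[name] * body_weight
        total += w
        for k in range(3):
            acc[k] += w * pos[k]
    return tuple(a / total for a in acc)
```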
  • all the predetermined constants, the number of body segments, and the corresponding weight distribution fractions are all stored in a library of the database system module 27.
  • the purpose of the presentation module 28 is to display the motion sequences to a user.
  • the motion sequences can be retrieved from the database system 27, or obtained from the sensor data processing module 21, the COG module 22 and the COB module 26.
  • the user can select only a few motion sequences to display so that he can focus his attention on those few motion trajectories without being distracted by others.
  • straight lines joining points in one motion sequence with another are displayed so that a human 'stick' figure can be observed. To further illustrate this point, consider the example where the motion sequences of the left shoulder, the left elbow joint and the wrist are available. A straight line can be drawn from the shoulder to the elbow joint and from the elbow joint to the wrist.
  • Fig. 8 illustrates the transition of a human stick model from a first time frame to a second time frame.
  • the stick figures are superimposed onto the actual image of the subject.
  • a motion sequence of another subject is recalled from a library of the database system 27 and displayed together with that of the subject 24.
  • said another subject, in one embodiment, is a teacher or a reference template, so that the user can compare the movements of the subject and the teacher.
  • the differences of the corresponding motion sequence coordinates are computed over time. These differences can be aggregated and presented to the user as a matching score between the subject and the reference template. In a further embodiment, the matching score is computed for every musical beat.
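The patent does not define the aggregation rule, so the scoring below is a hypothetical example: at each beat, take the Euclidean distance between corresponding coordinates and map zero distance to a score of 100:

```python
import math

def matching_score(subject_seq, template_seq, beat_frames):
    """Aggregate per-beat coordinate differences between the subject and a
    reference template into a single score (hypothetical rule: mean of
    per-beat scores, each 100 minus 100x the Euclidean distance)."""
    per_beat = []
    for t in beat_frames:
        d = math.sqrt(sum((a - b) ** 2
                          for a, b in zip(subject_seq[t], template_seq[t])))
        per_beat.append(max(0.0, 100.0 - 100.0 * d))
    return sum(per_beat) / len(per_beat)
```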
  • the presentation module can present the photo or image of the user together with some desired data such as the COB and/or COG.
  • the database system 27 comprises at least one library in support of the various operations mentioned above.
  • it comprises a library of the human body.
  • the library is categorized by sex, age, body height and shoulder width, etc. as well as a general description of the body type (i.e. slim, fat-belly, muscular, etc.). For each of these categories, the general body weight (or body mass) distribution of each body segment is recorded. These values are used in the COG computation.
  • the parameters needed to compute the COB as shown in Fig. 5a and 5b, as well as those for computing the COG of individual body segments as shown in Fig. 6 are also recorded in this library.
  • the database system 27 further comprises another library about the clothing that the subject may wear.
  • the clothing can be categorized by sex and size, and for each piece, the number of sensing tags and the locations of those tags on said cloth are recorded.
  • the database system further comprises a library that stores the captured motion data itself, as well as the motion data of the reference templates or teachers. As mentioned previously, the motion data are stored in normalized form so that comparison between motion data can be readily performed.
  • the application is a dance simulator which comprises system 20 and a camera system installed in a dance studio 70.
  • System 20 is a PC (personal computer) and the various modules inside system 20 as described in Fig. 1 are implemented in software.
  • the dance studio is a room with size 4000 mm x 4000 mm.
  • a weight mat 30 comprising smaller pieces with pressure sensors embedded inside is laid on the floor. The size of each smaller piece in one embodiment is 500 mm x 500 mm.
  • Camera systems are disposed on the walls of the dance studio 70. In this embodiment, a front camera 72, side camera 74 and back camera 76 are disposed on three sides of the dance studio 70.
  • a top camera 78 is disposed at the center of the ceiling. Except for the top camera, the other three cameras are mounted at a height of 1000 mm above the dance floor. It should be noted that the size of the room, the size of the weight mat, the number of the camera and the positions of the camera disclosed here are for a particular implementation of this embodiment and other values can also be used.
  • a subject 24 wearing special clothes with reflective tags practices his dance in the dance studio. The cameras capture images of the tags, and the pixel coordinates are sent to the sensor data processing module 21, where the 3D coordinates of the tags are computed.
  • the weight mat 30 also forwards the pressure sensor readings and their coordinates to the same module. Subsequently, the COG and COB of the subject 24 are also computed using methods described above. All these motion data can be stored and displayed.
  • the display screen is arranged as shown in Fig. 10.
  • the screen is partitioned into four quadrants 80, 82, 84 and 86.
  • a user can select which motion trajectory of the subject to display in which quadrant.
  • the first quadrant 80 displays the data from the top camera and the fourth quadrant 86 displays the information from the weight mat.
  • the user can also choose to display the image of the subject, the 3D coordinates of the tags, or the stick body model as shown in Fig. 7 and 8, or any combination thereof for presentation.
  • he can also select to display 2D trajectories.
  • the weight mat information records the foot-step movements as the subject dances on the floor.
  • the user can select a planar view of the foot-step trajectories.
  • a time bar 88 and a beat bar 90 are also displayed on the screen below the four quadrants.
  • the former indicates the elapsed time of the motion sequences while the latter indicates the time-stamps of the music beats.
  • a separate screen similar to that shown in Fig. 10 is also deployed.
  • the user can recall the motion trajectories of a top dancer or an expert in the specific type of dance from a library in the database system 27 and display them on the second screen (not shown). The user can then easily compare the trajectory of the subject against that of the top dancer as they are shown side-by-side.
  • An alternative way of presentation is to overlap the motion trajectory of the top dancer on top of that of the subject, either in a quadrant or in full screen. Again, the user can choose what motion sequence, or combination of motion sequences to display.
  • the presentation module 28 also implements user-friendly interfaces with buttons or slide-bars (not shown in Fig. 10) for the user to control what to display and how to present the information. For example, zoom/pan/tilt buttons may be provided. A time indicator may optionally be provided in the time bar 88, and the user can slide it forward or backward to advance or retract to a certain time value. Once changed, all the corresponding motion trajectories will be automatically adjusted. The user can also speed up or slow down, fast-forward or fast-backward the time scale with dial-button(s) (not shown). Furthermore, the COB and COG data can also be superimposed on any of the motion trajectories for the user to observe the inter-dependency of these parameters in relation to the others.
  • 2D projection of 3D data points is performed for a user who wants a planar view.
  • the user may want to study the COB and COG trajectories in relation to the toe, ball or heel motion sequences of his feet. Since the latter, captured by the weight mat, are planar coordinates while the COB data is 3D, the COB data can be projected onto the dance floor for display.
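When the dance floor coincides with the z = 0 plane of the room coordinate system, the projection mentioned here can be as simple as dropping the height component of each sample. A minimal sketch under that assumption (the function and variable names are illustrative, not from the patent):

```python
def project_to_floor(points_3d):
    """Project 3D center-of-balance (COB) samples onto the dance-floor plane.

    Assumes the floor is the z = 0 plane of the room coordinate system, so
    the planar projection simply drops the height component of each sample.
    """
    return [(x, y) for (x, y, _z) in points_3d]


# The planar points can then be drawn over the weight-mat toe/ball/heel traces.
cob_track = [(1.20, 0.80, 0.95), (1.30, 0.85, 0.97)]
floor_track = project_to_floor(cob_track)  # [(1.2, 0.8), (1.3, 0.85)]
```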
  • the presentation module 28 also lets the user select a different reference point when displaying motion sequences.
  • the user may want to study the shoulder movement relative to the pelvis.
  • the body pelvis is the reference point rather than the dance floor coordinates.
  • since the 3D coordinates of the shoulders and pelvis can be obtained and computed from sensors attached to the body, the user can specify to display the displacement of the shoulder measurements relative to the pelvis, so that he can study this displacement in detail and compare it with the reference movement of a top dancer.
  • the pelvis coordinate is the reference point in this instance.
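Re-expressing a trajectory against a moving body reference point, as described in the bullets above, amounts to a per-sample vector subtraction. A sketch of that idea (names and sampling assumptions are ours, not the patent's):

```python
def relative_trajectory(marker, reference):
    """Re-express a marker trajectory relative to a moving reference point.

    marker and reference are equal-length lists of (x, y, z) samples; the
    result is the per-sample displacement marker - reference, e.g. shoulder
    coordinates expressed relative to the pelvis instead of the dance floor.
    """
    return [tuple(m - r for m, r in zip(ms, rs))
            for ms, rs in zip(marker, reference)]


shoulder = [(0.20, 0.00, 1.40), (0.25, 0.02, 1.41)]
pelvis = [(0.20, 0.00, 1.00), (0.22, 0.01, 1.00)]
displacement = relative_trajectory(shoulder, pelvis)
```

The same subject displacement series and the corresponding series of a top dancer can then be compared directly, independent of where each performer stood on the floor.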
  • in the embodiments described above, the shape of the weight mat is rectangular or square.
  • the shape can be arbitrary, as long as there are at least three pressure sensors disposed in appropriate locations to effectively calculate the center of gravity of the subject.
  • the weight mat can be circular, octagonal, cross-shaped, or can assume an irregular shape.
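In the simplest formulation, the center of gravity obtainable from three or more pressure readings is the force-weighted centroid of the sensor positions, whatever the mat's outline. The patent does not prescribe an exact formula; the following is a sketch under that weighted-centroid assumption:

```python
def center_of_gravity(sensors):
    """Estimate the planar center of gravity from weight-mat pressure sensors.

    sensors is a list of ((x, y), force) pairs; at least three non-collinear
    sensor positions are needed for a meaningful estimate, regardless of the
    mat's shape. The result is the force-weighted centroid of the positions.
    """
    total = sum(force for _pos, force in sensors)
    if total == 0:
        raise ValueError("no load detected on the mat")
    x = sum(px * force for (px, _py), force in sensors) / total
    y = sum(py * force for (_px, py), force in sensors) / total
    return (x, y)


# Three sensors of an arbitrarily shaped mat under unequal loading
cog = center_of_gravity([((0.0, 0.0), 1.0), ((2.0, 0.0), 1.0), ((1.0, 2.0), 2.0)])
# cog == (1.0, 1.0)
```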
  • the cameras do not need to be fixed at a specific location in a room, and can move translationally while keeping the same orientation as necessary. For example, when the subject moves from the center of the room to a corner, the distance between the top camera and the subject increases. The coordinates thus detected may be less accurate owing to the change in distance and angle. In such a situation, the cameras can be designed to move along with the subject, so that the subject always remains aligned with the center of the captured video.
  • the present invention can be applied to many areas
  • the dance simulator is just one specific realization on how the present invention can be applied.
  • those skilled in the art can incorporate the teaching of this invention into many different areas, including but not limited to applications such as various kinds of sports simulators for improving athletic performance in different sports; exercise machines in physical fitness centers; therapeutic monitors for physiotherapists; and simulators for recreational sports.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to a motion tracking and comparison system, said system comprising a first system (22) for detecting the center of gravity of a subject, a second system (26) for detecting the center of balance of the subject, and a presentation module (28) for notifying the user of the center of gravity relative to the center of balance. A time indicator is also presented to the user together with the trajectories of the body movements. The trajectories of a reference model can also be displayed beside those of the subject, or superimposed on those of the subject, for ease of comparison.
PCT/IB2011/055567 2011-12-09 2011-12-09 Motion tracking and comparison system WO2013084031A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2011/055567 WO2013084031A1 (fr) 2011-12-09 2011-12-09 Motion tracking and comparison system

Publications (1)

Publication Number Publication Date
WO2013084031A1 true WO2013084031A1 (fr) 2013-06-13

Family

ID=48573635

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/055567 WO2013084031A1 (fr) 2011-12-09 2011-12-09 Motion tracking and comparison system

Country Status (1)

Country Link
WO (1) WO2013084031A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10258044A (ja) * 1997-03-18 1998-09-29 Atr Chinou Eizo Tsushin Kenkyusho:Kk Method for estimating human body posture
JP2002213923A (ja) * 2001-01-24 2002-07-31 Sunlit Sangyo Co Ltd Automatic three-dimensional coordinate measuring method, human body coordinate measuring method using the same, and automatic 3D and human body coordinate measuring devices
KR20020081561A (ko) * 2001-04-16 2002-10-28 (주)하이미디어통신 Golf swing self-clinic system and method
US20050182341A1 (en) * 2004-02-13 2005-08-18 Ken Katayama Posture diagnosis equipment and program therefor
CN101697870A (zh) * 2009-11-06 2010-04-28 中国科学院合肥物质科学研究院 Multi-parameter human body function detecting device and detecting method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105469679A (zh) * 2015-11-14 2016-04-06 辽宁大学 Kinect-based cardiopulmonary resuscitation auxiliary training system and method
CN107397552A (zh) * 2016-05-20 2017-11-28 武汉软工硕成技术有限公司 Human body weight and balance measuring device for mobile terminal applications and measuring method thereof
CN107661102A (zh) * 2016-07-29 2018-02-06 LG Electronics Electronic device and control method thereof
CN107661102B (zh) * 2016-07-29 2023-10-03 LG Electronics Electronic device and control method thereof
WO2018063989A1 (fr) * 2016-09-28 2018-04-05 Bodbox, Inc. Evaluation and coaching of athletic performance
US11071887B2 2016-09-28 2021-07-27 Bodbox, Inc. Evaluation and coaching of athletic performance
RU2768183C1 (ru) * 2021-04-12 2022-03-23 Общество С Ограниченной Ответственностью "Хабилект" Hardware-software complex and method for calculating the position of a person's center of gravity in three-dimensional space using a contactless sensor
WO2022220708A1 (fr) * 2021-04-12 2022-10-20 Общество С Ограниченной Ответственностью "Хабилект" Hardware-software complex and method for calculating the position of a person's center of gravity

Similar Documents

Publication Publication Date Title
US11887174B2 (en) Systems and methods for analyzing lower body movement to recommend footwear
CN108697921B (zh) 用于评估动作表现的系统、方法、装置和标记物
KR101980378B1 (ko) Exercise posture guidance device using dynamic movement and body balance
JP5421437B2 (ja) Foot diagnosis device and fitting navigation system for shoes or insoles using the same
US8696450B2 (en) Methods for analyzing and providing feedback for improved power generation in a golf swing
JP6466139B2 (ja) Robot measuring instrument for measuring human movement
US20130171601A1 (en) Exercise assisting system
JP6444813B2 (ja) Analysis system and analysis method
US20080133171A1 (en) Method and Device for Evaluating Displacement Signals
CN107660135A (zh) Information processing device, information processing system, and insole
CN104126184A (zh) Method and system for automated personal training that includes training plans
KR20130116886A (ko) Automated personal training method and system
JP2016080671A5 (fr)
JP6837484B2 (ja) Device for digitizing and evaluating exercise
JP2016140591A (ja) Motion analysis and evaluation device, motion analysis and evaluation method, and program
WO2013084031A1 (fr) Motion tracking and comparison system
US10247626B2 (en) Motion recognition method and apparatus
US20220266091A1 (en) Integrated sports training
US20200154817A1 (en) Shoe
US20110166821A1 (en) System and method for analysis of ice skating motion
JP6310255B2 (ja) Method and apparatus for presenting options
Kaichi et al. Estimation of center of mass for sports scene using weighted visual hull
US20040059264A1 (en) Footprint analyzer
JP6123520B2 (ja) Lower limb shape change measuring device, method, and program
US20230316620A1 (en) System and method for generating a virtual avatar

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11877177

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11877177

Country of ref document: EP

Kind code of ref document: A1