CN101996311A - Yoga stance recognition method and system - Google Patents

Yoga stance recognition method and system

Info

Publication number
CN101996311A
Authority
CN
China
Prior art keywords
angle
action
data
trace points
relative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 200910109546
Other languages
Chinese (zh)
Inventor
林洋
甘泉
李�浩
王跃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHENZHEN TOL TECHNOLOGY Co Ltd
Shenzhen Taishan Online Tech Co Ltd
Original Assignee
SHENZHEN TOL TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHENZHEN TOL TECHNOLOGY Co Ltd filed Critical SHENZHEN TOL TECHNOLOGY Co Ltd
Priority to CN 200910109546 priority Critical patent/CN101996311A/en
Publication of CN101996311A publication Critical patent/CN101996311A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a computer-based yoga pose recognition method. The method comprises the following steps: collecting the spatial position data of a plurality of tracking points placed on a human body at different times; calculating the relative spatial position relationships among the tracking points from the collected position data; evaluating the calculated data against the standard yoga pose data and the associated judgment criteria pre-stored in a pose database; and outputting the judgment result as a prompt. The invention also discloses a system using the method. With the method and system of the invention, the pose trajectories of the tracked body parts can be recorded in time while the user practises yoga, on the basis of the spatial position information collected at successive times; the relationship between the user's posture and the target yoga pose can be judged, and the yoga pose can be located and described through the plurality of tracking points, so that the user's actual body condition during yoga practice is reflected more faithfully.

Description

Yoga action recognition method and system
Technical field
The invention belongs to the field of computer applications, and more particularly relates to a yoga action recognition method and system.
Background technology
With the rapid development of computer application technology, the way people practise yoga has also changed greatly: a user no longer needs a dedicated gym, but only has to select a yoga programme in a virtual fitness hall. The programme is presented to the user through a computer video platform, and the yoga video being played may be a recording of a real instructor or a virtual figure. When the user performs the corresponding action, data acquisition equipment collects the user's motion information and sends the collected motion data to the computer for analysis. The detection result is displayed, the next step of the programme is determined according to the result, and corresponding advice is given when the action is not standard.
Most current methods that use actions for recognition and control are based on traditional image recognition: a camera photographs the posture or action of the human body, the captured images are analysed and processed, and a recognition algorithm identifies and judges the posture or action. This approach requires considerable storage space for the captured images on the one hand, and a highly optimised algorithm on the other hand to recognise human postures or actions accurately.
To address the defects of the traditional method described above, a way of recognising hand actions based on acceleration sensors has been developed. This technique uses an acceleration sensor and a gyroscope, which are generally placed inside a handle. The acceleration sensor captures the acceleration of the body motion and the gyroscope captures its direction; the acquisition hardware collects the data produced by the acceleration sensor and the gyroscope and passes them to a terminal computer for processing.
The Wii Sports product of Nintendo Co. adopts a technique for capturing body motion information in real time: the handle of the product contains two main devices, a gravity sensor and a gyroscope. The gravity sensor captures the acceleration of the body motion and the gyroscope captures its direction. The acceleration and direction information is transmitted to a terminal computer over Bluetooth, and from these data the terminal computer can model the motion of the body.
Existing yoga action recognition techniques use an acceleration sensor and a gyroscope, generally placed inside a handle, and transmit the hand-motion data through the handle. This approach has problems in several respects, for example:
1. It cannot track multiple body parts, and therefore cannot analyse the whole-body motion of the exerciser;
2. Only motion information is transmitted, with no spatial position coordinates, so the actual motion of the body cannot be reflected truthfully;
3. Because there are no spatial coordinates of the body, the movement trajectory cannot be tracked, and the body posture can be neither located nor described.
Therefore, a scheme is needed that provides spatial position information, records and tracks the trajectories of the body parts, locates the body posture, and reflects the actual motion of the body more faithfully, so as to overcome the above defects of the prior art.
Summary of the invention
The technical problem to be solved by the present invention is to provide a yoga action recognition method and system that address the shortcomings of the real-time body-motion-capture techniques used in existing yoga action recognition: they cannot track multiple body parts, provide no spatial position coordinates, and cannot track the movement trajectory of the body.
The solution adopted by the present invention is: collecting the spatial position data of a plurality of tracking points placed on the human body at different times; selecting a predetermined calculation method according to the requirements of the action, and computing from the position data the spatial distances, or relative angles, or both distances and angles, between the tracking points; judging whether the computed data meet the yoga standard, according to the standard yoga action data and their judgment criteria pre-stored in an action database; and outputting the judgment result as a prompt.
The invention provides a yoga action recognition method comprising the following steps:
S1) collecting the spatial position data of a plurality of tracking points placed on the human body at different times;
S2) calculating the relative spatial position relationships among the tracking points from their spatial position data;
S3) performing a qualification judgment on the data calculated in S2, according to the standard action data and judgment criteria stored for the action in the action database;
S4) outputting the judgment result as a prompt.
When the action data are collected in step S1, the tracking points placed on the body are located at the head, both hands, the waist and both feet, six tracking points in total. The three points on the head and both hands form the upper-body group, and the three points on the waist and both feet form the lower-body group; devices with an acceleration sensor and a gyroscope are used at the tracking points. The data captured by the video acquisition are all saved as (x, y, z) three-dimensional coordinate triples, where the horizontal rightward direction is the x-axis, the vertical direction is the y-axis, and the direction the camera faces is the z-axis. Thirty groups of data are provided per second, and this high-frequency acquisition yields very accurate position information.
The calculation methods for the relative spatial position relationships between the tracking points in step S2 comprise: a relative-distance calculation, a relative-angle calculation, and a calculation combining relative distance and relative angle. The relative-angle calculation comprises calculating the plane-projection angle of a line between tracking points, or the relative angle between lines connecting tracking points. The upper body and the lower body are computed separately: the three upper-body collection points form one spatial triangle and the three lower-body collection points form another.
From geometry, three segments that are not collinear determine a unique triangle; that is, the shape is fixed. A person's absolute position can move and heights differ between people, but the relative positions of the body parts are stable, so for some actions it suffices to check relative positions. For simple actions, step S2 therefore uses a relative-distance criterion: the distance between two spatial points A(x_a, y_a, z_a) and B(x_b, y_b, z_b) is S = √((x_a − x_b)² + (y_a − y_b)² + (z_a − z_b)²). The collected data form two groups, three upper-body points and three lower-body points; three relative distances are computed for each group, and the upper-body and lower-body actions are judged from the three distances of each group.
As can be seen from the above calculation, because people differ in size, the required distance can only be specified as a value range. The tighter the range, the more exactly the posture must be standardized, but the less flexible the check becomes; judging only by the distances between body parts is clearly too coarse to adapt to different users. Although distance checks are applied to almost every action detection in the present invention, distance alone is only suitable for rather specific, simple actions, so step S2 also adopts a plane-projection-angle calculation:
Once the spatial positions of the points are obtained, their relative positions can be analysed by computing the angles between them. Taking the z-axis as normal gives the XOY plane, the y-axis as normal gives the XOZ plane, and the x-axis as normal gives the YOZ plane. By the orthogonal projection principle, an angle problem in space can thus be transformed into angle problems on these three planes: the line L through any two trace points of a group is projected onto a reference plane, giving a straight line L1; L1 forms an angle with the reference axis of that plane, and this angle is the projection angle, on that reference plane, of the body part represented by the two points.
Further, when the detected action is a continuous process, the projection angle in the plane changes continuously. On the basis of the plane-projection-angle calculation, step S2 therefore also adopts a relative-angle calculation, and the process can be judged by whether the relative angle changes. The relative angle is obtained from the law of cosines: C = arccos[(a² + b² − c²)/(2ab)], where C is any angle of the spatial triangle formed by the three upper-body or three lower-body collection points, and a, b, c are the spatial lengths of the three sides of that triangle.
Once distance and angle judgments are both available, there remain compound actions that involve both distance and various angle requirements; either method alone has its limitations, so the two methods are used together. The combined relative-distance and relative-angle calculation proceeds as follows:
S61) compute the relative distances between the trace points of each group, using the distance calculation;
S62) compute the relative angles between the trace points of each group, using the relative-angle calculation;
S63) compute the projection angles of each group of trace points onto the XOY, XOZ and YOZ planes, using the plane-projection-angle calculation.
Applying the checks in this order also reduces the complexity of the judgment: no matter how complicated a posture is, it can be decomposed into several simpler condition checks accordingly.
The exerciser's motion state is computed in real time: the distances, angles and movement speed of the three trace points of each group. When the movement speed slowly approaches zero, the exerciser's action is judged to be in place; at that moment the start of a fixed pose is marked and the start time is recorded. Each angle, or spatial distance, or angle and spatial distance of the triangle formed by the three trace points of each group is then computed; if the judgment criteria are satisfied, the exerciser's fixed pose conforms to the standard. If at some later moment a criterion is no longer satisfied, the end of the pose is marked and the end time is recorded; the difference between the end time and the start time is the holding time of the pose. During the hold, all spatial position data of each group of three trace points are also saved and used: the mean position of each trace point is computed first, and then the variance of all positions that the trace point passes through during the hold relative to its mean position. When the yoga action is finished, the performance of the action is scored. The score is based on the holding time and the average variance, where the average variance is the mean of the variances of the individual trace points. If the holding time does not reach a given threshold, the hold is directly judged a failure; if it exceeds the threshold, the score is proportional to the holding time and inversely proportional to the average variance. Once each action has been scored, the score of the whole exercise can be computed as the mean of the individual action scores; this average score can be used to measure the exerciser's training effect.
The invention also provides a yoga action recognition system, comprising:
Action database: used to store the data representing standard yoga actions and their judgment criteria;
Point data collection unit based on video sampling: used to collect the spatial position data of the multiple tracking points placed on the body at each moment, and to send the collected motion data to the computing unit;
Computing unit: calculates the relative spatial position relationships among the tracking points from their spatial position data, and sends the calculated data to the analysis unit;
Analysis unit: judges the calculated data according to the standard action data and judgment criteria stored for the action in the action database, and sends the judgment result to the output unit;
Output unit: used to output the judgment result as a prompt.
With the method and system disclosed by the invention, the spatial position of each tracked body part can be followed at every moment while the user practises yoga, so the movement trajectories of the tracked parts can be recorded and the body posture can be located and described. Whether the user's action is standard during yoga practice is therefore reflected more faithfully, and through human-computer interaction the system points out incorrect actions and gives advice, which helps the user learn the yoga programme.
Description of drawings
Fig. 1 is a flow chart of the yoga action recognition method of a preferred embodiment of the present invention;
Fig. 2 is the coordinate diagram of a preferred embodiment of the present invention;
Fig. 3 is a schematic diagram of an action based on the relative-distance calculation of a preferred embodiment of the present invention;
Fig. 4 is a schematic diagram of an action based on the plane-projection-angle calculation of a preferred embodiment of the present invention;
Fig. 5 is a schematic diagram of the relative-angle calculation of a preferred embodiment of the present invention;
Fig. 6 is a processing flow chart of the combined relative-distance and relative-angle calculation of a preferred embodiment of the present invention;
Fig. 7 is a schematic structural diagram of the yoga action recognition system of a preferred embodiment of the present invention.
Embodiment
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention and not to limit it.
The yoga action recognition method provided by the embodiment of the invention requires a little preparation from the user: putting on the cap, hand straps, waistband and foot straps that serve as data collection points for capturing the body motion data. The flow of the method is then: collect the spatial position data of a plurality of tracking points placed on the body at different times; select a predetermined calculation method according to the requirements of the action, and compute from the position data the spatial distances, or relative angles, or both distances and angles, between the tracking points; judge whether the computed data meet the yoga standard, according to the standard yoga action data and their judgment criteria pre-stored in the action database; and output the judgment result as a prompt.
Fig. 1 is a flow chart of the yoga action recognition method provided by the embodiment of the invention. As shown in Fig. 1, the method provided by this embodiment comprises the following four steps:
S1) collecting the spatial position data of a plurality of tracking points placed on the human body at different times;
S2) calculating the relative spatial position relationships among the tracking points from their spatial position data;
S3) performing a qualification judgment on the data calculated in S2, according to the standard action data and judgment criteria stored for the action in the action database;
S4) outputting the judgment result as a prompt.
The tracking points placed on the body in step S1 are six points at the head, both hands, the waist and both feet; the three points on the head and both hands form the upper-body group, and the three points on the waist and both feet form the lower-body group. The video acquisition provides 30 groups of data per second, and this high-frequency acquisition yields very accurate position information.
It should be noted that the tracking points, and the collection of their spatial position data, can be implemented in many ways in the present invention. In one embodiment, each tracking point is made of a highly reflective material; an infrared emitter illuminates it, the video acquisition captures the spatial coordinates of the tracking point at successive times, and the subsequent processing is then carried out.
Fig. 2 is the coordinate diagram provided by the embodiment of the invention. As shown in Fig. 2, the data captured by the video acquisition in step S1 are all saved as (x, y, z) three-dimensional coordinate triples: the horizontal rightward direction is the x-axis, the vertical direction is the y-axis, and the direction the camera faces is the z-axis.
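Purely as an illustration (not part of the patent text), a minimal Python sketch of how such samples might be represented; the names `TrackedSample` and `SAMPLE_RATE_HZ` are assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass
from typing import Tuple

SAMPLE_RATE_HZ = 30  # the description states 30 groups of data per second

Point3D = Tuple[float, float, float]  # (x, y, z): x = right, y = up, z = camera-facing direction

@dataclass
class TrackedSample:
    """One group of tracking-point coordinates captured at a single instant."""
    timestamp: float                          # seconds since the exercise started
    upper: Tuple[Point3D, Point3D, Point3D]   # head, left hand, right hand
    lower: Tuple[Point3D, Point3D, Point3D]   # waist, left foot, right foot
```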
The calculation methods for the relative spatial position relationships between the tracking points in step S2 comprise: a relative-distance calculation, a relative-angle calculation, and a calculation combining relative distance and relative angle; the relative-angle calculation comprises calculating the plane-projection angle of a line between tracking points, or the relative angle between lines connecting tracking points. The upper body and the lower body are computed separately: the three upper-body collection points form one spatial triangle and the three lower-body collection points form another.
Although a person's absolute position can move and heights differ between individuals, the relative positions of the body parts are stable, so for some actions it suffices to check relative positions. For simple actions, step S2 therefore uses a relative-distance criterion: the distance between two spatial points A(x_a, y_a, z_a) and B(x_b, y_b, z_b) is
S = √((x_a − x_b)² + (y_a − y_b)² + (z_a − z_b)²).
The collected data form two groups, three upper-body points and three lower-body points; three relative distances are computed for each group, and the upper-body and lower-body actions are judged from the three distances of each group.
Fig. 3 is a schematic diagram of an action based on the relative-distance calculation provided by the embodiment of the invention. As shown in Fig. 3, in this action the user stands upright with both hands stretched upward and spread outward by some distance. The two hands are points A and B, and the head is point C.
Suppose the collected upper-body data are:
A(1.1, 2.0, 0.4), B(0.6, 2.2, 0.5), C(1.2, 2.1, 0.4)
Then, by the distance formula,
S_ab ≈ 0.55, S_ac ≈ 0.14, S_bc ≈ 0.62
These computed values are sent to the analysis unit, which compares the calculated S_ab, S_ac and S_bc with the standard data and judgment criteria stored for this action in the action database, and thereby judges whether the action is qualified.
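A short sketch of how this distance check might look in code (an assumed implementation, not the patent's own); it reproduces the S_ab, S_ac and S_bc values above and compares each distance with a stored tolerance range, where the ranges themselves are hypothetical:

```python
import math
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]

def distance(a: Point3D, b: Point3D) -> float:
    """Euclidean distance between two tracking points: S = sqrt(sum((a_i - b_i)^2))."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def check_distances(points: Dict[str, Point3D],
                    ranges: Dict[Tuple[str, str], Tuple[float, float]]) -> bool:
    """Return True if every listed pairwise distance lies inside its allowed range."""
    return all(lo <= distance(points[p], points[q]) <= hi
               for (p, q), (lo, hi) in ranges.items())

# Upper-body example from the description (two hands A, B and the head C).
upper = {"A": (1.1, 2.0, 0.4), "B": (0.6, 2.2, 0.5), "C": (1.2, 2.1, 0.4)}
print(round(distance(upper["A"], upper["B"]), 2),   # 0.55
      round(distance(upper["A"], upper["C"]), 2),   # 0.14
      round(distance(upper["B"], upper["C"]), 2))   # 0.62

# Hypothetical tolerance ranges: the patent only says a value range is used per distance.
ranges = {("A", "B"): (0.4, 0.7), ("A", "C"): (0.1, 0.3), ("B", "C"): (0.4, 0.7)}
print(check_distances(upper, ranges))  # True
```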
As can be seen from the above calculation, because people differ in size, the required distance can only be specified as a value range. The tighter the range, the more exactly the posture must be standardized, but the less flexible the check becomes; judging only by the distances between body parts is clearly too coarse to adapt to different users. Although distance checks are applied to almost every action detection in the present invention, distance alone is only suitable for rather specific, simple actions. Step S2 therefore also adopts the plane-projection-angle calculation: once the spatial positions of the points are obtained, their relative positions can be analysed by computing the angles between them. Taking the z-axis as normal gives the XOY plane, the y-axis as normal gives the XOZ plane, and the x-axis as normal gives the YOZ plane. By the orthogonal projection principle, an angle problem in space can thus be transformed into angle problems on these three planes: the line L through any two trace points of a group is projected onto a reference plane, giving a straight line L1; L1 forms an angle with the reference axis of that plane, and this angle is the projection angle, on that reference plane, of the body part represented by the two points.
For example, projecting the line L between the head and one hand onto the XZ plane gives a straight line L1, and the angle θ between L1 and the x-axis corresponds to how far the hand leans forward: when θ lies in (−π, 0) the hand is behind the head relative to the body, and when θ lies in (0, π) the hand is in front of it.
Fig. 4 is a schematic diagram of an action based on the plane-projection-angle calculation provided by the embodiment of the invention. As shown in Fig. 4, the two hands are joined above the crown of the head and the upper body leans slightly forward. Since the two hands are held together, the left and right hands can be treated as a single point. One recorded situation is:
hand (0.6, 2.3, 1.2), head (0.6, 2.0, 0.4)
Projecting onto the YZ plane and taking the head as the origin simplifies the coordinates to: hand (0.3, 0.8), head (0, 0).
The projection angle is then arctan(0.3/0.8) = arctan(0.375) ≈ 0.359 rad,
and 0.359 × (180/π) ≈ 21°.
This computed value is sent to the analysis unit, which compares the calculated angle with the standard angle and judgment criteria stored for this action in the action database, and thereby judges whether the upper-body action is qualified.
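A minimal sketch of this projection-angle check, under the assumption that the angle is measured in the YZ plane from the z-axis as in the example above (illustrative code, not taken from the patent):

```python
import math
from typing import Tuple

Point3D = Tuple[float, float, float]

def yz_projection_angle(hand: Point3D, head: Point3D) -> float:
    """Angle (degrees) of the head-to-hand line projected onto the YZ plane,
    measured from the z-axis, i.e. the forward-lean angle of the joined hands."""
    dy = hand[1] - head[1]
    dz = hand[2] - head[2]
    return math.degrees(math.atan2(dy, dz))

angle = yz_projection_angle(hand=(0.6, 2.3, 1.2), head=(0.6, 2.0, 0.4))
print(round(angle))  # ~21; compared against the stored standard angle for the pose
```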
Further, when the detected action is a continuous process, the projection angle in the plane changes continuously. On the basis of the plane-projection-angle calculation, the relative-angle calculation is therefore also adopted, and the process can be judged by whether the relative angle changes. The relative angle is obtained from the law of cosines: C = arccos[(a² + b² − c²)/(2ab)], where C is any angle of the spatial triangle formed by the three upper-body or three lower-body collection points, and a, b, c are the spatial lengths of the three sides of that triangle.
Fig. 5 is a schematic diagram of the relative-angle calculation provided by the embodiment of the invention. As shown in Fig. 5, a, b, c are the side lengths and A, B, C are the angles opposite sides a, b, c; the sides a, b, c are obtained by the distance-based calculation. By the law of cosines:
c² = a² + b² − 2ab·cos C
C = arccos[(a² + b² − c²)/(2ab)]
The values of angles A and B are obtained similarly. These computed values are sent to the analysis unit, which compares the calculated angles with the standard angles and judgment criteria stored for this action in the action database, and thereby judges whether the action is qualified.
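A sketch of the relative-angle calculation via the law of cosines (an assumed Python implementation; the side lengths come from the distance calculation illustrated earlier):

```python
import math
from typing import Tuple

def triangle_angles(a: float, b: float, c: float) -> Tuple[float, float, float]:
    """Angles (degrees) A, B, C opposite sides a, b, c of a spatial triangle,
    from the law of cosines: cos C = (a^2 + b^2 - c^2) / (2ab)."""
    A = math.degrees(math.acos((b**2 + c**2 - a**2) / (2 * b * c)))
    B = math.degrees(math.acos((a**2 + c**2 - b**2) / (2 * a * c)))
    C = math.degrees(math.acos((a**2 + b**2 - c**2) / (2 * a * b)))
    return A, B, C

# Sides of the upper-body triangle from the earlier distance example.
print([round(x) for x in triangle_angles(0.62, 0.14, 0.55)])  # approximately [114, 12, 54]
```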
Fig. 6 is a processing flow chart of the combined relative-distance and relative-angle calculation provided by the embodiment of the invention. As shown in Fig. 6, it comprises:
S61) computing the relative distances between the trace points of each group, using the distance calculation;
S62) computing the relative angles between the trace points of each group, using the relative-angle calculation;
S63) computing the projection angles of each group of trace points onto the XOY, XOZ and YOZ planes, using the plane-projection-angle calculation.
Applying the checks in this order also reduces the complexity of the judgment: no matter how complicated a posture is, it can be decomposed into several simpler condition checks accordingly.
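The combined check of Fig. 6 might be sketched roughly as follows; this is illustrative only, the function names, the choice of reference axes for the projections and the flat feature dictionary are assumptions rather than anything specified in the patent:

```python
import math
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]

def distance(a: Point3D, b: Point3D) -> float:
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def projection_angles(a: Point3D, b: Point3D) -> Dict[str, float]:
    """Projection angle (degrees) of line AB on the XOY, XOZ and YOZ planes."""
    dx, dy, dz = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    return {"XOY": math.degrees(math.atan2(dy, dx)),
            "XOZ": math.degrees(math.atan2(dz, dx)),
            "YOZ": math.degrees(math.atan2(dz, dy))}

def triangle_angle(a: float, b: float, c: float) -> float:
    """Angle (degrees) opposite side c, by the law of cosines."""
    return math.degrees(math.acos((a**2 + b**2 - c**2) / (2 * a * b)))

def judge_group(p1: Point3D, p2: Point3D, p3: Point3D) -> Dict[str, float]:
    """S61-S63: relative distances, relative (triangle) angles and projection angles
    for one group of three tracking points; the analysis unit then compares each
    value with the stored criteria of the action."""
    d12, d23, d31 = distance(p1, p2), distance(p2, p3), distance(p3, p1)
    features = {"d12": d12, "d23": d23, "d31": d31,
                "angle_at_p1": triangle_angle(d31, d12, d23),
                "angle_at_p2": triangle_angle(d12, d23, d31),
                "angle_at_p3": triangle_angle(d23, d31, d12)}
    features.update({f"proj_{plane}_12": ang
                     for plane, ang in projection_angles(p1, p2).items()})
    return features
```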
Before the yoga action begins, the exerciser faces the video camera in a standby posture, and each tracking point is labelled from the relative positions of the tracking-point coordinates. After exercise begins, the exerciser's motion state is computed in real time: the distances, angles and movement speed of the three trace points of each group. When the movement speed slowly approaches zero, the exerciser's action is judged to be in place; at that moment the start of a fixed pose is marked and the start time is recorded. Each angle, or spatial distance, or angle and spatial distance of the triangle formed by the three trace points of each group is then computed; if the judgment criteria are satisfied, the exerciser's fixed pose conforms to the standard. If at some later moment a criterion is no longer satisfied, the end of the pose is marked and the end time is recorded; the difference between the end time and the start time is the holding time of the pose. During the hold, all spatial position data of each group of three trace points are also saved and used: the mean position of each trace point is computed first, and then the variance of all positions that the trace point passes through during the hold relative to its mean position. When the yoga action is finished, the performance of the action is scored. The score is based on the holding time and the average variance, where the average variance is the mean of the variances of the individual trace points. If the holding time does not reach a given threshold, the hold is directly judged a failure; if it exceeds the threshold, the score is proportional to the holding time and inversely proportional to the average variance. Once each action has been scored, the score of the whole exercise can be computed as the mean of the individual action scores; this average score can be used to measure the exerciser's training effect. This completes the description of how to judge whether a human action conforms to a yoga fitness item, its degree of conformity and the training effect.
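A sketch of one possible reading of this hold-time and stability scoring; the threshold `min_hold_s`, the proportionality constant `k` and the small epsilon are hypothetical parameters, since the patent only states the proportionality relations:

```python
from statistics import mean
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]

def point_variance(track: List[Point3D]) -> float:
    """Variance of one tracking point about its mean position during the hold."""
    cx, cy, cz = (mean(p[0] for p in track), mean(p[1] for p in track), mean(p[2] for p in track))
    return mean((p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2 for p in track)

def score_pose(tracks: Dict[str, List[Point3D]], start_t: float, end_t: float,
               min_hold_s: float = 3.0, k: float = 1.0) -> float:
    """Score one fixed pose: 0 if held too briefly, otherwise proportional to the
    holding time and inversely proportional to the average variance."""
    hold = end_t - start_t
    if hold < min_hold_s:
        return 0.0                        # hold failed: below the given threshold
    avg_var = mean(point_variance(t) for t in tracks.values())
    return k * hold / (avg_var + 1e-9)    # small epsilon avoids division by zero

def exercise_score(pose_scores: List[float]) -> float:
    """Overall score of the whole exercise is the mean of the individual action scores."""
    return mean(pose_scores)
```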
Fig. 7 is a schematic structural diagram of the yoga action recognition system provided by the embodiment of the invention. As shown in Fig. 7, the system comprises a point data collection unit 71 based on video sampling, an action database 72, a computing unit 73, an analysis unit 74 and an output unit 75.
The point data collection unit 71 based on video sampling collects the spatial position data of the multiple tracking points placed on the body at different times and sends the collected motion data to the computing unit 73. The action database 72 stores the standard action data and their judgment criteria. The computing unit 73 calculates the relative spatial position relationships of the tracking-point position data according to the action requirements stored in the action database, and sends the calculated data to the analysis unit 74. The analysis unit 74 judges the input calculated data against the standard action data and judgment criteria stored for the action in the action database, and sends the judgment result to the output unit 75. The output unit 75 outputs the judgment result sent by the analysis unit and gives the corresponding prompt.
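The unit structure of Fig. 7 could be mirrored in code roughly as follows; this is a sketch under the assumption of a simple in-process pipeline, and the class and method names are illustrative, not the patent's:

```python
from typing import Callable, Dict, Tuple

Point3D = Tuple[float, float, float]
Frame = Dict[str, Point3D]          # one sampled set of tracking-point coordinates

class ActionDatabase:
    """Stores the standard action data and their judgment criteria (unit 72)."""
    def __init__(self, criteria: Dict[str, Callable[[Dict[str, float]], bool]]):
        self.criteria = criteria

class ComputingUnit:
    """Computes relative distances/angles from the raw coordinates (unit 73)."""
    def __init__(self, feature_fn: Callable[[Frame], Dict[str, float]]):
        self.feature_fn = feature_fn
    def compute(self, frame: Frame) -> Dict[str, float]:
        return self.feature_fn(frame)

class AnalysisUnit:
    """Compares the computed features with the stored criteria (unit 74)."""
    def __init__(self, db: ActionDatabase):
        self.db = db
    def judge(self, action: str, features: Dict[str, float]) -> bool:
        return self.db.criteria[action](features)

class OutputUnit:
    """Outputs the judgment result as a prompt to the user (unit 75)."""
    def prompt(self, ok: bool) -> None:
        print("Pose conforms to the standard." if ok else "Pose is not standard, please adjust.")
```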
The above are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. A computer-based yoga action recognition method, characterized in that it comprises the following steps:
S1) collecting the spatial position data of a plurality of tracking points placed on the human body at different times;
S2) calculating the relative spatial position relationships among the tracking points from their spatial position data;
S3) performing a qualification judgment on the data calculated in S2, according to the standard action data and judgment criteria stored for the action in the action database;
S4) outputting the judgment result as a prompt.
2. The method according to claim 1, characterized in that the plurality of tracking points in step S1 are divided into two groups: the three tracking points on the head and both hands are the upper-body tracking points, and the three tracking points on the waist and both feet are the lower-body tracking points.
3. The method according to claim 2, characterized in that the relative spatial position relationship in step S2 comprises: a relative distance, a relative angle, or a combination of said relative distance and said relative angle.
4. The method according to claim 3, characterized in that the relative distance is calculated between any two of the three upper-body tracking points or any two of the three lower-body tracking points: the relative distance S between a point (x_a, y_a, z_a) and a point (x_b, y_b, z_b) is calculated as S = √((x_a − x_b)² + (y_a − y_b)² + (z_a − z_b)²).
5. The method according to claim 3, characterized in that the relative angle comprises the plane-projection angle of a line between said tracking points, or the relative angle between lines connecting said tracking points.
6. The method according to claim 5, characterized in that the plane-projection angle is calculated as follows: the line L between any two of the three upper-body tracking points, or any two of the three lower-body tracking points, is projected onto a reference plane to obtain a straight line L1; L1 forms an angle with the reference axis of the reference plane, and this angle is the projection angle, on that reference plane, of the body part represented by the two points.
7. The method according to claim 5, characterized in that the relative angle is calculated as the three angles of the spatial triangle formed by the three upper-body tracking points, or by the three lower-body tracking points, using the law of cosines: C = arccos[(a² + b² − c²)/(2ab)], where C is any angle of the corresponding spatial triangle and a, b, c are the spatial lengths of its three sides.
8. The method according to claim 3, characterized in that the combined calculation of said relative distance and said relative angle comprises the steps of:
S61) calculating the relative distances: the relative distance between any two of the three upper-body tracking points, or any two of the three lower-body tracking points, namely between a point (x_a, y_a, z_a) and a point (x_b, y_b, z_b), is calculated as
S = √((x_a − x_b)² + (y_a − y_b)² + (z_a − z_b)²);
S62) calculating the relative angles: the three angles of the spatial triangle formed by the three upper-body tracking points, or by the three lower-body tracking points, are obtained from the law of cosines C = arccos[(a² + b² − c²)/(2ab)], where C is any angle of the corresponding spatial triangle and a, b, c are the spatial lengths of its three sides;
S63) calculating the plane-projection angles: the line L between any two of the three upper-body tracking points, or any two of the three lower-body tracking points, is projected onto a reference plane to obtain a straight line L1; the angle formed between L1 and the reference axis of the reference plane is the projection angle, on that reference plane, of the body part represented by the two points.
9. The method according to claim 2, characterized in that outputting the judgment result in step S4 further comprises: on the basis of the qualification judgment of S3, according to the holding time required for the different actions, calculating the average variance of each group of tracking points and the holding time of the posture, and giving an evaluation score for the action.
10. A yoga action recognition system, characterized in that it comprises:
Action database: used to store the data representing standard yoga actions and their judgment criteria;
Point data collection unit based on video sampling: used to collect the spatial position data of the multiple tracking points placed on the body at each moment, and to send the collected motion data to the computing unit;
Computing unit: calculates the relative spatial position relationships among the tracking points from their spatial position data, and sends the calculated data to the analysis unit;
Analysis unit: judges the calculated data according to the standard action data and judgment criteria stored for the action in the action database, and sends the judgment result to the output unit;
Output unit: used to output the judgment result as a prompt.
CN 200910109546 2009-08-10 2009-08-10 Yoga stance recognition method and system Pending CN101996311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200910109546 CN101996311A (en) 2009-08-10 2009-08-10 Yoga stance recognition method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200910109546 CN101996311A (en) 2009-08-10 2009-08-10 Yoga stance recognition method and system

Publications (1)

Publication Number Publication Date
CN101996311A true CN101996311A (en) 2011-03-30

Family

ID=43786450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200910109546 Pending CN101996311A (en) 2009-08-10 2009-08-10 Yoga stance recognition method and system

Country Status (1)

Country Link
CN (1) CN101996311A (en)


Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106126556A (en) * 2011-06-13 2016-11-16 索尼公司 Information processor, information processing method and computer program
CN106126556B (en) * 2011-06-13 2020-09-29 索尼公司 Information processing apparatus, information processing method, and computer program
CN104360743B (en) * 2014-11-20 2017-05-31 武汉准我飞科技有限公司 The acquisition methods of human body attitude data, system and data processing equipment
WO2016078131A1 (en) * 2014-11-20 2016-05-26 姚尧 Human body posture data acquisition method and system, and data processing device
US10395106B2 (en) 2014-11-20 2019-08-27 Wuhan Zhunwofei Science and Technology Co., LTD Human body posture data acquisition method and system, and data processing device
CN104360743A (en) * 2014-11-20 2015-02-18 姚尧 Body posture data acquisition method and system, and data processing device
CN104581079A (en) * 2015-01-21 2015-04-29 山东大学 Device and method for assisting user in using fitness equipment
CN105825325A (en) * 2016-03-10 2016-08-03 南京市建筑安装工程质量监督站 Project quality supervision personnel supervision capability evaluation method and device
CN106372294A (en) * 2016-08-30 2017-02-01 苏州品诺维新医疗科技有限公司 Method and device for correcting posture
CN106344030A (en) * 2016-08-30 2017-01-25 苏州品诺维新医疗科技有限公司 Posture correction method and device
US10737140B2 (en) 2016-09-01 2020-08-11 Catalyft Labs, Inc. Multi-functional weight rack and exercise monitoring system for tracking exercise movements
CN106774831A (en) * 2016-11-14 2017-05-31 广东小天才科技有限公司 A kind of sitting posture prompting method, apparatus and system
CN106600915A (en) * 2016-12-06 2017-04-26 广州视源电子科技股份有限公司 Method, device and system for body shape correction reminding
CN106600915B (en) * 2016-12-06 2019-03-15 广州视源电子科技股份有限公司 The method, apparatus and system that body correction is reminded
CN109308437A (en) * 2017-07-28 2019-02-05 上海形趣信息科技有限公司 Action recognition error correction method, electronic equipment, storage medium
CN109308437B (en) * 2017-07-28 2022-06-24 上海史贝斯健身管理有限公司 Motion recognition error correction method, electronic device, and storage medium
CN109344706A (en) * 2018-08-28 2019-02-15 杭州电子科技大学 It is a kind of can one man operation human body specific positions photo acquisition methods
CN109325466A (en) * 2018-10-17 2019-02-12 兰州交通大学 A kind of smart motion based on action recognition technology instructs system and method
CN109325466B (en) * 2018-10-17 2022-05-03 兰州交通大学 Intelligent motion guidance system and method based on motion recognition technology
CN111144185A (en) * 2018-11-06 2020-05-12 珠海格力电器股份有限公司 Information prompting method and device, storage medium and electronic device
CN111861275A (en) * 2020-08-03 2020-10-30 河北冀联人力资源服务集团有限公司 Method and device for identifying household working mode
CN111861275B (en) * 2020-08-03 2024-04-02 河北冀联人力资源服务集团有限公司 Household work mode identification method and device

Similar Documents

Publication Publication Date Title
CN101996311A (en) Yoga stance recognition method and system
CN101964047B (en) Multiple trace point-based human body action recognition method
US10716989B2 (en) Swing analysis method using a sweet spot trajectory
JP5858261B2 (en) Virtual golf simulation apparatus, and sensing apparatus and sensing method used therefor
KR101565739B1 (en) Movement recognition method, device and movement auxiliary device for ball games
US8175326B2 (en) Automated scoring system for athletics
US10456653B2 (en) Swing quality measurement system
US10974121B2 (en) Swing quality measurement system
US9162132B2 (en) Virtual golf simulation apparatus and sensing device and method used for the same
KR20110139694A (en) Method and system for gesture recognition
CN102243687A (en) Physical education teaching auxiliary system based on motion identification technology and implementation method of physical education teaching auxiliary system
CN102989174A (en) Method for obtaining inputs used for controlling operation of game program
CN202662011U (en) Physical education teaching auxiliary system based on motion identification technology
US20200179753A1 (en) Real time golf swing training aid
WO2021085578A1 (en) Ball tracking apparatus and ball tracking method
US11875697B2 (en) Real time sports motion training aid
WO2015098251A1 (en) Information processing device, recording medium, and information processing method
KR20020017576A (en) System and method for motion capture using camera image
CN109200562A (en) Body-sensing yoga realization method and system
Petrič et al. Real-time 3D marker tracking with a WIIMOTE stereo vision system: Application to robotic throwing
CN112847374A (en) Parabolic-object receiving robot system
KR102505470B1 (en) Management system of screen golf using LiDAR sensor and big-data
JP7027745B2 (en) Analysis device for the behavior of hitting tools
Lu et al. Binocular Vision-Based Recognition Method for Table Tennis Motion Trajectory
Lin et al. Robot catching system with stereo vision and DSP platform

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20110330