CN113663312B - Micro-inertia-based non-apparatus body-building action quality evaluation method - Google Patents


Info

Publication number
CN113663312B
CN113663312B
Authority
CN
China
Prior art keywords
action
motion
classification
state
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110935608.3A
Other languages
Chinese (zh)
Other versions
CN113663312A (en)
Inventor
阳媛
杨浩然
王庆
王慧青
况余进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN202110935608.3A priority Critical patent/CN113663312B/en
Publication of CN113663312A publication Critical patent/CN113663312A/en
Application granted granted Critical
Publication of CN113663312B publication Critical patent/CN113663312B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 - Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 - Indicating or scoring devices for games or players, or for other sports activities
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 - Measuring of physical parameters relating to sporting activity
    • A63B2220/10 - Positions
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 - Measuring of physical parameters relating to sporting activity
    • A63B2220/80 - Special sensors, transducers or devices therefor
    • A63B2220/83 - Special sensors, transducers or devices therefor characterised by the position of the sensor
    • A63B2220/836 - Sensors arranged on the body of the user

Abstract

The invention discloses a micro-inertia-based method for evaluating the quality of apparatus-free fitness actions, which can identify such actions and evaluate their quality. The method adopts a two-step state-action classification: first, 3 six-axis micro-inertial sensors are used to identify the motion state of the human body; an Elman-Kalman trajectory estimation model is then called to predict the motion trajectory of each sensor node, and the trajectory signals together with the acquired signals are used for the second-step classification of the fitness action. After classification, the generated action sequence is compared against the sequences in a standard action sequence library, the whole-body and local standard degree and stability of the action are evaluated, and an action quality score is returned, helping people exercise efficiently and safely without equipment. The method effectively identifies the type of fitness action during a user's freehand exercise with high accuracy, and reasonably evaluates the motion posture of each part of the user's body.

Description

Micro-inertia-based non-apparatus body-building action quality evaluation method
Technical Field
The invention belongs to the field of fitness exercise, and particularly relates to a method for evaluating the quality of fitness actions without instruments based on micro-inertia.
Background
In recent years, driven by national fitness policies, the concept of fitness has become deeply rooted, and the public demand for exercise keeps growing. With the rapid development of the internet, home and online fitness programs have attracted many people to start exercising. Along with this fitness boom, however, injuries caused by improper exercise have also become a focus of public concern.
Fitness activities that lack scientific guidance tailored to one's physical condition can easily cause injury, so scientific approaches to fitness are widely sought. However, because personal trainers at gyms are expensive, part of the population chooses autonomous home workouts, where the standardization of the movements is difficult to assess: users imitate video demonstrations on their own, lack professional guidance, can hardly judge how standard their movements are during exercise, and in severe cases may suffer permanent injury.
To address these problems, Chinese patent publication No. CN104888444A provides a smart glove, method and system for identifying calorie consumption and hand gestures, which can record a user's fitness data and calorie consumption. That patent uses inertial and pressure sensors to classify strength-training postures and issues a timely warning when the hand posture deviates. However, the motion data collected and processed by the glove are rather limited: it can neither effectively evaluate the whole-body posture during exercise nor analyze the real-time hand position, so users gain little insight into the overall quality of their fitness actions. Chinese patent publication No. CN106073793B discloses a posture tracking and recognition method based on micro-inertial sensors, which uses wearable micro-inertial sensors to collect and process human posture data, tracks and recognizes the user's actions and behaviors, judges how standard the user's actions are against preset standard actions, and gives corresponding correction suggestions. That method divides a person's motion into several basic actions and classifies them by template matching; defining the template library is complex, and the library lacks position information for each body part during motion. When evaluating human actions, it measures the standardness of user actions with a time-series distance metric, ignoring the requirements a fitness action places on individual body parts: it measures only standardness while neglecting stability, and focuses on the overall difference while ignoring the action scores at local key positions.
Current fitness assistance systems lack reasonable means of overall and local quality evaluation, lack estimation of the real-time positions of body parts during human motion, and lack an efficient classification method for freehand fitness actions during training. It is therefore necessary, in order to guide users' freehand fitness training, to develop an efficient action evaluation method that covers the whole movement while taking local key positions into account.
Disclosure of Invention
Aiming at the problem that the existing fitness auxiliary system cannot evaluate the action quality of human fitness, the invention provides a micro-inertia-based non-equipment fitness action quality evaluation method which is used for guiding non-professionals to carry out fitness activities in a healthy, safe and effective manner.
The invention provides a micro-inertia based non-equipment fitness action quality evaluation method, which comprises the following steps:
(1) collecting and processing body-building exercise information;
in step (1), the fitness motion information is collected and processed: N six-axis inertial sensors are arranged on the human body, of which N_u are on the upper body, N_d on the lower body, and 1 on the waist and abdomen;
(2) a human body fitness state-action two-step classification model;
the two-step classification model of the fitness action without the apparatus comprises the following specific steps:
(2.1) the motion state of the human body is coarsely classified using the information from 3 six-axis micro-inertial sensors on the upper body, the lower body, and the waist and abdomen; the motion state is divided into five classes, specifically: upper-limb motion, lower-limb motion, waist-abdomen motion, whole-body motion, and no-motion state;
(2.2) according to the state classification result, the Elman-Kalman model of step (3) is called to obtain the coordinate trajectory information of each sensor, and fine classification of the fitness action is performed over the whole-body nodes, assigning the motion state to a specific action;
(2.3) for sensor i, the classification results, the action trajectory coordinates and the Euler angles are combined into an action quality evaluation sequence of the form

^iO = [x  y  z  α  β  γ  C1  C2]

where x, y, z are the three-dimensional coordinates of the sensor attachment site during the motion, α, β, γ are the corresponding Euler angle sequences, C1 takes values 1-5 as the state classification result, corresponding to the 5 motion states, and C2 takes values 1-m as the action classification result, corresponding to the m actions within each motion state; this value is updated after the action classification step is finished;
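The two-step scheme above (coarse state classification, then fine action classification within the predicted state) can be sketched as follows. This is a stand-in illustration, not the patent's trained classifiers: the thresholds, feature choices, and action names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-state action labels; the patent's full action list is in claim 2.
ACTIONS = {"upper_limb": ["push_up", "lateral_raise"],
           "lower_limb": ["squat", "lunge"]}

def classify_state(window):
    """Step 1: coarse state classification from the 3 torso/limb sensors.
    window: (3, 100, 9) array of samples; columns 3:6 are the gyro axes.
    Stand-in rule: pick the state whose sensor shows dominant gyro energy,
    falling back to no-motion / whole-body states."""
    energy = np.abs(window[:, :, 3:6]).mean(axis=(1, 2))  # per-sensor gyro energy
    if energy.max() < 0.05:
        return "no_motion"
    if energy.std() < 0.01:          # all sensors equally active
        return "whole_body"
    return ["upper_limb", "lower_limb", "waist_abdomen"][int(energy.argmax())]

def classify_action(state, features, centroids):
    """Step 2: fine action classification within the predicted state; here a
    nearest-centroid matcher over (assumed) trajectory + IMU features."""
    labels = ACTIONS[state]
    dists = [np.linalg.norm(features - centroids[a]) for a in labels]
    return labels[int(np.argmin(dists))]

# Demo: a window whose upper-body sensor shows strong rotation.
win = rng.normal(0.0, 0.01, size=(3, 100, 9))
win[0, :, 3:6] += 1.0
state = classify_state(win)
centroids = {"push_up": np.zeros(4), "lateral_raise": np.ones(4)}
action = classify_action(state, np.array([0.1, 0.0, 0.1, 0.0]), centroids)
print(state, action)
```

In the patent the two steps use learned models (claim language mentions SVMs, decision trees, random forests, neural networks); the rule-based step 1 here only mirrors the data flow.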
(3) an Elman-Kalman trajectory estimation model of human body motion;
the Elman-Kalman trajectory estimation model comprises the following specific steps:
(3.1) aiming at the first four motion states of the coarse classification result in the step (2), establishing 4 corresponding motion track estimation models for each sensor;
(3.2) the Elman-Kalman trajectory estimation model structure is specifically that in a hidden layer of an Elman neural network, estimation on future data is added, and an extended Kalman filter is constructed in an output layer to optimize a coordinate result;
in the hidden layer, a single-step prediction unit x_p(k) = f(w1 u(k) + w2 x_c(k+1) + b1) is added, so that the output vector of the hidden layer becomes x(k) = f(w1 u(k-1) + w2 x_c(k) + w3 x_p(k) + b1), where w1, w2, w3 are weights, u is the input vector of the neural network, x_c is the hidden-layer feedback vector, b1 is the bias, f is the sigmoid function, and k is the time step;
an extended Kalman filter is added at the output layer of the Elman network to calibrate the raw output p̂(k), the estimated rough three-dimensional trajectory coordinates. The state equation (given in the original as an equation image) models the motion about the joint fulcrum; the observation equation is set to y(k) = p(k) + δ_k; the Jacobian matrix of the state equation (also an equation image) depends on ω_k, the angular velocity at time k; η_k and δ_k are Gaussian white noise;
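A minimal sketch of the modified Elman hidden layer described above, with the added one-step prediction unit. The weights are random placeholders (untrained) and the layer sizes are assumptions; the patent's x_c(k+1) future-context term is approximated by the current context, which is a simplification.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ElmanPredictCell:
    """One hidden-layer step of the modified Elman network: a prediction unit
    x_p(k) is computed alongside the usual context feedback, then mixed into
    the hidden state x(k) = f(w1 u(k-1) + w2 x_c(k) + w3 x_p(k) + b1)."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.1, (n_hidden, n_in))      # input weights
        self.w2 = rng.normal(0, 0.1, (n_hidden, n_hidden))  # context weights
        self.w3 = rng.normal(0, 0.1, (n_hidden, n_hidden))  # prediction-unit weights
        self.b1 = np.zeros(n_hidden)
        self.xc = np.zeros(n_hidden)                        # context vector

    def step(self, u_prev, u_curr):
        # Prediction unit looks one step ahead via the current input u(k).
        # (The patent writes x_c(k+1) here; the current context is a stand-in.)
        xp = sigmoid(self.w1 @ u_curr + self.w2 @ self.xc + self.b1)
        x = sigmoid(self.w1 @ u_prev + self.w2 @ self.xc + self.w3 @ xp + self.b1)
        self.xc = x
        return x

cell = ElmanPredictCell(n_in=9, n_hidden=16)
u = np.random.default_rng(1).normal(size=(5, 9))  # 5 fake IMU rows (acc, gyro, Euler)
for k in range(1, 5):
    h = cell.step(u[k - 1], u[k])
print(h.shape)
```

The output layer (mapping h to rough 3-D coordinates, then the EKF calibration) is omitted here for brevity.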
(3.3) the outputs used when training the neural network are three-dimensional coordinates collected by an optical motion-capture system;
(4) analyzing the posture and track coordinate information of key nodes of the whole body, and evaluating the action quality;
the action quality evaluation method comprises the following specific steps:
(4.1) the action sequences {^1O, ^2O, ..., ^NO} are compared with the corresponding sequences {^1O_s, ^2O_s, ..., ^NO_s} in the standard action sequence library, and the action quality is analyzed;
in the standard action sequence library, the sequence format for a sensor i (given in the original as an equation image) additionally carries the scoring weights w_1i and w_2i, which are set manually according to the action type and the sensor position;
(4.2) from ^iO and ^iO_s, the standard degree E_i and the stability S_i corresponding to the action quality are calculated (the exact formulas are given in the original as equation images; std denotes the standard deviation);
the quality score of the action under evaluation, over the joint whole-body nodes, is then the sequence {Q, q_1, q_2, ..., q_N}, where the total score Q is computed from the per-part scores (formula given in the original as an equation image), q_min and q_max denote the minimum and maximum per-part scores, and the action quality score of each body part is q_i = w_1i E_i + w_2i S_i, with w_1i, w_2i retrieved directly from the standard library according to the action type (C_1, C_2).
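The per-node scoring q_i = w_1i E_i + w_2i S_i can be sketched as below. Since the patent's exact E_i and S_i formulas survive only as equation images, the error-based standard degree and the std-based stability used here are placeholder assumptions that merely respect the stated ingredients (deviation from the reference sequence, and standard deviation).

```python
import numpy as np

def standard_degree(traj, ref):
    """Placeholder for E_i (the patent's exact formula is an equation image):
    a normalized inverse of the mean pointwise deviation from the reference."""
    err = np.linalg.norm(traj - ref, axis=1).mean()
    return 1.0 / (1.0 + err)

def stability(traj, ref):
    """Placeholder for S_i: the patent states S_i is built from std; here we
    compare per-axis standard deviations of user and reference trajectories."""
    d = np.abs(traj.std(axis=0) - ref.std(axis=0)).mean()
    return 1.0 / (1.0 + d)

def node_score(traj, ref, w1, w2):
    # q_i = w1_i * E_i + w2_i * S_i; in the patent the weights are looked up
    # from the standard library by the action type (C1, C2).
    return w1 * standard_degree(traj, ref) + w2 * stability(traj, ref)

rng = np.random.default_rng(2)
ref = np.sin(np.linspace(0, 2 * np.pi, 100))[:, None] * np.ones((1, 3))
user = ref + rng.normal(0, 0.02, ref.shape)   # a near-perfect repetition
q = node_score(user, ref, w1=0.6, w2=0.4)
print(round(q, 3))
```

A perfect repetition scores 1.0 under both placeholder measures, and the score decreases smoothly as the trajectory drifts from the reference.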
As a further improvement of the invention, the specific freehand fitness actions identified in step (2.2) comprise:
upper-limb actions: chest expansion, incline push-ups, kneeling push-ups, standard push-ups, wide-grip push-ups, bodyweight lateral raises, bodyweight Cuban presses;
lower-limb actions: squats, squat jumps, pistol squats, walking lunges, alternating jump lunges, duck walks, wall sits;
waist-abdomen actions: sit-ups, crunches, boat rocks, planks, knee planks, side planks, in-place crawls, supine leg raises;
whole-body actions: mountain climbers, bear crawls, burpees, crab walks, turn-backs, tuck jumps.
as a further improvement of the invention, the first 6 columns of data of the action sequence of the action classification of the step (2.3) are all normalized data.
Compared with the prior art, the invention has the advantages that:
the method of the invention sequentially classifies the motion state and the body-building action of the human body by applying a two-step classification algorithm, and can effectively improve the identification rate of the category of the body-building action of the human body by introducing the estimated motion track coordinate of the human body when the body-building action is classified in the second step. When the human body action quality is evaluated, the nodes of the whole body are subjected to fusion evaluation, so that not only can the whole action quality be fed back, but also the action quality of each part of the body can be evaluated.
The invention proposes an Elman-Kalman trajectory estimation algorithm for estimating the motion trajectory of each body part, which effectively avoids the difficult initial alignment and strong environmental sensitivity of traditional pure-inertial dead reckoning, and can be widely applied to trajectory recognition of fixed, repeated actions.
Drawings
FIG. 1 is a flow chart of an implementation of a method for evaluating the quality of a non-mechanical fitness action based on micro-inertia;
FIG. 2 is a schematic diagram of a state-action two-step classification algorithm;
FIG. 3 is an Algorithm diagram of the Elman-Kalman trajectory estimation of a single node.
Detailed Description
The invention is described in further detail below with reference to the following detailed description and accompanying drawings:
the invention provides a micro-inertia based fitness action quality evaluation method without equipment, which is used for guiding non-professional persons to carry out fitness activities in a healthy, safe and effective manner.
FIG. 1 is a schematic flow chart of the method of the present invention.
Step S1: collecting and transmitting body-building exercise data. The method specifically comprises the following steps:
s1.1, laying 6-axis inertial sensors at 9 key positions of action postures of a human body to be evaluated, wherein the 6-axis inertial sensors comprise 4 upper limbs (a left forearm, a left upper limb, a right forearm and a right upper limb), 4 lower limbs (a left calf, a left thigh, a right calf and a right thigh) and 1 waist;
s1.2, returning acquired sensor data, triaxial acceleration, triaxial angular velocity and Euler angle, wherein the data acquisition frequency is 50 Hz:
a state-action two-step classification algorithm is shown in fig. 2.
Step S2: the motion information is used to identify the human fitness action, adopting the state-action two-step classification algorithm. The method specifically comprises the following steps:
s2.1, dividing the returned data of each sensor, wherein each two seconds is an action sequence, the size of each sequence is 100 multiplied by 9, one action is,iO′k=[ak ωk αk βk γk]including three-axis acceleration, three-axis angular velocity, and euler angles.
S2.2, the human motion state is divided into 5 classes using machine-learning algorithms, including but not limited to support vector machines, decision trees, random forests and neural networks, representing the upper-limb, lower-limb, waist-abdomen, whole-body and no-motion states respectively, and labeled with the value C_1; the action sequence is extended to 100 × 10, each row becoming ^iO′_k = [a_k ω_k α_k β_k γ_k C_1], i.e. the motion-state label is appended.
S2.3, according to the motion state, i.e. the value of C_1, the corresponding Elman-Kalman motion trajectory estimation model is called to estimate the motion trajectory of the corresponding nodes. The model is illustrated for a single node.
Fig. 3 is a block diagram of a trajectory estimation model of a single node.
The input of the model is the first 9 columns of the action sequence, where u(k-1) = (a_{k-1}, ω_{k-1}, α_{k-1}, β_{k-1}, γ_{k-1}). First, a one-step-ahead prediction unit x_p(k) = f(w1 u(k) + w2 x_c(k+1) + b1) is added to the hidden layer, so the output of the hidden layer becomes x(k) = f(w1 u(k-1) + w2 x_c(k) + w3 x_p(k) + b1), where f is the sigmoid function. Extended Kalman filtering is then added at the output layer to further calibrate the estimate. At any instant the fitness action can be regarded as uniformly accelerated motion about a fulcrum, so the state equation can be set accordingly:
(the state equation, the definition of the coordinate estimate, and its Jacobian matrix are given in the original as equation images)
The observation equation is y(k) = p(k) + δ_k, where η_k and δ_k are Gaussian noise. The output is the estimated three-dimensional coordinate p(k) = (x_k y_k z_k)^T. During model training, the ground-truth coordinates are obtained from an optical motion-capture system.
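A generic extended-Kalman calibration loop matching the structure described above, with the identity observation model y(k) = p(k) + δ_k. The concrete state-transition model f and its Jacobian are supplied by the caller, because the patent's fulcrum-rotation model survives only as an equation image; the constant-position demo model here is purely illustrative.

```python
import numpy as np

def ekf_step(p, P, f, F_jac, y, Q, R):
    """One extended-Kalman step calibrating a 3-D coordinate estimate p.
    f / F_jac: state transition and its Jacobian (the patent's specific model
    is given only as an image, so any concrete f is an assumption).
    y: observation, here the rough coordinate from the Elman network."""
    # Predict
    p_pred = f(p)
    F = F_jac(p)
    P_pred = F @ P @ F.T + Q
    # Update with y(k) = p(k) + delta_k  (identity observation matrix)
    S = P_pred + R
    K = P_pred @ np.linalg.solve(S, np.eye(3))   # Kalman gain
    p_new = p_pred + K @ (y - p_pred)
    P_new = (np.eye(3) - K) @ P_pred
    return p_new, P_new

# Demo with a trivial constant-position model: the estimate is pulled toward
# a repeated noisy-free observation at [1, 0, 0].
f = lambda p: p
F_jac = lambda p: np.eye(3)
p, P = np.zeros(3), np.eye(3)
Q, R = 0.01 * np.eye(3), 0.1 * np.eye(3)
for y in [np.array([1.0, 0.0, 0.0])] * 20:
    p, P = ekf_step(p, P, f, F_jac, y, Q, R)
print(np.round(p, 2))
```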
S2.4, after the trajectory estimate is obtained, the action sequence of each sensor is updated with the estimated coordinates (the extended sequence format is given in the original as an equation image) and applied to the second-step action classification; after the classification result is obtained, the action sequences are further updated to {^1O, ^2O, ..., ^NO}, whose per-sensor format is likewise given in the original as an equation image.
Step S3: analyzing the posture and track coordinate information of the key nodes of the whole body, and evaluating the action quality, wherein the method specifically comprises the following steps:
s3.1, making a check on the action sequence formed in step 21O,2O,...,NO } according to C1,C2Calling a corresponding sequence in a standard action sequence library1Os,2Os,...,NOsComparative analysis was performed.
S3.2, the standard degree and stability of all nodes are analyzed: for node i, the standard degree E_i and the stability S_i are computed (the formulas are given in the original as equation images; std denotes the standard deviation).
S3.3, for node i, the local action score is q_i = w_1i E_i + w_2i S_i, where the scoring weights w_1i, w_2i are set manually according to the action type and retrieved by the values of C_1, C_2.
S3.4, the total whole-body action score Q is computed (formula given in the original as an equation image), and the final evaluation fed back to the user is the score sequence {Q, q_1, q_2, ..., q_N}, which measures both the overall exercise quality and how well each body part meets the standard, guiding the user to exercise efficiently and safely.
The above description is only one of the preferred embodiments of the present invention, and is not intended to limit the present invention in any way, but any modifications or equivalent variations made in accordance with the technical spirit of the present invention are within the scope of the present invention as claimed.

Claims (3)

1. A method for evaluating the quality of non-equipment fitness actions based on micro-inertia is characterized by comprising the following steps:
(1) collecting and processing body-building exercise information;
in step (1), the fitness motion information is collected and processed: N six-axis inertial sensors are arranged on the human body, of which N_u are on the upper body, N_d on the lower body, and 1 on the waist and abdomen;
(2) a human body fitness state-action two-step classification model;
the two-step classification model of the fitness action without the apparatus comprises the following specific steps:
(2.1) the acceleration, angular velocity, and solved Euler-angle information collected by 3 six-axis micro-inertial sensors, one each on the upper body, the lower body, and the waist-abdomen, are used to coarsely classify the human motion state; the motion state is divided into five classes, specifically: upper-limb motion, lower-limb motion, waist-abdomen motion, whole-body motion, and no-motion state;
(2.2) according to the state classification result, the Elman-Kalman model of step (3) is called to obtain the coordinate trajectory information of each sensor, and fine classification of the fitness motion is performed according to the attitude angle information of the whole-body nodes, assigning the motion state to a specific action;
(2.3) for sensor i, the classification results, the action trajectory coordinates and the Euler angles are combined into an action quality evaluation sequence of the form

^iO = [x  y  z  α  β  γ  C1  C2]

where x, y, z are the three-dimensional coordinates of the sensor attachment site during the motion, α, β, γ are the corresponding Euler angle sequences, C1 takes values 1-5 as the state classification result, corresponding to the 5 motion states, and C2 takes values 1-m as the action classification result, corresponding to the m actions within each motion state; this value is updated after the action classification step is finished;
(3) an Elman-Kalman trajectory estimation model of human motion;
the Elman-Kalman trajectory estimation model comprises the following specific steps:
(3.1) aiming at the first four motion states of the state classification result in the step (2), establishing 4 corresponding motion track estimation models for each sensor;
(3.2) the Elman-Kalman trajectory estimation model structure is specifically that in a hidden layer of an Elman neural network, estimation on future data is added, and an extended Kalman filter is constructed in an output layer to optimize a coordinate result;
in the hidden layer, a single-step prediction unit x_p(k) = f(w1 u(k) + w2 x_c(k+1) + b1) is added, so that the output vector of the hidden layer becomes x(k) = f(w1 u(k-1) + w2 x_c(k) + w3 x_p(k) + b1), where w1, w2, w3 are weights, u is the input vector of the neural network, x_c is the hidden-layer feedback vector, b1 is the bias, f is the sigmoid function, and k is the time step;
an extended Kalman filter is added at the output layer of the Elman network to calibrate the raw output p̂(k), the estimated rough three-dimensional trajectory coordinates. The state equation (given in the original as an equation image) models the motion about the joint fulcrum; the observation equation is set to y(k) = p(k) + δ_k; the Jacobian matrix of the state equation (also an equation image) depends on ω_k, the angular velocity at time k; η_k and δ_k are Gaussian white noise;
(3.3) the outputs used when training the neural network are three-dimensional coordinates collected by an optical motion-capture system;
(4) analyzing the posture and track coordinate information of key nodes of the whole body, and evaluating the action quality;
the action quality evaluation method comprises the following specific steps:
(4.1) the action sequences {^1O, ^2O, ..., ^NO} are compared with the corresponding sequences {^1O_s, ^2O_s, ..., ^NO_s} in the standard action sequence library, and the action quality is analyzed;
in the standard action sequence library, the sequence format for a sensor i (given in the original as an equation image) additionally carries the scoring weights w_1i and w_2i, which are set manually according to the action type and the sensor position;
(4.2) from ^iO and ^iO_s, the standard degree E_i and the stability S_i corresponding to the action quality are calculated (the exact formulas are given in the original as equation images; std denotes the standard deviation);
the quality score of the action under evaluation, over the joint whole-body nodes, is then the sequence {Q, q_1, q_2, ..., q_N}, where the total score Q is computed from the per-part scores (formula given in the original as an equation image), q_min and q_max denote the minimum and maximum per-part scores, and the action quality score of each body part is q_i = w_1i E_i + w_2i S_i, with w_1i, w_2i retrieved directly from the standard library according to the action type (C_1, C_2).
2. The method according to claim 1, wherein the specific freehand fitness actions identified in step (2.2) comprise:
upper-limb actions: chest expansion, incline push-ups, kneeling push-ups, standard push-ups, wide-grip push-ups, bodyweight lateral raises, bodyweight Cuban presses;
lower-limb actions: squats, squat jumps, pistol squats, walking lunges, alternating jump lunges, duck walks, wall sits;
waist-abdomen actions: sit-ups, crunches, boat rocks, planks, knee planks, side planks, in-place crawls, supine leg raises;
whole-body actions: mountain climbers, bear crawls, burpees, crab walks, turn-backs, tuck jumps.
3. The method according to claim 1, wherein the first 6 columns of the action sequence used for the action classification in step (2.3) are all normalized data.
CN202110935608.3A 2021-08-16 2021-08-16 Micro-inertia-based non-apparatus body-building action quality evaluation method Active CN113663312B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110935608.3A CN113663312B (en) 2021-08-16 2021-08-16 Micro-inertia-based non-apparatus body-building action quality evaluation method


Publications (2)

Publication Number Publication Date
CN113663312A CN113663312A (en) 2021-11-19
CN113663312B (en) 2022-05-13

Family

ID=78543007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110935608.3A Active CN113663312B (en) 2021-08-16 2021-08-16 Micro-inertia-based non-apparatus body-building action quality evaluation method

Country Status (1)

Country Link
CN (1) CN113663312B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114146363B (en) * 2021-12-14 2022-08-30 国家康复辅具研究中心 Walking aid training system and integrated control method thereof
CN116259111A (en) * 2023-05-15 2023-06-13 江西工业贸易职业技术学院 VR-based sports action scoring method, VR-based sports action scoring system, electronic device and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008312215A (en) * 2007-06-18 2008-12-25 Sony (China) Ltd Video-image analyzer, video-image analyzing method, automatic digest preparation system, and automatic highlight extraction system
CN103100205A (en) * 2012-12-12 2013-05-15 徐玉文 Auxiliary device suitable for middle and primary school physical education evaluation and achieving method thereof
CN105144194A (en) * 2013-03-07 2015-12-09 阿尔派回放股份有限公司 Systems and methods for identifying and characterizing athletic maneuvers
CN106073793A (en) * 2016-06-13 2016-11-09 中南大学 Attitude Tracking based on micro-inertia sensor and recognition methods
CN108245880A (en) * 2018-01-05 2018-07-06 华东师范大学 Body-sensing detection method for visualizing and system based on more wearing annulus sensor fusions
BR202017002365U2 (en) * 2018-08-21 CONSTRUCTIVE ARRANGEMENT APPLIED TO SPORT ARC
CN109919034A (en) * 2019-01-31 2019-06-21 厦门大学 A kind of identification of limb action with correct auxiliary training system and method
CN111249691A (en) * 2018-11-30 2020-06-09 百度在线网络技术(北京)有限公司 Athlete training method and system based on body shape recognition
CN111631698A (en) * 2020-05-12 2020-09-08 东南大学 Wearable blood pressure monitoring and correcting method based on motion mode cascade constraint
CN111652078A (en) * 2020-05-11 2020-09-11 浙江大学 Yoga action guidance system and method based on computer vision
CN111744156A (en) * 2020-07-06 2020-10-09 深圳市蝙蝠云科技有限公司 Football action recognition and evaluation system and method based on wearable equipment and machine learning
CN112945225A (en) * 2021-01-19 2021-06-11 西安理工大学 Attitude calculation system and method based on extended Kalman filtering

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015002912A (en) * 2013-06-21 2015-01-08 セイコーエプソン株式会社 Motion analysis device and motion analysis program
US20200197744A1 (en) * 2018-12-21 2020-06-25 Motion Scientific Inc. Method and system for motion measurement and rehabilitation


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Adaptive UKF integrated attitude-measurement filtering algorithm based on genetic fuzzy inference; Xiao Wenjian et al.; Aerospace Control; 2013-10-15 (No. 05); full text *
Motion-blurred target tracking algorithm based on efficient second-order minimization; Zhao Yue et al.; Journal of Northeastern University (Natural Science); 2013-12-15 (No. 12); full text *
Design of a wearable neck-motion detection system; Sheng Xining; Forestry Machinery & Woodworking Equipment; 2020-09-11 (No. 09); full text *

Also Published As

Publication number Publication date
CN113663312A (en) 2021-11-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant