CN111744156A - Football action recognition and evaluation system and method based on wearable equipment and machine learning - Google Patents


Info

Publication number
CN111744156A
CN111744156A
Authority
CN
China
Prior art keywords
data
motion
football
wearable
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010642331.0A
Other languages
Chinese (zh)
Other versions
CN111744156B (en)
Inventor
王宇帆
范梦娇
张潇
卢姗姗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Bayun Technology Co ltd
Original Assignee
Shenzhen Bayun Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Bayun Technology Co ltd filed Critical Shenzhen Bayun Technology Co ltd
Priority to CN202010642331.0A priority Critical patent/CN111744156B/en
Publication of CN111744156A publication Critical patent/CN111744156A/en
Application granted granted Critical
Publication of CN111744156B publication Critical patent/CN111744156B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/002 Training appliances or apparatus for special sports for football
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G06N20/10 Machine learning using kernel methods, e.g. support vector machines [SVM]

Abstract

The invention relates to the technical field of motion capture and recognition, and discloses a football action recognition and evaluation system and method based on a wearable device and machine learning. The system comprises a wearable measurement sensor for collecting motion data of one or more movements of a football player's limbs; a computer for acquiring motion data from multiple groups of professional football players and football amateurs, using the data as training data to build a motion trajectory model, extracting characteristic parameters from all the data, feeding them into an SVM classifier, tuning the classifier's parameters, and obtaining a classification model once training is complete; and an intelligent mobile terminal device for feeding the characteristic parameters into the trained classification model to obtain the final classification result. The technical scheme of the invention can accurately classify football actions and grade the performers of those actions.

Description

Football action recognition and evaluation system and method based on wearable equipment and machine learning
Technical Field
The invention relates to the technical field of motion capture identification, in particular to a football motion identification and evaluation system and method based on wearable equipment and machine learning.
Background
Motion recognition is a highly active field; in particular, the recognition of human motion postures attracts a large number of researchers. Current work is mainly applied to scenarios such as medical monitoring, human fall detection, and gesture recognition, with the focus on recognizing common actions such as running, jumping, and falling; research on action recognition within individual sports, however, is still in its infancy. In fact, human posture recognition technology could be applied across sports disciplines to help coaches judge trainees' levels objectively and efficiently. Taking football as an example, skilled and accurate mastery of basic actions such as passing and shooting is a prerequisite for excellent performance by football players, so these basic actions are also a key focus of daily teaching and training.
At present, motion capture and recognition is performed in two main ways. The first is posture analysis based on computer vision: the athlete's movements are captured simultaneously from different angles by cameras, relevant visual information is extracted from the video sequence, and a suitable representation of that information is found to achieve action recognition. The second is posture analysis using inertial sensors: the athlete wears dedicated sensors to collect data, and a model is built to process the data and achieve recognition. Because of individual differences between human bodies and the diversity and complexity of actions, adequately and effectively representing action postures is a major difficulty in vision-based posture analysis, and computational cost must be balanced against accuracy in practical applications. In research tests using inertial-sensor posture analysis, the athlete often needs to wear multiple sensors, and the resulting models can only recognize relatively simple actions such as walking, running, and falling.
Generally speaking, existing research approaches are costly and impose strict requirements on the application environment: multiple high-speed cameras are often required, or multiple wearable devices must be worn while performing the sport. Moreover, most studies can only identify the type of action and cannot accurately grade the performer, so these technologies are difficult to apply in real training scenarios to help coaches analyze trainees' athletic ability quickly and efficiently.
Disclosure of Invention
The main purpose of the invention is to provide a football action recognition and evaluation system and method based on a wearable device and machine learning, aiming to recognize and classify basic football actions (full instep shot and foot arch pass) with a machine learning algorithm using only a single sensor, to grade performers of the same action, and to improve the accuracy of both action classification and performer grading.
To achieve this purpose, the football action recognition and evaluation system based on a wearable device and machine learning comprises a wearable measurement sensor, in which an inertial measurement unit and a wireless communication module are arranged; the inertial measurement unit is used to collect motion data of one or more movements of a football player's limbs;
a computer, in which an SVM classifier is provided; the computer is in data connection with the wearable measurement sensor and is used to acquire motion data from multiple groups of professional football players and football amateurs, use the data as training data, and build a motion trajectory model; characteristic parameters are extracted from all the data and fed into the SVM classifier, the classifier's parameters are tuned, and a classification model is obtained once training is complete;
a cloud storage server, in data connection with the computer, for storing data; and
an intelligent mobile terminal device, in wireless communication with the wearable measurement sensor and the cloud storage server respectively. The computer transmits the classification model to the intelligent mobile terminal device through the cloud storage server; the terminal acquires the football player's motion data in real time through the wearable sensor, extracts characteristic parameters from all the data, and feeds them into the trained classification model for classification to obtain the final classification result.
Further, a single wearable measurement sensor is provided, and the inertial measurement unit includes x-, y-, and z-axis gravitational acceleration sensors and x-, y-, and z-axis gyroscope angular velocity sensors.
Further, the inertial measurement unit employs a BMI160 integrated sensor chip.
Further, the wireless communication module adopts a DA14583 Bluetooth chip.
Further, the invention also provides a football action recognition and evaluation method based on a wearable device and machine learning, carried out with the above system and comprising the following steps:
collecting training data samples based on a single wearable measurement sensor, and cleaning and de-noising the training data;
building a motion trajectory model from the training data samples and extracting characteristic parameters from all the training data; feeding the characteristic parameters into an SVM classifier and tuning the classifier's parameters; selecting the optimal model parameters to obtain a classification model;
acquiring the football player's motion data based on a single wearable measurement sensor, cleaning and de-noising the user data, and extracting characteristic parameters from all the data;
classifying the test data samples with the trained SVM classifier: the characteristic parameters are fed into the trained classification model for classification to obtain the final classification result; and
comparing the output predictions with the ground truth and finally calculating the classification accuracy of the classifier.
Further, collecting training data samples based on a single wearable measurement sensor and cleaning and de-noising the training data comprises: calibrating the inertial sensor with a six-position test calibration method; having a tester wearing the device perform a specified action; and having the inertial measurement unit collect a set of data at each sampling point: x-, y-, and z-axis angular velocities in three-dimensional space and x-, y-, and z-axis accelerations in three-dimensional space.
Further, extracting the characteristic parameters from all the training data comprises: cutting the collected data matrix at the moments where each action is completed, dividing it into P × N data segments, where P is the number of test subjects and the segments differ in length; finding the sampling point Q corresponding to the peak in each data segment; trimming all data segments to obtain a (P × N × L) × 6 matrix Z, where L is the retained length of a single data segment with value Q + f + b; and normalizing Z to obtain Z'.
Further, extracting the characteristic parameters from all the training data further comprises: labeling each group of data according to the acquisition conditions, including the action type and the performer's grade, to obtain two label matrices of size (P × N) × 1.
Further, extracting the characteristic parameters from all the training data further comprises: calculating the Euler angles separately from the three-dimensional angular velocity values and from the three-dimensional acceleration values in each data segment. The angular velocity is related to the Euler angle rates by

    α̇ = ωx + (ωy·sin α + ωz·cos α)·tan β
    β̇ = ωy·cos α − ωz·sin α
    γ̇ = (ωy·sin α + ωz·cos α) / cos β

where ωx, ωy, ωz denote the three-dimensional angular velocity values and α, β, γ denote the Euler angles, and the acceleration is related to the Euler angles by

    α = arctan(ay / az)
    β = arctan(−ax / √(ay² + az²))

where ax, ay, az denote the three-dimensional acceleration values. Euler angle matrices are solved separately from the two relationships, and Kalman filtering fuses the matrix obtained from the angular velocity values with that obtained from the acceleration values to give a (P × N × L) × 3 attitude angle matrix, which is combined with the original (P × N × L) × 6 matrix to obtain a (P × N × L) × 9 matrix Z.
Further, the maximum, minimum, mean, variance, standard deviation, root mean square, quartile value, max-min difference, and skewness of each dimensional data stream of each data segment in Z are calculated to obtain a (P × N) × 60 feature matrix, and PCA dimensionality reduction is applied to the feature matrix.
By adopting the technical scheme of the invention, the following beneficial effects are obtained: the scheme uses a machine learning algorithm to recognize and classify basic football actions (full instep shot and foot arch pass) using only a single wearable measurement sensor, and grades performers of the same action, identifying whether the performer is a professional athlete or an amateur; at the same time, a motion trajectory model is built to extract relevant characteristic values, effectively improving the accuracy of action classification and performer grading.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention; those skilled in the art can obtain other drawings from the structures shown without creative effort.
Fig. 1 is a schematic overall framework structure diagram of a soccer action recognition and evaluation system based on wearable devices and machine learning according to an embodiment of the present invention;
fig. 2 is a schematic flow chart and structure diagram of a soccer action recognition and evaluation method based on wearable devices and machine learning according to an embodiment of the present invention;
fig. 3 is another schematic flow chart of a soccer action recognition and evaluation method based on wearable device and machine learning according to an embodiment of the present invention;
fig. 4 is a schematic euler angle rotation diagram of a soccer action recognition and evaluation method based on wearable device and machine learning according to an embodiment of the present invention;
fig. 5 is an exploded view of the euler rotation process of a soccer action recognition and evaluation method based on wearable device and machine learning according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a data acquisition manner of a soccer action recognition and evaluation method based on wearable devices and machine learning according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in those embodiments. It is obvious that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort fall within the protection scope of the present invention.
It should be noted that all directional indicators in the embodiments of the present invention (such as up, down, left, right, front, and rear) are only used to explain the relative positional relationships, movements, and so on between components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indicators change accordingly.
In addition, the technical solutions of the various embodiments may be combined with each other, provided that the combination can be realized by a person of ordinary skill in the art; when technical solutions are contradictory or a combination cannot be realized, the combination should be considered not to exist and falls outside the protection scope of the present invention.
The invention provides a football action recognition and evaluation system and method based on wearable equipment and machine learning.
As shown in fig. 1 to 6, in an embodiment of the present invention, the football action recognition and evaluation system based on a wearable device and machine learning includes a wearable measurement sensor 100; an inertial measurement unit 101 and a wireless communication module 102 are disposed in the wearable measurement sensor 100, and the inertial measurement unit 101 is configured to collect motion data of one or more movements of a football player's limbs;
a computer 200, in which an SVM classifier 201 is provided; the computer 200 is in data connection with the wearable measurement sensor 100 and is used to acquire motion data from multiple groups of professional football players and football amateurs, use the data as training data, and build a motion trajectory model; characteristic parameters are extracted from all the data and fed into the SVM classifier 201, whose parameters are tuned, and a classification model is obtained once training is complete;
a cloud storage server 400, in data connection with the computer 200 and used for storing data; and
an intelligent mobile terminal device 300, in wireless communication with the wearable measurement sensor 100 and the cloud storage server 400 respectively. The computer 200 transmits the classification model to the intelligent mobile terminal device 300 through the cloud storage server 400; the terminal acquires the football player's motion data in real time through the wearable sensor 100, extracts characteristic parameters from all the data, and feeds them into the trained classification model to obtain the final classification result.
Optionally, a single wearable measurement sensor 100 is provided, and the inertial measurement unit 101 includes x-, y-, and z-axis gravitational acceleration sensors and x-, y-, and z-axis gyroscope angular velocity sensors.
Optionally, the inertial measurement unit 101 employs a BMI160 integrated sensor chip.
Optionally, the wireless communication module 102 employs a DA14583 Bluetooth chip.
The invention also provides a football action recognition and evaluation method based on a wearable device and machine learning, carried out with the above system and comprising the following steps:
s100, acquiring training data samples based on a single wearable measuring sensor, and cleaning and de-noising the training data;
s200, establishing a motion trail model according to the training data samples, and extracting characteristic parameters of data from all training data; inputting the characteristic parameters into an SVM classifier, and performing parameter debugging on the SVM classifier; selecting optimal model parameters to obtain a classification model;
s300, acquiring motion data of a football player based on a single wearable measuring sensor, cleaning and denoising user data, and extracting characteristic parameters of the data from all the data;
s400, classifying the test data samples by using the trained SVM classifier, inputting the characteristic parameters into the trained classification model for classification, and obtaining a final classification result;
and S500, comparing the output prediction result with the real situation, and finally calculating the classification accuracy of the classifier.
As shown in fig. 3, the whole classification process includes two stages: a training phase and a testing phase.
(1) In the training phase:
acquiring the motion data of a plurality of groups of professional football players and football amateurs through wearable measuring sensors, and taking the motion data as training data;
establishing an action track model; extracting characteristic parameters of the data from all the data;
inputting the characteristic parameters into an SVM classifier, and performing parameter debugging on the classifier;
and obtaining a classification model after training.
(2) In the testing stage:
acquiring data to be classified; establishing an action track model;
extracting characteristic parameters of the data from all the data;
and inputting the characteristic parameters into the trained classification model for classification to obtain a final classification result.
The specific process is as follows:
collecting training data samples: motion data is collected using a single inertial sensor based wearable device.
The inertial sensor can measure 6 sets of data: x-axis, y-axis, z-axis angular velocities used to characterize angular velocity in three-dimensional space, and x-axis, y-axis, z-axis accelerations characterizing acceleration in three-dimensional space.
Data cleaning and denoising: low-pass filtering the data; and (6) normalizing the data.
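The low-pass filtering step can be sketched with a first-order exponential filter; this is one possible choice and an assumption, since the patent does not specify the filter design:

```python
import numpy as np

def lowpass_ema(x, alpha=0.15):
    """First-order (exponential) low-pass filter applied column-wise to
    raw samples x of shape [T, D]; smaller alpha smooths more strongly."""
    y = np.empty_like(x, dtype=float)
    y[0] = x[0]
    for t in range(1, len(x)):
        y[t] = alpha * x[t] + (1.0 - alpha) * y[t - 1]
    return y

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 200)
signal = np.sin(2.0 * np.pi * 2.0 * t)[:, None]   # slow component to keep
noisy = signal + 0.5 * rng.normal(size=(200, 1))  # additive measurement noise
smooth = lowpass_ema(noisy)
```

On this mock signal the filtered output is closer to the clean waveform than the raw samples, at the cost of a small phase lag, which is acceptable because the subsequent features are statistics over whole segments.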
Building the motion trajectory model: the attitude angles (pitch about the x-axis, yaw about the y-axis, and roll about the z-axis) are calculated separately from the three-dimensional angular velocity and the three-dimensional acceleration, and a Kalman filtering algorithm fuses the two groups of attitude angle data to obtain the final three-dimensional attitude angle.
Feature extraction: nine features are extracted from the 9-dimensional data (maximum, minimum, mean, variance, standard deviation, root mean square, quartile value, max-min difference, and skewness), giving 81 feature dimensions in total; the PCA algorithm then reduces the data dimensionality.
Training the SVM classifier and debugging its parameters: the processed data are divided into a training set and a test set; the SVM classifier is trained on the training-set data, and the optimal model parameters are selected to obtain the classifier model.
Classifying the test samples with the trained classifier: the test-set samples are classified, the output predictions are compared with the ground truth, and the classification accuracy of the classifier is finally calculated.
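The train/debug/evaluate loop above can be sketched with scikit-learn on mock features (an assumption; the patent does not name a toolkit). Grid search over C and the RBF kernel width stands in for "parameter debugging":

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# Mock PCA-reduced feature vectors for two action classes with shifted
# means (synthetic stand-ins for the patent's real sensor features).
X = np.vstack([rng.normal(0.0, 1.0, (60, 10)), rng.normal(2.5, 1.0, (60, 10))])
y = np.array([0] * 60 + [1] * 60)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# "Parameter debugging": cross-validated grid search over C and gamma,
# keeping the best model.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10], "gamma": ["scale", 0.1]}, cv=3)
grid.fit(X_tr, y_tr)
accuracy = grid.score(X_te, y_te)  # predictions compared against ground truth
```

The held-out accuracy at the end mirrors the patent's final step of comparing predictions with the real situation.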
The following is a preferred embodiment of the present invention, comprising:
step S1: calibrating the inertial sensor by adopting a six-position test calibration method; the tester wearing the equipment makes a specified action; for different types of actions, the wearable device transmits data collected by the sensor to the mobile terminal through the Bluetooth; the in-device sensor IMU may collect a set of data at each sampling point: angular velocities of an x-axis, a y-axis and a z-axis in a three-dimensional space and accelerations of the x-axis, the y-axis and the z-axis in the three-dimensional space; finally corresponding can be collected
Figure BDA0002571952850000071
N denotes the number of acquisitions, MjRepresenting the completion of a single complete actionThe number of acquisition points contained; the mobile terminal can check the change of the data in real time and store the data.
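The six-position test in step S1 is not detailed in the patent; the sketch below is the usual idea under stated assumptions (ideal axis alignment, readings already averaged per static position): each accelerometer axis is pointed up and then down against gravity, and bias and scale factor follow from the pair of readings.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def six_position_calibration(readings):
    """Estimate per-axis accelerometer bias and scale from six static
    positions, each axis pointed up (+g) and then down (-g).

    readings maps (axis, sign) pairs such as ('x', '+') to the averaged
    raw reading of that axis in that orientation."""
    bias, scale = np.zeros(3), np.zeros(3)
    for i, axis in enumerate("xyz"):
        up, down = readings[(axis, "+")], readings[(axis, "-")]
        bias[i] = (up + down) / 2.0         # offset shared by both orientations
        scale[i] = (up - down) / (2.0 * G)  # sensitivity per m/s^2
    return bias, scale

# Ideal mock sensor: bias 0.2 and unit scale on every axis.
readings = {(a, s): 0.2 + (G if s == "+" else -G) for a in "xyz" for s in "+-"}
bias, scale = six_position_calibration(readings)
```

With the mock readings the recovered bias is 0.2 and the scale is 1.0 on every axis, which is the expected inversion for an ideal sensor.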
Step S2: cut the collected data matrix at the moments where each action is completed, dividing it into P × N data segments, where P is the number of test subjects and the segments differ in length; find the sampling point Q corresponding to the peak in each data segment; trim all data segments to obtain a (P × N × L) × 6 matrix Z, where L is the retained length of a single data segment with value Q + f + b; and normalize Z to obtain Z'.
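Step S2's peak-centered trimming and normalization can be sketched as below. Locating the peak on the acceleration magnitude, and interpreting the retained window as f samples before the peak and b after it, are assumptions, since the patent does not fully specify them:

```python
import numpy as np

def segment_around_peak(segment, f=20, b=30):
    """Trim one action's 6-axis data (shape [T, 6]) to a fixed-length
    window around the peak: f samples before the peak index Q and b
    samples after, padding with edge values if the action is short."""
    acc_mag = np.linalg.norm(segment[:, 3:6], axis=1)  # assume cols 3..5 hold acceleration
    q = int(np.argmax(acc_mag))                        # sampling point Q at the peak
    start, stop = q - f, q + b
    pad_before = max(0, -start)
    pad_after = max(0, stop - len(segment))
    window = segment[max(0, start):min(len(segment), stop)]
    return np.pad(window, ((pad_before, pad_after), (0, 0)), mode="edge")

def minmax_normalize(Z):
    """Column-wise min-max normalization of the stacked matrix Z -> Z'."""
    lo, hi = Z.min(axis=0), Z.max(axis=0)
    return (Z - lo) / np.where(hi - lo == 0.0, 1.0, hi - lo)

rng = np.random.default_rng(0)
seg = rng.normal(size=(120, 6))   # one mock variable-length action segment
win = segment_around_peak(seg)    # fixed length f + b = 50 samples
```

Applying `segment_around_peak` to every raw segment and stacking the windows yields the fixed-shape matrix Z that the later feature step expects.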
Step S3: label each group of data according to the acquisition conditions, including the action type and the performer's grade, to obtain two label matrices of size (P × N) × 1.
Step S4: calculate the Euler angles separately from the three-dimensional angular velocity values and from the three-dimensional acceleration values in each data segment. The angular velocity is related to the Euler angle rates by

    α̇ = ωx + (ωy·sin α + ωz·cos α)·tan β
    β̇ = ωy·cos α − ωz·sin α
    γ̇ = (ωy·sin α + ωz·cos α) / cos β

where ωx, ωy, ωz denote the three-dimensional angular velocity values and α, β, γ denote the Euler angles, and the acceleration is related to the Euler angles by

    α = arctan(ay / az)
    β = arctan(−ax / √(ay² + az²))

where ax, ay, az denote the three-dimensional acceleration values. Kalman filtering fuses the Euler angle matrices obtained separately from the angular velocity values and the acceleration values into a (P × N × L) × 3 attitude angle matrix, which is combined with the original (P × N × L) × 6 matrix to obtain the (P × N × L) × 9 matrix Z.
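A minimal numeric sketch of step S4's two angle estimates and their fusion, assuming the standard ZYX Euler convention (α = roll, β = pitch, γ = yaw) and a constant-parameter scalar Kalman update per angle; the patent does not specify its filter parameters, so q and r here are illustrative:

```python
import numpy as np

def euler_rates(omega, roll, pitch):
    """Body angular rates -> Euler angle rates (standard ZYX convention)."""
    wx, wy, wz = omega
    droll = wx + (wy * np.sin(roll) + wz * np.cos(roll)) * np.tan(pitch)
    dpitch = wy * np.cos(roll) - wz * np.sin(roll)
    dyaw = (wy * np.sin(roll) + wz * np.cos(roll)) / np.cos(pitch)
    return droll, dpitch, dyaw

def euler_from_accel(acc):
    """Roll and pitch from the measured gravity direction (yaw unobservable)."""
    ax, ay, az = acc
    return np.arctan2(ay, az), np.arctan2(-ax, np.hypot(ay, az))

def kalman_fuse(gyro_angle, accel_angle, p, q=1e-4, r=1e-2):
    """One scalar Kalman step: the gyro-integrated angle is the prediction,
    the accelerometer-derived angle is the measurement."""
    p = p + q                 # predict: inflate the error variance
    k = p / (p + r)           # Kalman gain
    fused = gyro_angle + k * (accel_angle - gyro_angle)
    return fused, (1.0 - k) * p

# Static sensor: zero body rates, gravity along +z, slightly drifted gyro angle.
roll_a, pitch_a = euler_from_accel((0.0, 0.0, 9.81))
fused, p_next = kalman_fuse(0.05, roll_a, p=1.0)
```

For a stationary sensor the accelerometer angles are zero, so the fused roll is pulled back toward zero, which is exactly the drift correction the fusion is meant to provide.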
Step S5: calculate the maximum, minimum, mean, variance, standard deviation, root mean square, quartile value, max-min difference, and skewness of each dimensional data stream of each data segment in Z to obtain a (P × N) × 60 feature matrix, and apply PCA dimensionality reduction to the feature matrix.
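The statistics in step S5 can be computed with NumPy. Note that the flow description earlier counts nine statistics over a 9-dimensional stream (81 features) while this step states a 60-column matrix; the sketch below follows the 9 × 9 count, and the PCA reduction is implemented via an SVD of the centered features:

```python
import numpy as np

def segment_features(seg):
    """Nine statistics per column of one data segment (shape [L, D]):
    max, min, mean, variance, standard deviation, RMS, quartile value
    (25th percentile), max-min difference, and skewness."""
    mu, sd = seg.mean(axis=0), seg.std(axis=0)
    skew = ((seg - mu) ** 3).mean(axis=0) / np.where(sd == 0, 1, sd) ** 3
    feats = [
        seg.max(axis=0), seg.min(axis=0), mu, seg.var(axis=0), sd,
        np.sqrt((seg ** 2).mean(axis=0)),   # root mean square
        np.percentile(seg, 25, axis=0),     # lower quartile
        seg.max(axis=0) - seg.min(axis=0),  # max-min difference
        skew,
    ]
    return np.concatenate(feats)            # length 9 * D

def pca_reduce(X, k):
    """Project the feature matrix onto its top-k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

rng = np.random.default_rng(1)
segments = [rng.normal(size=(50, 9)) for _ in range(40)]  # P*N mock segments
X = np.stack([segment_features(s) for s in segments])     # shape (40, 81)
X_red = pca_reduce(X, k=10)                                # shape (40, 10)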
Step S6: train Support Vector Machine (SVM) models on the feature matrix obtained in the previous step, producing one SVM classifier that identifies different actions and one SVM classifier that grades the action performers.
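Step S6's two classifiers can be sketched with scikit-learn (an assumption; the patent does not name an implementation): one SVM for the action type and one for the performer grade, trained on the same feature matrix. The labels here are synthetic stand-ins:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(80, 10))          # mock PCA-reduced feature vectors
y_action = (X[:, 0] > 0).astype(int)   # synthetic: 0 = instep shot, 1 = arch pass
y_grade = (X[:, 1] > 0).astype(int)    # synthetic: 0 = amateur, 1 = professional

action_clf = SVC(kernel="linear").fit(X, y_action)  # identifies the action type
grade_clf = SVC(kernel="linear").fit(X, y_grade)    # grades the action performer

sample = X[:1]
pred = (int(action_clf.predict(sample)[0]), int(grade_clf.predict(sample)[0]))
```

Keeping the two tasks in separate classifiers matches the patent's description: the same feature vector is routed to whichever classifier the classification requirement calls for.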
Step S7: acquire the sample data to be classified in the same way as the training samples; apply the de-noising, modeling, and feature extraction of steps S2 to S5 to obtain the feature vector of the sample to be recognized; feed the feature vector into the corresponding classifier obtained in step S6 according to the classification requirement; classify; and output the classification result.
Specifically, the technical scheme uses a machine learning algorithm to recognize and classify basic football actions (full instep shot and foot arch pass) using only a single wearable measurement sensor and grades performers of the same action, identifying whether the performer is a professional athlete or an amateur; at the same time, a motion trajectory model is built to extract relevant characteristic values, effectively improving the accuracy of action classification and performer grading.
The above description is only a preferred embodiment of the present invention and is not intended to limit its scope; all modifications and equivalents made using the contents of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, are included in the protection scope of the present invention.

Claims (10)

1. A football action recognition and evaluation system based on a wearable device and machine learning, characterized by comprising a wearable measurement sensor, in which an inertial measurement unit and a wireless communication module are arranged, the inertial measurement unit being used to collect motion data of one or more movements of a football player's limbs;
the computer is internally provided with an SVM classifier and is in data connection with the wearable measuring sensors, and is used for acquiring the motion data of a plurality of groups of professional football players and football amateurs, taking the motion data as training data and establishing a motion trajectory model; extracting characteristic parameters of data from all data, inputting the characteristic parameters into the SVM classifier, debugging the parameters of the SVM classifier, and obtaining a classification model after training is completed;
the cloud storage server is in data connection with the computer and is used for storing data; and
the intelligent mobile terminal device is in wireless communication connection with the wearable measuring sensor and the cloud storage server respectively, the computer transmits the classification model to the intelligent mobile terminal device through the cloud storage server, the intelligent mobile terminal device obtains the motion data of the football player in real time through the wearable sensor, extracts the characteristic parameters of the data from all the data, inputs the characteristic parameters into the trained classification model for classification, and obtains the final classification result.
2. The wearable-device-and-machine-learning-based football action recognition and evaluation system of claim 1, wherein a single wearable measurement sensor is provided, and the inertial measurement unit comprises x-, y-, and z-axis gravitational acceleration sensors and x-, y-, and z-axis gyroscope angular velocity sensors.
3. The football action recognition and evaluation system based on wearable equipment and machine learning of claim 1, wherein the inertial measurement unit employs a BMI160 integrated sensor chip.
4. The football action recognition and evaluation system based on wearable equipment and machine learning of claim 1, wherein the wireless communication module employs a DA14583 Bluetooth chip.
5. A football action recognition and evaluation method based on wearable equipment and machine learning, carried out using the system of any one of claims 1-4, comprising the following steps:
collecting training data samples based on a single wearable measurement sensor, and cleaning and denoising the training data;
establishing a motion trajectory model according to the training data samples, and extracting characteristic parameters from all the training data; inputting the characteristic parameters into an SVM classifier and debugging its parameters; selecting the optimal model parameters to obtain a classification model;
acquiring the motion data of a football player based on a single wearable measurement sensor, cleaning and denoising the user data, and extracting characteristic parameters from all the data;
classifying the test data samples with the trained SVM classifier: inputting the characteristic parameters into the trained classification model for classification and obtaining the final classification result; and
comparing the output prediction result with the real situation, and finally calculating the classification accuracy of the classifier.
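The final step of claim 5 — comparing predictions with the real situation — reduces to an accuracy computation; a minimal sketch with made-up labels:

```python
# Compare predicted labels against ground truth and report classifier accuracy.
# The label values are illustrative, not taken from the patent.
import numpy as np
from sklearn.metrics import accuracy_score

y_true = np.array([0, 1, 2, 2, 1, 0, 3, 3])   # real action types
y_pred = np.array([0, 1, 2, 1, 1, 0, 3, 2])   # classifier predictions
print(accuracy_score(y_true, y_pred))          # 6 of 8 correct → 0.75
```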
6. The football action recognition and evaluation method based on wearable equipment and machine learning as claimed in claim 5, wherein collecting training data samples based on a single wearable measurement sensor and cleaning and denoising the training data comprises: calibrating the inertial sensor using a six-position test calibration method; having a tester wearing the device perform a specified action; and having the inertial measurement unit collect one group of data at each sampling point: the x-axis, y-axis and z-axis angular velocities in three-dimensional space, and the x-axis, y-axis and z-axis accelerations in three-dimensional space.
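The six-position test calibration named in claim 6 is not detailed in the claims. A common version solves per-axis accelerometer bias and scale factor from static readings with each axis pointed up and then down; the sketch below makes that assumption.

```python
# Assumed form of six-position accelerometer calibration: with each axis held
# static pointing up (+g) and down (-g), bias and scale follow directly.
G = 9.80665  # standard gravity, m/s^2

def six_position_calibrate(up_reading, down_reading):
    """Return (bias, scale) for one axis from its +g and -g static readings."""
    bias = (up_reading + down_reading) / 2.0
    scale = (up_reading - down_reading) / (2.0 * G)
    return bias, scale

# Hypothetical raw readings: 9.9 m/s^2 axis-up, -9.7 m/s^2 axis-down.
bias, scale = six_position_calibrate(9.9, -9.7)
corrected = (9.9 - bias) / scale  # corrected reading recovers one g
print(bias, scale, corrected)
```

The same two-orientation idea is applied to each of the three axes in turn, which is where the "six positions" come from.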
7. The football action recognition and evaluation method based on wearable equipment and machine learning as claimed in claim 5, wherein extracting characteristic parameters from all the training data comprises: segmenting the collected data matrix according to the action completion instants into P×N data segments, where P represents the number of experimental subjects and the data segments differ in length; finding the sampling point Q corresponding to the peak value in each data segment and cutting all the data segments to obtain a matrix Z of size (P×N×L)×6, where L represents the length of the retained single data segment and has a value of Q + f + b; and normalizing Z to obtain Z'.
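The peak-centred cutting and normalization of claim 7 can be sketched as follows; the window parameters f and b and the min-max normalization are illustrative choices, not taken from the patent.

```python
# Sketch of claim-7 segmentation: locate the peak sampling point Q in a
# six-channel segment, keep a fixed-length window around it, then normalise.
import numpy as np

def cut_segment(segment, f=5, b=5):
    """segment: (T, 6) array; keep f+b rows around the peak sampling point Q."""
    q = int(np.argmax(np.linalg.norm(segment, axis=1)))      # peak point Q
    start = min(max(q - b, 0), len(segment) - (f + b))       # clamp to bounds
    window = segment[start:start + f + b]
    span = window.max(axis=0) - window.min(axis=0)
    span[span == 0] = 1.0                                    # constant-column guard
    return (window - window.min(axis=0)) / span              # min-max normalise

seg = np.random.default_rng(1).normal(size=(40, 6))
z = cut_segment(seg)
print(z.shape)  # → (10, 6)
```

Applying `cut_segment` to every one of the P×N variable-length segments and stacking the results yields the fixed-size matrix Z the claim describes.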
8. The football action recognition and evaluation method based on wearable equipment and machine learning as claimed in claim 5, wherein extracting characteristic parameters from all the training data further comprises: labelling each group of data according to the data acquisition conditions, including marking the action type and the action accomplishment grade, to obtain two label matrices of size (P×N)×1.
9. The football action recognition and evaluation method based on wearable equipment and machine learning as claimed in claim 5, wherein extracting characteristic parameters from all the training data further comprises: calculating the Euler angles separately from the three-dimensional angular velocity values and the three-dimensional acceleration values in each data segment. The relationship between angular velocity and the Euler angles is

$$\begin{bmatrix} \dot{\alpha} \\ \dot{\beta} \\ \dot{\gamma} \end{bmatrix} = \begin{bmatrix} 1 & \sin\alpha\tan\beta & \cos\alpha\tan\beta \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha/\cos\beta & \cos\alpha/\cos\beta \end{bmatrix} \begin{bmatrix} \omega_x \\ \omega_y \\ \omega_z \end{bmatrix}$$

where ωx, ωy, ωz are the three-dimensional angular velocity values and α, β, γ are the Euler angles. The relationship between acceleration and the Euler angles is

$$\alpha = \arctan\frac{a_y}{a_z}, \qquad \beta = \arctan\frac{-a_x}{\sqrt{a_y^2 + a_z^2}}$$

where ax, ay, az are the three-dimensional acceleration values. Kalman filtering fusion is performed on the Euler angle matrices obtained separately from the angular velocity values and the acceleration values to give an attitude angle matrix of size (P×N×L)×3, which is combined with the original (P×N×L)×6 matrix to obtain a matrix Z of size (P×N×L)×9.
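Claim 9 specifies Kalman filtering fusion of the gyro-derived and accelerometer-derived Euler angles. The sketch below substitutes a simpler complementary filter, which blends the same two estimates with a fixed weight, purely to illustrate the fusion step; it is not the claimed Kalman filter.

```python
# Complementary-filter stand-in for the claim-9 fusion: blend the
# gyro-integrated angle with the accelerometer tilt angle each sample.
import math

def accel_roll_pitch(ax, ay, az):
    """Roll and pitch (rad) from a static acceleration reading."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def fuse(angle_prev, gyro_rate, accel_angle, dt, k=0.98):
    """Weight the integrated gyro angle against the accelerometer angle."""
    return k * (angle_prev + gyro_rate * dt) + (1 - k) * accel_angle

roll = 0.0
for _ in range(100):                            # 1 s of samples at 100 Hz
    ar, _ = accel_roll_pitch(0.0, 1.0, 1.0)     # accel implies roll = 45 deg
    roll = fuse(roll, 0.0, ar, dt=0.01)         # gyro reports no rotation
print(math.degrees(roll))                        # converges toward 45 deg
```

A Kalman filter plays the same role but adapts the blending weight from the noise statistics of the two sources instead of using a fixed k.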
10. The football action recognition and evaluation method based on wearable equipment and machine learning of claim 9, further comprising: calculating the maximum value, minimum value, average value, variance, standard deviation, root mean square, quartile, maximum-minimum difference and skewness of each dimensional data stream of each data segment in Z to obtain a feature matrix of size (P×N)×60; and performing PCA dimension-reduction processing on the feature matrix.
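The statistics of claim 10 can be computed per data-stream column and reduced with PCA as sketched below. The segment sizes are hypothetical, and computing all nine listed statistics over nine dimensions yields 81 columns here rather than the claim's stated 60, whose exact bookkeeping the claim does not specify.

```python
# Sketch of claim-10 feature extraction: nine statistics per column of each
# (L, 9) segment, stacked into a feature matrix, then PCA dimension reduction.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

def segment_features(seg):
    """seg: (L, 9) array -> one row of per-column statistics."""
    q75, q25 = np.percentile(seg, [75, 25], axis=0)
    feats = [seg.max(0), seg.min(0), seg.mean(0), seg.var(0), seg.std(0),
             np.sqrt((seg ** 2).mean(0)),          # root mean square
             q75 - q25,                            # quartile (interquartile spread)
             seg.max(0) - seg.min(0),              # maximum-minimum difference
             stats.skew(seg, axis=0)]              # skewness
    return np.concatenate(feats)

rng = np.random.default_rng(2)
F = np.stack([segment_features(rng.normal(size=(50, 9))) for _ in range(30)])
F_low = PCA(n_components=10).fit_transform(F)      # PCA dimension reduction
print(F.shape, F_low.shape)  # → (30, 81) (30, 10)
```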
CN202010642331.0A 2020-07-06 2020-07-06 Football action recognition and evaluation system and method based on wearable equipment and machine learning Active CN111744156B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010642331.0A CN111744156B (en) 2020-07-06 2020-07-06 Football action recognition and evaluation system and method based on wearable equipment and machine learning

Publications (2)

Publication Number Publication Date
CN111744156A true CN111744156A (en) 2020-10-09
CN111744156B CN111744156B (en) 2021-11-09

Family

ID=72680417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010642331.0A Active CN111744156B (en) 2020-07-06 2020-07-06 Football action recognition and evaluation system and method based on wearable equipment and machine learning

Country Status (1)

Country Link
CN (1) CN111744156B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020116147A1 (en) * 1994-11-21 2002-08-22 Vock Curtis A. Methods and systems for assessing athletic performance
CN104880190A (en) * 2015-06-02 2015-09-02 无锡北微传感科技有限公司 Intelligent chip for accelerating inertial navigation attitude fusion
US20180311563A1 (en) * 2017-05-01 2018-11-01 Heads Up Dummy Llc Training accessories and methods for improving athletic techniques
CN110327595A (en) * 2019-05-09 2019-10-15 深圳市蝙蝠云科技有限公司 Motion capture identification and assessment device and method based on wearable sensors
CN110388919A (en) * 2019-07-30 2019-10-29 上海云扩信息科技有限公司 Threedimensional model localization method in augmented reality based on characteristic pattern and inertia measurement
CN110841262A (en) * 2019-12-06 2020-02-28 郑州大学体育学院 Football training system based on wearable equipment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112754472A (en) * 2021-01-05 2021-05-07 北京诺亦腾科技有限公司 Calibration method and device for sensor in motion capture system
CN113663312A (en) * 2021-08-16 2021-11-19 东南大学 Micro-inertia-based non-apparatus body-building action quality evaluation method
CN113663312B (en) * 2021-08-16 2022-05-13 东南大学 Micro-inertia-based non-apparatus body-building action quality evaluation method
CN114239724A (en) * 2021-12-17 2022-03-25 中南民族大学 Cuball motion recognition and skill evaluation method based on inertial sensor
CN114241603A (en) * 2021-12-17 2022-03-25 中南民族大学 Shuttlecock action recognition and level grade evaluation method and system based on wearable equipment
CN114241603B (en) * 2021-12-17 2022-08-26 中南民族大学 Shuttlecock action recognition and level grade evaluation method and system based on wearable equipment

Also Published As

Publication number Publication date
CN111744156B (en) 2021-11-09

Similar Documents

Publication Publication Date Title
CN111744156B (en) Football action recognition and evaluation system and method based on wearable equipment and machine learning
Wang et al. Volleyball skill assessment using a single wearable micro inertial measurement unit at wrist
CN109643499B (en) System and method for swimming analysis
EP3086320A1 (en) Method and device for associating frames in a video of an activity of a person with an event
KR101872907B1 (en) Motion analysis appratus and method using dual smart band
Jensen et al. Classification of kinematic swimming data with emphasis on resource consumption
JP6505614B2 (en) Training classification system, training classification method and training classification server
Seeberg et al. A multi-sensor system for automatic analysis of classical cross-country skiing techniques
Brock et al. Assessing motion style errors in ski jumping using inertial sensor devices
Meghji et al. An algorithm for the automatic detection and quantification of athletes’ change of direction incidents using IMU sensor data
Groh et al. IMU-based trick classification in skateboarding
Malawski Depth versus inertial sensors in real-time sports analysis: A case study on fencing
Kim et al. Golf swing analysis system with a dual band and motion analysis algorithm
Salman et al. Classification and legality analysis of bowling action in the game of cricket
Beily et al. A sensor based on recognition activities using smartphone
Kautz et al. Sensor fusion for multi-player activity recognition in game sports
Tabrizi et al. Data acquired by a single object sensor for the detection and quality evaluation of table tennis forehand strokes
Ghobadi et al. A robust automatic gait monitoring approach using a single IMU for home-based applications
CN210078765U (en) Motion capture recognition and evaluation device based on wearable sensor
CN105105757A (en) Wearable human motion gesture track recording and assessment device
Smith et al. Automatic Classification of Locomotion in Sport: A Case Study from Elite Netball.
Schmidt et al. A wearable flexible sensor network platform for the analysis of different sport movements
Brzostowski et al. Data fusion in ubiquitous sports training: methodology and application
Wang et al. Analysis of movement effectiveness in badminton strokes with accelerometers
Saponaro et al. Estimating Physical Activity Intensity And Energy Expenditure Using Computer Vision On Videos

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant