CN111401435B - Human body motion mode identification method based on motion bracelet - Google Patents
- Publication number: CN111401435B
- Application number: CN202010173918.1A
- Authority: CN (China)
- Prior art keywords: acceleration, data, window, directions, motion
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/2415 — Classification techniques relating to the classification model based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- A61B5/1118 — Determining activity level
- A61B5/1123 — Discriminating type of movement, e.g. walking or running
- A61B5/6802 — Sensor mounted on worn items
- A61B5/7203 — Signal processing specially adapted for physiological signals or for diagnostic purposes, for noise prevention, reduction or removal
- G06F2218/02 — Aspects of pattern recognition specially adapted for signal processing: preprocessing
- G06F2218/08 — Aspects of pattern recognition specially adapted for signal processing: feature extraction
Abstract
The invention discloses a human body motion pattern recognition method based on a sports bracelet, belonging to the technical field of intelligent wearable devices. The method recognizes human motion patterns through the steps of collecting human motion data, segmenting the data into samples, filtering, extracting features, and selecting features. Because it computes only simple statistics such as the mean and variance of each window's acceleration data over a period of time and uses a simple decision tree classification model for recognition, the time complexity is low, the algorithm is simple, and recognition is fast. At the same time, the features most useful for classification are extracted and selected, and a decision tree model based on probability statistics is then used for classification, ensuring high recognition accuracy. When constructing the experimental samples, the fifty-percent overlap between adjacent windows preserves the continuity of human motion and the accuracy of the experimental results.
Description
Technical Field
The invention relates to the technical field of intelligent wearable devices, and in particular to a human body motion pattern recognition method based on a sports bracelet.
Background
With the popularization of miniature intelligent wearable devices, their computing power keeps growing, and people can exploit their wearability and computing performance to make daily life more convenient. Recognizing human body motion is a valuable line of contemporary research: it is a core component of many high-impact applications, such as health and fitness monitoring, personal biometrics, urban computing, assistive technology, elderly care, and indoor positioning and navigation. For example, by wearing a smart bracelet a user's basic daily exercise can be recorded; from this information the amount of exercise in different motion modes can be calculated, helping users understand their exercise situation in time, make adjustments, and schedule various kinds of exercise more scientifically.
With the development of artificial intelligence, motion pattern recognition has become feasible using a bracelet with an embedded acceleration sensor combined with machine learning and data mining algorithms. Scholars at home and abroad have carried out research on human motion pattern recognition and achieved fruitful results, for example recognizing motion patterns from bio-signals generated by the human body during motion. Zhang et al. used surface electromyography (sEMG) sensors to capture muscle activity information on the skin over the corresponding muscle groups and to detect and recognize gesture commands. Peng et al. recognized gestures by processing gesture images captured by a camera and analyzing them with computer-vision algorithms. In addition, Zhao, Zhang et al. extracted the Hu invariant moments of an image as the feature vector of a gesture and then computed the distance between the feature vectors of the input gesture and the template gesture images to achieve static gesture recognition, with a recognition rate of 97.4%. Other researchers monitored key gait events by collecting plantar pressure information and extracted features from lower-limb acceleration data around those events, thereby recognizing four common behaviors (walking, running, going upstairs, and going downstairs). However, these approaches either cost too much and use overly complex algorithms, or their recognition accuracy is too low to meet practical requirements. A new method for recognizing human motion patterns is therefore needed.
A related Chinese patent, application number ZL201710866208.5, is titled "Motion state identification method and system and animal behavior identification system", filed September 22, 2017. That application collects three-dimensional acceleration data of a target with an acceleration sensor, computes the resultant acceleration from the three-dimensional data, and extracts feature information of the resultant acceleration; the feature information is fed into a decision tree model, recognized by the model's nodes, and the motion state of the target is thereby determined. However, because that application extracts only features of the resultant acceleration as identification information, it cannot accurately identify the motion state of a human body.
Disclosure of Invention
1. Technical problem to be solved by the invention
In view of the complex algorithms and low recognition accuracy of prior-art motion pattern recognition, the invention provides a human motion pattern recognition method based on a sports bracelet.
2. Technical scheme
In order to achieve the purpose, the technical scheme provided by the invention is as follows:
the invention discloses a human motion mode identification method based on a motion bracelet, which comprises the following steps:
step one, acquiring acceleration data of human body movement using a sports bracelet;
step two, segmenting the data acquired in step one and selecting a suitable window width;
step three, filtering the acceleration data in the three directions of each window;
step four, extracting features from the acceleration data filtered in step three;
step five, screening the features extracted in step four with a feature selection algorithm and discarding features that do not contribute to the classification algorithm;
step six, feeding the screened feature data into a classification model and tuning the model parameters to optimize the model.
Furthermore, in step one, after the motion data is obtained, acceleration data of other actions is removed, and only the acceleration data of the five motions of walking, running, playing badminton, playing table tennis, and rowing is kept.
Furthermore, in step two, the accelerations in the X, Y, and Z directions are denoted a_x, a_y, a_z; 4 seconds is selected as the window width, and adjacent windows overlap by fifty percent.
Furthermore, in step three, a first-order low-pass filtering algorithm is used to filter the acceleration signals in the three directions, as shown in formula (1):

F(n) = m·X(n) + (1 - m)·F(n-1) (1)

where F(n) is the current filtering output value, m is the filtering coefficient, X(n) is the n-th sampled value, and F(n-1) is the previous filtering output value.
Furthermore, in step four, the feature values are extracted using the following formulas:

S1, feature values reflecting the motion intensity in each direction: the mean values ā_x, ā_y, ā_z of the accelerations in the three directions, calculated with formulas (2), (3), and (4):

ā_x = (1/n)·Σ_{k=1..n} a_{x,k} (2)
ā_y = (1/n)·Σ_{k=1..n} a_{y,k} (3)
ā_z = (1/n)·Σ_{k=1..n} a_{z,k} (4)

where a_{x,k}, a_{y,k}, a_{z,k} are the X-, Y-, and Z-axis acceleration signals at the k-th sampling point in a window, and n is the number of sampling points per window;

S2, a feature value characterizing the overall intensity of a motion mode: the mean value ā of the resultant acceleration, calculated with formulas (5) and (6):

a_k = sqrt(a_{x,k}² + a_{y,k}² + a_{z,k}²) (5)
ā = (1/n)·Σ_{k=1..n} a_k (6)

where a_k in formula (5) is the magnitude of the three-axis resultant acceleration at the k-th sampling point in the window; substituting a_k into formula (6) gives the mean resultant acceleration ā;

S3, the acceleration variances and standard deviations in the three directions: the variances σ²_x, σ²_y, σ²_z are calculated with formulas (7), (8), and (9):

σ²_x = (1/n)·Σ_{k=1..n} (a_{x,k} - ā_x)² (7)
σ²_y = (1/n)·Σ_{k=1..n} (a_{y,k} - ā_y)² (8)
σ²_z = (1/n)·Σ_{k=1..n} (a_{z,k} - ā_z)² (9)

taking the square roots of σ²_x, σ²_y, σ²_z gives the acceleration standard deviations σ_x, σ_y, σ_z;

S4, the resultant acceleration variance σ², calculated with formula (10):

σ² = (1/n)·Σ_{k=1..n} (a_k - ā)² (10)

S5, the acceleration peak-valley value a_pv, the difference between the maximum Max(a) and the minimum Min(a) of the acceleration:

a_pv = Max(a) - Min(a) (11)

formula (11) gives the peak-valley value of the resultant acceleration and the peak-valley values of the accelerations in the three directions, respectively;

S6, the three pairwise correlation coefficients ρ(X,Y), ρ(X,Z), ρ(Y,Z) of the accelerations in the three directions, calculated with formulas (12), (13), and (14):

ρ(X,Y) = cov(X,Y)/(σ_x·σ_y) (12)
ρ(X,Z) = cov(X,Z)/(σ_x·σ_z) (13)
ρ(Y,Z) = cov(Y,Z)/(σ_y·σ_z) (14)

where cov(X,Y), cov(X,Z), cov(Y,Z) are the covariances of the accelerations in each pair of directions, calculated as:

cov(X,Y) = E(XY) - E(X)E(Y) (15)
cov(X,Z) = E(XZ) - E(X)E(Z) (16)
cov(Y,Z) = E(YZ) - E(Y)E(Z) (17)

where E(X) is the expectation of the X-direction acceleration over the 4 seconds, E(Y) the expectation of the Y-direction acceleration, E(Z) the expectation of the Z-direction acceleration, E(XY) the expectation of the product of the X- and Y-direction accelerations, E(XZ) the expectation of the product of the X- and Z-direction accelerations, and E(YZ) the expectation of the product of the Y- and Z-direction accelerations.
Furthermore, in step five, the importance values of all features are calculated with a decision-tree-based feature selection algorithm, and 14 features are selected according to their importance values.
Furthermore, in step six, the feature set of the data in each sliding window forms one complete sample; the samples are divided into training samples and test samples, a Decision Tree classification model is selected for classification, and the model parameters are tuned repeatedly with the training samples to optimize the model.
3. Advantageous effects
Compared with the prior art, the technical scheme provided by the invention has the following remarkable effects:
(1) In view of the complex algorithms and low recognition accuracy of prior-art motion pattern recognition, the method computes only simple statistics such as the mean and variance of each window's acceleration data over a period of time and uses only a simple decision tree classification model for recognition, so the time complexity is low, the algorithm is simple, and recognition is fast; at the same time, the invention extracts and selects the features most useful for classification and then classifies with a decision tree model based on probability statistics, ensuring high recognition accuracy.
(2) The method collects acceleration with a sports bracelet containing an embedded acceleration sensor and can analyze and recognize five motion modes without any other hardware, so the equipment is simple and easy to obtain and the experiment cost is reduced; when constructing the experimental samples, the fifty-percent overlap between adjacent windows preserves the continuity of human motion and ensures the accuracy of the experimental results.
Drawings
FIG. 1 is a line graph of an acceleration processing sample of the present invention.
Detailed Description
For a further understanding of the invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings and examples.
Example 1
The human body motion pattern recognition method based on the sports bracelet specifically comprises the following steps:
step one, acquiring acceleration data of human body movement by utilizing a motion bracelet:
the bracelet is directly worn on the wrist of the right hand, the switch is turned on, and the bracelet begins to record acceleration data generated in the motion process of the human body and is stored in the built-in SD card in a text format. The system only identifies five set actions (walking, running, badminton playing, table tennis playing and rowing), namely, data acquisition is only carried out on the five actions. Acceleration data of other actions included in the obtained data are removed before operation, given data only comprise the five kinds of acceleration data, and other actions are not processed. This embodiment only needs the motion bracelet as experimental device, need not other hardware support alright analysis and discernment five kinds of motion modes, and the equipment is simple easily obtained, has reduced the experiment cost.
Step two, segmenting the data collected in step one and selecting a suitable window width:
the acceleration sensor in the bracelet samples 200 times per second and records the acceleration a in the X, Y and Z directions in the movement process x, ,a y, ,a z, The characteristics of the five motion modes are comprehensively analyzed, the time required for completing each action is selected to be 4 seconds as the width of a window, and the acceleration data of each window in three directions is used as sample data to be processed. The method of overlapping fifty percent of adjacent windows is adopted to ensure the continuity of the motion of the human body (see figure 1) and ensure the accuracy of the experimental result.
Step three, performing filtering processing on the acceleration data in three directions of each window:
the signals generated by the motion of the human body are basically low-frequency signals, and in order to filter high-frequency interference signals, the embodiment adopts a first-order low-pass filtering algorithm to perform filtering processing on acceleration signals in three directions. And performing first-order low-pass filtering algorithm processing on the acceleration data in the X, Y and Z directions of each window. The first order low pass filtering algorithm is shown in equation (1):
wherein, F (n) is the current filtering output value; m is a filter systemA number (usually a number greater than 0 and less than 1) has a large influence on the filtering effect;the sampling value of the nth time; f (n-1) is the last filtering output value. According to the formula, the acceleration data of the current sample in the X direction, the Y direction and the Z direction are respectively substituted to obtain the filtered acceleration data.
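The recursion of formula (1) can be sketched in Python (a minimal illustration; the seed value for the first output and the sample sequence are assumptions, since the patent does not specify initialization):

```python
def low_pass(samples, m):
    """First-order low-pass filter: F(n) = m*X(n) + (1 - m)*F(n-1),
    with filtering coefficient 0 < m < 1 and X(n) the n-th sampled value."""
    filtered = []
    prev = samples[0]  # seed F(0) with the first sample (an assumption)
    for x in samples:
        # current output blends the new sample with the previous output
        prev = m * x + (1 - m) * prev
        filtered.append(prev)
    return filtered

# A small m suppresses rapid changes: an alternating signal is pulled
# toward its average rather than jumping between 0 and 1.
noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
smooth = low_pass(noisy, m=0.3)
```

With m close to 1 the filter tracks the raw signal closely; with m close to 0 it responds slowly, which is why the choice of coefficient strongly influences the filtering effect.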
Step four, extracting features from the acceleration data filtered in step three:

After each sample's data is filtered, features are extracted from each filtered acceleration signal. The descriptions and calculation formulas of all extracted features are as follows:

S1, the mean values ā_x, ā_y, ā_z of the accelerations in the three directions reflect the average state of the signals in the X, Y, and Z directions, and thus the intensity of human motion in each direction. They are calculated with formulas (2), (3), and (4):

ā_x = (1/n)·Σ_{k=1..n} a_{x,k} (2)
ā_y = (1/n)·Σ_{k=1..n} a_{y,k} (3)
ā_z = (1/n)·Σ_{k=1..n} a_{z,k} (4)

where a_{x,k}, a_{y,k}, a_{z,k} are the X-, Y-, and Z-axis acceleration signals at the k-th sampling point in a window, and n is the number of sampling points per window. In this embodiment, n = 200.

S2, the mean value ā of the resultant acceleration reflects the overall intensity of a motion mode. It is calculated with formulas (5) and (6):

a_k = sqrt(a_{x,k}² + a_{y,k}² + a_{z,k}²) (5)
ā = (1/n)·Σ_{k=1..n} a_k (6)

where a_k in formula (5) is the magnitude of the three-axis resultant acceleration at the k-th sampling point in the window; substituting a_k into formula (6) gives the mean resultant acceleration ā.

S3, the acceleration variances and standard deviations in the three directions: the variances σ²_x, σ²_y, σ²_z are calculated with formulas (7), (8), and (9):

σ²_x = (1/n)·Σ_{k=1..n} (a_{x,k} - ā_x)² (7)
σ²_y = (1/n)·Σ_{k=1..n} (a_{y,k} - ā_y)² (8)
σ²_z = (1/n)·Σ_{k=1..n} (a_{z,k} - ā_z)² (9)

Taking the square roots of σ²_x, σ²_y, σ²_z gives the acceleration standard deviations σ_x, σ_y, σ_z.

S4, the resultant acceleration variance σ², calculated with formula (10):

σ² = (1/n)·Σ_{k=1..n} (a_k - ā)² (10)

S5, the acceleration peak-valley value a_pv, the difference between the maximum Max(a) and the minimum Min(a) of the acceleration:

a_pv = Max(a) - Min(a) (11)

Formula (11) gives the peak-valley value of the resultant acceleration and the peak-valley values of the accelerations in the three directions, respectively.

S6, the three pairwise correlation coefficients ρ(X,Y), ρ(X,Z), ρ(Y,Z) of the accelerations in the three directions, calculated with formulas (12), (13), and (14):

ρ(X,Y) = cov(X,Y)/(σ_x·σ_y) (12)
ρ(X,Z) = cov(X,Z)/(σ_x·σ_z) (13)
ρ(Y,Z) = cov(Y,Z)/(σ_y·σ_z) (14)

where cov(X,Y), cov(X,Z), cov(Y,Z) are the covariances of the accelerations in each pair of directions, calculated as:

cov(X,Y) = E(XY) - E(X)E(Y) (15)
cov(X,Z) = E(XZ) - E(X)E(Z) (16)
cov(Y,Z) = E(YZ) - E(Y)E(Z) (17)

where E(X) is the expectation of the X-direction acceleration over the 4 seconds, E(Y) the expectation of the Y-direction acceleration, E(Z) the expectation of the Z-direction acceleration, E(XY) the expectation of the product of the X- and Y-direction accelerations, E(XZ) the expectation of the product of the X- and Z-direction accelerations, and E(YZ) the expectation of the product of the Y- and Z-direction accelerations.
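The feature set of step four can be sketched per window as follows (a minimal NumPy illustration; the feature names and the tiny 4-sample window are assumptions for demonstration, not the patent's 200-sample data):

```python
import numpy as np

def window_features(ax, ay, az):
    """Per-window features: per-axis mean, variance, standard deviation and
    peak-valley value (formulas (2)-(4), (7)-(9), (11)), resultant-acceleration
    statistics (formulas (5), (6), (10)), and the pairwise correlation
    coefficients (formulas (12)-(17))."""
    axes = {"x": np.asarray(ax, float),
            "y": np.asarray(ay, float),
            "z": np.asarray(az, float)}
    f = {}
    for name, a in axes.items():
        f["mean_" + name] = a.mean()
        f["var_" + name] = a.var()
        f["std_" + name] = a.std()
        f["pv_" + name] = a.max() - a.min()
    # resultant acceleration per sampling point, formula (5)
    a_res = np.sqrt(axes["x"] ** 2 + axes["y"] ** 2 + axes["z"] ** 2)
    f["mean_res"] = a_res.mean()
    f["var_res"] = a_res.var()
    f["pv_res"] = a_res.max() - a_res.min()
    for p, q in (("x", "y"), ("x", "z"), ("y", "z")):
        # cov(P,Q) = E(PQ) - E(P)E(Q); rho = cov / (sigma_p * sigma_q)
        cov = (axes[p] * axes[q]).mean() - axes[p].mean() * axes[q].mean()
        f["rho_" + p + q] = cov / (axes[p].std() * axes[q].std())
    return f

# Tiny 4-sample window (assumed values): Y is exactly 2*X, so rho_xy = 1
feats = window_features([1, 2, 3, 4], [2, 4, 6, 8], [1, 0, 1, 0])
```

Applied to each 4-second window, this yields one feature vector per window, which becomes one sample for the classifier.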
Step five, screening the features extracted in step four with a feature selection algorithm and discarding features that do not contribute to the classification algorithm:

After all the above features are calculated for each sample, a feature selection algorithm screens out the useful ones and discards features that contribute little to the classification algorithm. This embodiment uses the decision-tree-based feature importance values (the feature_importances_ attribute) of the scikit-learn machine learning library to calculate the importance of all features, and 14 features are selected according to the results; Table 1 lists each feature's importance value. Because this embodiment computes only simple statistics such as the mean and variance of each window's acceleration data over a period of time and uses only a simple decision tree classification model for recognition, the time complexity is low, the algorithm is simple, and recognition is fast.
As the data in Table 1 show, the standard deviation of the Y-direction acceleration has the largest importance value; it represents the degree of left-right sway of different motions. The maximum Z-direction acceleration accurately distinguishes motions with larger and smaller up-down acceleration amplitudes. The minimum X-direction acceleration represents the minimum motion intensity in the horizontal direction. Likewise, the other 11 features play a crucial role in classifying the motion data in different respects. Finally, the 14 features in the table are chosen as the final sample features.
TABLE 1. Importance value of each feature

Feature | Importance value
Standard deviation of acceleration in Y direction | 0.533833
Maximum value of acceleration in Z direction | 0.238100
Minimum value of acceleration in X direction | 0.062283
Standard deviation of acceleration in X direction | 0.060749
Maximum value of acceleration in Y direction | 0.046991
Correlation coefficient of acceleration in Y and Z directions | 0.025611
Mean value of acceleration in X direction | 0.013072
Peak-valley value of acceleration in Y direction | 0.006507
Standard deviation of resultant acceleration | 0.005229
Minimum value of acceleration in Y direction | 0.002205
Mean value of acceleration in Y direction | 0.001984
Peak-valley value of resultant acceleration | 0.001984
Resultant acceleration | 0.000842
Correlation coefficient of acceleration in X and Y directions | 0.000611
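The selection step can be sketched with scikit-learn's decision-tree feature importances (a toy illustration on assumed synthetic data, not the patent's samples; the 0.01 threshold is likewise an assumption standing in for the patent's choice of the 14 highest-valued features):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Feature 0 separates the two classes perfectly; feature 1 is pure noise,
# so the fitted tree assigns essentially all importance to feature 0.
rng = np.random.default_rng(0)
X = np.column_stack([
    np.repeat([0.0, 10.0], 50),   # informative feature
    rng.normal(size=100),         # noise feature
])
y = np.repeat([0, 1], 50)

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
importances = tree.feature_importances_

# keep features whose importance exceeds a small threshold
selected = [i for i, imp in enumerate(importances) if imp > 0.01]
print(selected)  # [0]
```

Ranking the real window features by these importance values and keeping the top 14 reproduces the kind of selection summarized in Table 1.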
Step six, feeding the screened feature data into a classification model and tuning the model parameters to optimize the model:

First, training and test sample sets are made from the selected feature values: the corresponding features are computed in each sliding window, yielding one complete sample per window, and the acceleration data of the multiple windows forms the sample set. The sample set is divided proportionally into training samples and test samples. A suitable classification model is selected, and the model parameters are tuned repeatedly with the training samples to optimize the model's classification accuracy for the five motion modes.
Based on the number of samples and the characteristics of the sample features, a Decision Tree classification model is selected for classification. The performance of the decision tree model is evaluated, and the optimal model judged, by its accuracy on the test sample set. The decision tree algorithm is based mainly on probability statistics; it needs no normalization of the data and computes directly on the original data samples, which avoids distorting the data and makes the method robust. This embodiment calls the decision tree module of sklearn.tree in scikit-learn, passes the features of the training sample set as the X argument and the labels as the y argument, and trains with the model's fit method. After training, the decision tree model is exported as a picture or PDF with the Graphviz visualization tool, and the decision tree model is then instantiated for subsequent use.
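The training step can be sketched as follows with scikit-learn (synthetic stand-in features playing the role of two motion modes; the 75/25 split, max_depth, and cluster parameters are assumptions, since the patent does not state its exact split ratio or tree settings):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Two well-separated 14-dimensional clusters stand in for the window
# feature vectors of two motion modes (assumed data, not bracelet samples).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(60, 14)),
               rng.normal(3.0, 0.3, size=(60, 14))])
y = np.array([0] * 60 + [1] * 60)

# divide the sample set proportionally into training and test samples
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# fit the decision tree on the training samples and score on the test set
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Repeating the fit while varying parameters such as max_depth, and comparing test-set accuracy, is one concrete way to carry out the "tune parameters to optimize the model" step.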
The model trained by the method of this embodiment achieves a recognition accuracy above 97% for the five motion modes of walking, running, playing badminton, playing table tennis, and rowing. Recognition with a sports bracelet is convenient, simple, fast, and low-cost.
The invention and its embodiments have been described above schematically, and the description is not limiting; what is shown in the drawings is only one embodiment of the invention, and the actual structure is not limited to it. Therefore, any similar structures and embodiments designed by a person skilled in the art in light of this teaching, without inventive effort and without departing from the spirit of the invention, shall fall within the scope of protection of the invention.
Claims (6)
1. A human motion mode identification method based on a motion bracelet is characterized by comprising the following steps:
step one, acquiring acceleration data of human body movement using a sports bracelet;
step two, segmenting the data acquired in step one and selecting a suitable window width;
step three, filtering the acceleration data in the three directions of each window;
step four, extracting features from the acceleration data filtered in step three;
step five, screening the features extracted in step four with a feature selection algorithm and discarding features that do not contribute to the classification algorithm;
step six, feeding the screened feature data into a classification model and tuning the model parameters to optimize the model;
in the fourth step, the characteristic values are respectively extracted by using the following formulas:
s1, reflecting the characteristic value of the intensity of each direction when a human body moves: mean value of acceleration in three directionsCalculated by using the formulas (2), (3) and (4),
in the formula, a x,k ,a y,k ,a z,k Obtaining an acceleration signal of three axes of X, Y and Z for the kth sampling point in a window; n is the number of sampling points of each window;
S2, a feature value representing the overall intensity of a motion mode: the mean resultant acceleration, calculated by formulas (5) and (6):
a_k = sqrt(a_{x,k}² + a_{y,k}² + a_{z,k}²)    (5)
ā = (1/N) Σ_{k=1..N} a_k    (6)
where a_k in formula (5) is the magnitude of the three-axis resultant acceleration of the k-th sampling point in the window, and formula (6) gives the mean resultant acceleration ā;
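A sketch of the resultant-acceleration mean: each sample's magnitude is the Euclidean norm of the three axes, then the window mean is taken. The (a_x, a_y, a_z) tuple representation is an assumption for illustration:

```python
import math

def resultant_mean(window):
    """Mean of the resultant acceleration magnitude (formulas (5)-(6)).

    a_k = sqrt(ax^2 + ay^2 + az^2) for each sample; return its window mean.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in window]
    return sum(mags) / len(mags)
```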
S3, the acceleration variances and standard deviations in the three directions: the acceleration variances σ²_x, σ²_y, σ²_z are calculated by formulas (7), (8) and (9):
σ²_x = (1/N) Σ_{k=1..N} (a_{x,k} − ā_x)²    (7)
σ²_y = (1/N) Σ_{k=1..N} (a_{y,k} − ā_y)²    (8)
σ²_z = (1/N) Σ_{k=1..N} (a_{z,k} − ā_z)²    (9)
taking the square roots of the variances σ²_x, σ²_y, σ²_z gives the acceleration standard deviations σ_x, σ_y, σ_z;
S4, the resultant acceleration variance σ², calculated by formula (10):
σ² = (1/N) Σ_{k=1..N} (a_k − ā)²    (10)
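A minimal sketch of the variance features S3 and S4 (population variance over the window; the standard deviations are the square roots). The window representation is again an illustrative assumption:

```python
import math

def axis_variances(window):
    """Per-axis acceleration variances (formulas (7)-(9)).

    Returns [var_x, var_y, var_z]; std devs are their square roots.
    """
    n = len(window)
    variances = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / n
        variances.append(sum((v - mean) ** 2 for v in vals) / n)
    return variances

def resultant_variance(window):
    """Variance of the resultant acceleration magnitude (formula (10))."""
    mags = [math.sqrt(ax**2 + ay**2 + az**2) for ax, ay, az in window]
    mean = sum(mags) / len(mags)
    return sum((m - mean) ** 2 for m in mags) / len(mags)
```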
S5, the acceleration peak-valley value a_pv, the difference between the maximum value Max(a) and the minimum value Min(a) of the acceleration:
a_pv = Max(a) − Min(a)    (11)
formula (11) is applied to the resultant acceleration and to the accelerations in the three directions to obtain their respective peak-valley values;
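The peak-valley feature is a one-liner applied to any acceleration series in the window (each axis, or the resultant magnitude):

```python
def peak_valley(values):
    """Peak-valley value a_pv = Max(a) - Min(a), formula (11)."""
    return max(values) - min(values)
```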
S6, the three correlation coefficients of the accelerations in the three directions, ρ(X, Y), ρ(X, Z) and ρ(Y, Z), calculated by formulas (12), (13) and (14):
ρ(X, Y) = cov(X, Y) / (σ_x σ_y)    (12)
ρ(X, Z) = cov(X, Z) / (σ_x σ_z)    (13)
ρ(Y, Z) = cov(Y, Z) / (σ_y σ_z)    (14)
where cov(X, Y), cov(X, Z) and cov(Y, Z) are the covariances of the accelerations in each pair of directions, calculated as follows:
cov(X,Y)=E(XY)-E(X)E(Y) (15)
cov(X,Z)=E(XZ)-E(X)E(Z) (16)
cov(Y,Z)=E(YZ)-E(Y)E(Z) (17)
where E(X) is the expectation of the X-direction acceleration within the 4-second window, E(Y) the expectation of the Y-direction acceleration, E(Z) the expectation of the Z-direction acceleration, E(XY) the expectation of the product of the X- and Y-direction accelerations, E(XZ) the expectation of the product of the X- and Z-direction accelerations, and E(YZ) the expectation of the product of the Y- and Z-direction accelerations.
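A sketch of one pairwise correlation coefficient, assuming the standard Pearson form ρ(X, Y) = cov(X, Y) / (σ_x σ_y) together with the covariance identity cov(X, Y) = E(XY) − E(X)E(Y) from formula (15); the two-list interface is illustrative:

```python
def correlation(xs, ys):
    """Pearson correlation of two acceleration series over one window.

    cov(X, Y) = E(XY) - E(X)E(Y)  (formula (15));
    rho(X, Y) = cov(X, Y) / (sigma_x * sigma_y).
    """
    n = len(xs)
    ex = sum(xs) / n
    ey = sum(ys) / n
    exy = sum(x * y for x, y in zip(xs, ys)) / n
    cov = exy - ex * ey
    sx = (sum((x - ex) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - ey) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)
```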
2. The human motion pattern recognition method based on the sports bracelet of claim 1, characterized in that: in step one, after the motion data are obtained, acceleration data of other actions are removed, and only the acceleration data of the five motions of walking, running, badminton, table tennis and rowing are retained.
3. The human motion pattern recognition method based on the sports bracelet according to claim 1 or 2, characterized in that: in step two, the accelerations in the X, Y and Z directions are denoted a_x, a_y, a_z, and 4 seconds is chosen as the window width, with adjacent windows overlapping by fifty percent.
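The 4-second window with 50% overlap means each new window starts 2 seconds after the previous one. A minimal sketch; the sampling rate parameter is an assumption, since this excerpt does not state it:

```python
def sliding_windows(samples, rate_hz, window_s=4, overlap=0.5):
    """Split an acceleration stream into fixed-width overlapping windows.

    Claim 3 uses a 4-second window with adjacent windows overlapping 50%,
    so the window start advances by half a window each time.
    """
    width = int(window_s * rate_hz)
    step = int(width * (1 - overlap))
    windows = []
    for start in range(0, len(samples) - width + 1, step):
        windows.append(samples[start:start + width])
    return windows
```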
4. The human motion pattern recognition method based on the sports bracelet of claim 3, characterized in that: in step three, a first-order low-pass filtering algorithm is used to filter the acceleration signals in the three directions, as shown in formula (1).
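Formula (1) itself is not reproduced in this excerpt; a common first-order low-pass filter has the exponential form y[n] = α·x[n] + (1 − α)·y[n−1], sketched below. Both the form and the smoothing factor α = 0.2 are assumptions, not taken from the patent:

```python
def low_pass(samples, alpha=0.2):
    """First-order (exponential) low-pass filter, one axis at a time.

    y[n] = alpha * x[n] + (1 - alpha) * y[n-1]; the state is seeded with
    the first sample so a constant signal passes through unchanged.
    """
    out = []
    prev = samples[0]
    for x in samples:
        prev = alpha * x + (1 - alpha) * prev
        out.append(prev)
    return out
```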
5. The human motion pattern recognition method based on the sports bracelet of claim 4, characterized in that: in the fifth step, importance values of all the features are calculated by using a feature selection algorithm based on a decision tree, and 14 features are selected according to the importance values.
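Decision-tree-based feature selection as in claim 5 can be sketched with scikit-learn's impurity-based importances, keeping the 14 highest-ranked features. The use of scikit-learn and the synthetic feature matrix are assumptions for illustration only:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the real feature matrix: 200 windows x 20 candidate
# features, with labels that depend only on features 3 and 7.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = (X[:, 3] + X[:, 7] > 0).astype(int)

# Fit a tree, rank features by importance, keep the top k (the patent keeps 14).
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
k = 14
top_k = np.argsort(tree.feature_importances_)[::-1][:k]
X_selected = X[:, top_k]
```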
6. The human motion pattern recognition method based on the sports bracelet of claim 5, characterized in that: in step six, the feature set of the data in each sliding window is combined into one complete sample; the samples are divided into training samples and test samples, a decision tree classification model is selected for classification, and the model parameters are tuned repeatedly with the training samples to optimize the model.
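The train/test split and parameter tuning of claim 6 can be sketched as follows; scikit-learn, the tuned parameter grid, and the synthetic data are all assumptions, not details from the patent:

```python
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 300 window samples x 14 selected features, 5 motion classes.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 14))
y = rng.integers(0, 5, size=300)

# Split into training and test samples, then tune the tree on the training set.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      {"max_depth": [3, 5, 10, None]}, cv=3)
search.fit(X_tr, y_tr)
acc = search.score(X_te, y_te)   # held-out accuracy of the tuned model
```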
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010173918.1A CN111401435B (en) | 2020-03-13 | 2020-03-13 | Human body motion mode identification method based on motion bracelet |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111401435A CN111401435A (en) | 2020-07-10 |
CN111401435B true CN111401435B (en) | 2023-04-07 |
Family
ID=71432518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010173918.1A Active CN111401435B (en) | 2020-03-13 | 2020-03-13 | Human body motion mode identification method based on motion bracelet |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111401435B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113008231A (en) * | 2021-04-30 | 2021-06-22 | 东莞市小精灵教育软件有限公司 | Motion state identification method and system, wearable device and storage medium |
CN113627340B (en) * | 2021-08-11 | 2024-02-09 | 广东沃莱科技有限公司 | Method and equipment capable of identifying rope skipping mode |
CN113780159A (en) * | 2021-09-08 | 2021-12-10 | 广州偶游网络科技有限公司 | Touchdown motion recognition method and storage medium |
CN114241603B (en) * | 2021-12-17 | 2022-08-26 | 中南民族大学 | Shuttlecock action recognition and level grade evaluation method and system based on wearable equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018053912A1 (en) * | 2016-09-23 | 2018-03-29 | 上海葡萄纬度科技有限公司 | Method for real-time action recognition, and related bracelet and computing device |
CN108509897A (en) * | 2018-03-29 | 2018-09-07 | 同济大学 | A kind of human posture recognition method and system |
CN110245718A (en) * | 2019-06-21 | 2019-09-17 | 南京信息工程大学 | A kind of Human bodys' response method based on joint time-domain and frequency-domain feature |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10339371B2 (en) * | 2015-09-23 | 2019-07-02 | Goertek Inc. | Method for recognizing a human motion, method for recognizing a user action and smart terminal |
Non-Patent Citations (1)
Title |
---|
Human body motion recognition based on a triaxial acceleration sensor; Li Feng et al.; Journal of Computer Research and Development (《计算机研究与发展》), No. 03; full text * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111401435B (en) | Human body motion mode identification method based on motion bracelet | |
CN103970271B (en) | The daily routines recognition methods of fusional movement and physiology sensing data | |
CN108446733A (en) | A kind of human body behavior monitoring and intelligent identification Method based on multi-sensor data | |
Zhang et al. | Beyond the standard clinical rating scales: fine-grained assessment of post-stroke motor functionality using wearable inertial sensors | |
CN107157450A (en) | Quantitative estimation method and system are carried out for the hand exercise ability to patient Parkinson | |
CN108958482B (en) | Similarity action recognition device and method based on convolutional neural network | |
CN112464738B (en) | Improved naive Bayes algorithm user behavior identification method based on mobile phone sensor | |
CN108681685A (en) | A kind of body work intension recognizing method based on human body surface myoelectric signal | |
Shin et al. | Korean sign language recognition using EMG and IMU sensors based on group-dependent NN models | |
Jung et al. | Deep neural network-based gait classification using wearable inertial sensor data | |
CN114358194A (en) | Gesture tracking based detection method for abnormal limb behaviors of autism spectrum disorder | |
Beily et al. | A sensor based on recognition activities using smartphone | |
CN113974612A (en) | Automatic assessment method and system for upper limb movement function of stroke patient | |
CN117690583B (en) | Internet of things-based rehabilitation and nursing interactive management system and method | |
ud din Tahir et al. | Daily life log recognition based on automatic features for health care physical exercise via IMU sensors | |
Mekruksavanich et al. | Badminton activity recognition and player assessment based on motion signals using deep residual network | |
CN114881079A (en) | Human body movement intention abnormity detection method and system for wearable sensor | |
CN113095379A (en) | Human motion state identification method based on wearable six-axis sensing data | |
Saidani et al. | An efficient human activity recognition using hybrid features and transformer model | |
CN109271889A (en) | A kind of action identification method based on the double-deck LSTM neural network | |
Jitpattanakul et al. | Enhancing Sensor-Based Human Activity Recognition using Efficient Channel Attention | |
CN116747495A (en) | Action counting method and device, terminal equipment and readable storage medium | |
CN114916928B (en) | Human body posture multichannel convolutional neural network detection method | |
CN115067934A (en) | Hand motion function analysis system based on machine intelligence | |
Khokhlova et al. | Kinematic covariance based abnormal gait detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||