Detailed Description
The preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of a feature extraction method of the present invention. The invention provides a human motion feature extraction method, which sequentially comprises the following steps:
s101, preprocessing is carried out, wherein the preprocessing comprises the steps of converting the three-dimensional gyroscope components (three-axis gyroscope signals) in the acquired data queue into a one-dimensional first data queue measuring the activity amplitude of the human body, and converting the three-dimensional acceleration components (three-axis accelerometer signals) in the acquired data queue into a one-dimensional second data queue measuring the beat of the periodic motion of the human body;
s102, searching the beginning and the end of a specific motion segment in the first data queue;
s103, searching the beat chain in the specific motion segment in the second data queue, and providing average description of the beat chain;
s104, extracting basic action characteristics;
s105, judging whether the quantity of the extracted and stored basic motion characteristics reaches the set required quantity; if so, generating the description of the basic motion; otherwise, carrying out sliding processing, returning to the step of finding the beginning and the end of the specific motion segment in the first data queue, and processing in a loop.
In the present invention, the specific motion segment refers to the longest span among the plurality of motion segments found in the first data queue.
In the present invention, the description of the basic motion of the motion includes a mean value and a mean square value of each dimensional component of the vector value group of the basic motion features.
In the present invention, the three-dimensional gyroscope components in the collected data queue are converted into a one-dimensional first data queue measuring the human activity amplitude as follows: for each sequence point, the magnitude of the three-dimensional gyroscope component is computed at that point and at each historical point before it, and these magnitudes are then summed and averaged.
In the present invention, the three-dimensional acceleration components in the acquired data queue are converted into a one-dimensional second data queue measuring the beat of the periodic motion of the human body as follows: for each sequence point, the three acceleration components are summed.
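As a minimal sketch, the two conversions above can be implemented as follows. The function and variable names are illustrative, not the patent's; the gyroscope channel is reduced by averaging the vector magnitudes of all historical points, following the description of step S201 later in this section.

```python
import math

def preprocess(window_data):
    """window_data: list of 6-tuples (ax, ay, az, gx, gy, gz), FIFO order."""
    motion_level = []   # 1-D queue reflecting human activity amplitude
    motion_clock = []   # 1-D queue reflecting the beat of periodic motion
    mag_sum = 0.0
    for k, (ax, ay, az, gx, gy, gz) in enumerate(window_data, start=1):
        mag_sum += math.sqrt(gx * gx + gy * gy + gz * gz)  # gyro vector magnitude
        motion_level.append(mag_sum / k)   # mean magnitude over all points so far
        motion_clock.append(ax + ay + az)  # sum of the three accel components
    return motion_level, motion_clock
```

Each element of `motion_level` is then compared against the motion/stillness threshold described below.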
Specifically, step S101 further includes:
the processor continuously collects synchronous data from the gyroscope and the accelerometer, and stores the collected data in a FIFO data queue, namely windowData, until windowData is filled; it should be noted that the length of the FIFO data queue is selected in relation to the sampling frequency of the sensor: for example, for a sampling frequency of 25 Hz the length of the FIFO data queue should be no less than 200, and for a sampling frequency of 50 Hz no less than 400; in other words, one FIFO data queue holds about 8 seconds of sampled data. Since the period of regular human motion is about 1 second and generally does not exceed 1.5 seconds, one FIFO data queue can thus capture motion segment waveform data containing a plurality of beats (waveform periods).
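The sizing rule above amounts to queue length = sampling frequency × 8 s. A sketch using a fixed-length FIFO buffer (the constant names are illustrative):

```python
from collections import deque

SAMPLE_RATE_HZ = 25    # assumed sensor sampling frequency
WINDOW_SECONDS = 8     # the queue holds about 8 s of data
QUEUE_LEN = SAMPLE_RATE_HZ * WINDOW_SECONDS  # 200 at 25 Hz, 400 at 50 Hz

# A deque with maxlen discards the oldest sample automatically (FIFO).
window_data = deque(maxlen=QUEUE_LEN)
```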
A 1-dimensional data queue motionLevel reflecting the activity amplitude of the human body is constructed from the 3-dimensional gyroscope components in the six-dimensional FIFO data queue windowData, and a critical value between human motion and stillness, motionLevelthreshold, equal to 1, is specified for the data in motionLevel. If a piece of data in motionLevel is higher than motionLevelthreshold, the person is considered to be in motion during that period of time; otherwise, the person is considered to be stationary during that period of time;
and constructing a 1-dimensional beat data queue motionClock by using a 3-dimensional accelerometer component in a six-dimensional FIFO data queue windowData, wherein the change beat of the motionClock reflects the change beat of the periodic motion of the human body.
Step S102 further includes searching the current motion activity index queue according to the motion activity index threshold motionLevelthreshold, recording the start position and the end position of each small segment (i.e. the sequence number of the corresponding data queue) whose activity index exceeds motionLevelthreshold, and searching a small segment with the largest sequence span from these small segments as the specific motion segment to be processed.
Step S103 further includes processing according to the start position and the end position of the specific motion segment obtained in step S102 and the beat data queue motionClock obtained in step S101. Besides the average description of the beat chain described above, the processing result also provides the number of beats and the start and end positions of the beats. The average description of a beat chain refers to the average description of the plurality of beats in the chain; more specifically, it is obtained by feature extraction based on wavelet analysis. In this embodiment, the feature extraction based on wavelet analysis includes, for each sequence point in each beat, normalizing the deviation relative to the variance. In this embodiment, the beat chain is processed with a secondary clustering analysis method: the first-level clustering follows the C-means algorithm idea and mainly uses a difference comparison technique in the classification process, while the second-level clustering also follows the C-means algorithm idea and mainly uses a similarity comparison technique in the classification process.
In step S104, wavelet analysis is performed on each dimensional component of the vector value group; by reducing the computational dimensionality of the wavelet analysis, the computational load can be greatly reduced. A similarity comparison based on basic action features is employed, and the basic action feature standard is dynamically updated as the basic action feature queue is populated. The similarity comparison between the basic action features and the basic action feature standard is performed under a suspicion mechanism: when the suspicion level reaches a specified degree, the basic action feature queue and the basic action feature standard are emptied. The similarity comparison between the basic action features and the basic action feature standard adopts a multi-layer comparison, which includes a similarity comparison between the average descriptions of the beat chains.
In step S105, the set required quantity is, for example, 20.
Further preferred embodiments of the present invention are: the process of searching for the moving segment and the static segment in the first data queue specifically comprises the following steps:
firstly, setting the serial number of a motion segment as 0 and the serial number of a static segment as 0;
then, sequentially comparing the data of the first data queue with the human motion and rest critical values according to the sequence from front to back:
for the 1st element of the first data queue: when its value is greater than or equal to the critical value between human motion and stillness, the motion segment number is incremented by 1, the number 1 is stored as the start position of the motion segment pointed to by the motion segment number, and the value of the 2nd element is examined next; if the value of the 2nd element is also greater than or equal to the critical value, processing moves on; if the value of the 2nd element is smaller than the critical value, the number 1 is stored as the end position of the motion segment pointed to by the motion segment number, the length of the motion segment is calculated and stored, the static segment number is incremented by 1, and the number 2 is stored as the start position of the static segment pointed to by the static segment number. When the value of the 1st element is smaller than the critical value, the static segment number is incremented by 1, the number 1 is stored as the start position of the static segment pointed to by the static segment number, and the value of the 2nd element is examined; if the value of the 2nd element is also smaller than the critical value, processing moves on; if the value of the 2nd element is greater than or equal to the critical value, the number 1 is stored as the end position of the static segment pointed to by the static segment number, the length of the static segment is calculated and stored, the motion segment number is incremented by 1, and the number 2 is stored as the start position of the motion segment pointed to by the motion segment number;
for the nth element of the first data queue (an element between the 1st and the last): when the value of the nth element is greater than or equal to the critical value and the value of the (n+1)th element is smaller than the critical value, the number n is stored as the end position of the motion segment pointed to by the motion segment number, the length of the motion segment is calculated and stored, the static segment number is incremented by 1, and the number n+1 is stored as the start position of the static segment pointed to by the static segment number; when the value of the nth element is smaller than the critical value and the value of the (n+1)th element is greater than or equal to the critical value, the number n is stored as the end position of the static segment pointed to by the static segment number, the length of the static segment is calculated and stored, the motion segment number is incremented by 1, and the number n+1 is stored as the start position of the motion segment pointed to by the motion segment number;
for the last element of the first data queue: when its value is greater than or equal to the critical value, the number of the last element is stored as the end position of the motion segment pointed to by the motion segment number, and the length of the motion segment is calculated and stored; when its value is smaller than the critical value, the number of the last element is stored as the end position of the static segment pointed to by the static segment number, and the length of the static segment is calculated and stored.
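The element-by-element scan above can be condensed into a single boundary-detection loop; the sketch below is logically equivalent but uses 0-based indices (the patent numbers elements from 1), and the function names are illustrative:

```python
def find_segments(queue, threshold):
    """Split `queue` into motion segments (value >= threshold) and
    static segments (value < threshold). Returns two lists of
    inclusive (start, end) index pairs, 0-based."""
    motion, static = [], []
    seg_start = 0
    moving = queue[0] >= threshold
    for n in range(1, len(queue)):
        now_moving = queue[n] >= threshold
        if now_moving != moving:            # segment boundary between n-1 and n
            (motion if moving else static).append((seg_start, n - 1))
            seg_start, moving = n, now_moving
    (motion if moving else static).append((seg_start, len(queue) - 1))
    return motion, static

def longest_motion_segment(queue, threshold):
    """The 'specific motion segment': the motion segment of largest span."""
    motion, _ = find_segments(queue, threshold)
    return max(motion, key=lambda se: se[1] - se[0], default=None)
```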
Further preferred embodiments of the present invention are: the processing of searching the information of the beat chain in the specific motion segment in the second data queue adopts a secondary clustering analysis method, the first-level clustering of the secondary clustering analysis method adopts a C-means algorithm for classification based on difference comparison, and the second-level clustering of the secondary clustering analysis method adopts a C-means algorithm for classification based on similarity comparison.
Further preferred embodiments of the present invention are: the calculation steps (i.e. the implementation process of the second-level clustering) of the feature extraction of the beat signals generated by the repetitive motion of the human body are as follows:
providing a beat {a_i}_{1≤i≤n} of the repetitive motion of the human body;
Calculate the beat expectation and variance:

E = (1/n) Σ_{i=1}^{n} a_i,  V = ((1/n) Σ_{i=1}^{n} (a_i − E)²)^{1/2};
construct {b_i}_{1≤i≤n},
setting the number of segmentation sections, sectionNum, and the segmentation scale, sectionMeasure,
and making the following calculations:
for i = 1, …, sectionNum − 1,
s_i = (1/sectionMeasure) Σ_{j=(i−1)·sectionMeasure+1}^{i·sectionMeasure} b_j
for i = sectionNum,

s_i = (1/(n − (sectionNum − 1)·sectionMeasure)) × Σ_{j=(sectionNum−1)·sectionMeasure+1}^{n} b_j;
{E, V, {s_i}_{1≤i≤sectionNum}} is called the feature of {a_i}_{1≤i≤n}, wherein {s_i}_{1≤i≤sectionNum} is the shape feature of {a_i}_{1≤i≤n}.
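The feature computation above can be sketched as follows. Note one assumption: the construction of {b_i} is not spelled out here, so the sketch takes b_i = (a_i − E)/V, one reading of the "normalization of the deviation relative to the variance" mentioned earlier in this section; all names are illustrative.

```python
def beat_features(a, section_num, section_measure):
    """Compute {E, V, {s_i}} for a beat sequence `a` (list of floats).

    Assumes b_i = (a_i - E) / V (per-point deviation normalized by the
    variance term V); the final section absorbs any leftover points."""
    n = len(a)
    E = sum(a) / n
    V = (sum((x - E) ** 2 for x in a) / n) ** 0.5
    b = [(x - E) / V for x in a]           # assumed normalization
    s = []
    for i in range(1, section_num + 1):
        lo = (i - 1) * section_measure     # 0-based start of section i
        hi = i * section_measure if i < section_num else n
        s.append(sum(b[lo:hi]) / (hi - lo))
    return E, V, s
```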
Further preferred embodiments of the present invention are: the calculation steps of the similarity comparison between the human body repetitive motion beat signals are as follows:
set the tempo A to
The shape is characterized in that
Beat B is
The shape is characterized in that
Definition d
i}
1≤i≤sectionNumWherein
A similarity threshold, simiaritythreshold, is set and calculated as follows:
E^d = (1/sectionNum) Σ_{i=1}^{sectionNum} d_i,

V^d = ((1/sectionNum) Σ_{i=1}^{sectionNum} (d_i − E^d)²)^{1/2},
taking the similarity threshold to be not less than 0.1 and not more than 0.3;
when V^d ≤ similarityThreshold, {a_i}_{1≤i≤n} and {b_i}_{1≤i≤n} are similar;
when V^d > similarityThreshold, {a_i}_{1≤i≤n} and {b_i}_{1≤i≤n} are not similar.
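Assuming the comparison sequence {d_i} has already been formed from the two shape features, the similarity decision reduces to thresholding the spread of {d_i}; a minimal sketch with illustrative names:

```python
def beats_similar(d, similarity_threshold=0.2):
    """Decide similarity of two beats from their comparison sequence d.

    similarity_threshold is taken in [0.1, 0.3] per the text; 0.2 here
    is an arbitrary choice within that range."""
    m = len(d)
    e_d = sum(d) / m                                   # E^d
    v_d = (sum((x - e_d) ** 2 for x in d) / m) ** 0.5  # V^d
    return v_d <= similarity_threshold
```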
Fig. 2 is a flowchart of a feature extraction method according to an embodiment of the present invention. It substantially comprises the following steps:
s201, preprocessing, namely forming a three-dimensional vector aiming at each three-dimensional gyroscope component, adding the length of the three-dimensional gyroscope component and the lengths corresponding to all historical vectors arranged in front of the three-dimensional gyroscope component in a buffer data queue together, and then calculating an average value, wherein a calculation result is used as a value at a corresponding position in a first data queue, so that the first data queue reflecting the human motion amplitude in 1 dimension is constructed; and forming a three-dimensional vector aiming at each three-dimensional acceleration component, summing the three-dimensional acceleration components, and taking the calculation result as a value at a corresponding position in a second data queue, thus constructing a 1-dimensional second data queue reflecting the activity beat of the human body.
S202, finding out the starting position and the ending position of the specific motion segment in the first data queue, which comprises the following steps:
firstly, setting the serial number of a motion segment as 0 and the serial number of a static segment as 0;
then, sequentially comparing the data of the first data queue with the human motion and rest critical values according to the sequence from front to back:
for the 1st element of the first data queue: when its value is greater than or equal to the critical value between human motion and stillness, the motion segment number is incremented by 1, the number 1 is stored as the start position of the motion segment pointed to by the motion segment number, and the value of the 2nd element is examined next; if the value of the 2nd element is also greater than or equal to the critical value, processing moves on; if the value of the 2nd element is smaller than the critical value, the number 1 is stored as the end position of the motion segment pointed to by the motion segment number, the length of the motion segment is calculated and stored, the static segment number is incremented by 1, and the number 2 is stored as the start position of the static segment pointed to by the static segment number. When the value of the 1st element is smaller than the critical value, the static segment number is incremented by 1, the number 1 is stored as the start position of the static segment pointed to by the static segment number, and the value of the 2nd element is examined; if the value of the 2nd element is also smaller than the critical value, processing moves on; if the value of the 2nd element is greater than or equal to the critical value, the number 1 is stored as the end position of the static segment pointed to by the static segment number, the length of the static segment is calculated and stored, the motion segment number is incremented by 1, and the number 2 is stored as the start position of the motion segment pointed to by the motion segment number;
for the nth element of the first data queue (an element between the 1st and the last): when the value of the nth element is greater than or equal to the critical value and the value of the (n+1)th element is smaller than the critical value, the number n is stored as the end position of the motion segment pointed to by the motion segment number, the length of the motion segment is calculated and stored, the static segment number is incremented by 1, and the number n+1 is stored as the start position of the static segment pointed to by the static segment number; when the value of the nth element is smaller than the critical value and the value of the (n+1)th element is greater than or equal to the critical value, the number n is stored as the end position of the static segment pointed to by the static segment number, the length of the static segment is calculated and stored, the motion segment number is incremented by 1, and the number n+1 is stored as the start position of the motion segment pointed to by the motion segment number;
for the last element of the first data queue: when its value is greater than or equal to the critical value, the number of the last element is stored as the end position of the motion segment pointed to by the motion segment number, and the length of the motion segment is calculated and stored; when its value is smaller than the critical value, the number of the last element is stored as the end position of the static segment pointed to by the static segment number, and the length of the static segment is calculated and stored.
And S203, searching the information of the beat chain in the specific motion segment in the second data queue according to the starting position and the ending position searched in the first data queue, wherein the information comprises the characteristic information of the beat waveform, the number of beats, and the starting position and the ending position of each beat.
The processing of searching the information of the beat chain in the specific motion segment in the second data queue adopts a secondary clustering analysis method, the first-level clustering of the secondary clustering analysis method adopts a C-means algorithm for classification based on difference comparison, and the second-level clustering of the secondary clustering analysis method adopts a C-means algorithm for classification based on similarity comparison.
And S204, extracting waveform characteristics of the triaxial accelerometer signal segment and the triaxial gyroscope signal segment which are synchronous with the beat from the sensor data buffer queue according to the starting position and the ending position of each beat of the information of the beat chain to serve as basic action characteristics, and if the basic action characteristics are extracted for the first time, storing the basic action characteristics as a first element of the characteristic queue and using the basic action characteristics as a characteristic standard of the basic action.
The method for extracting the basic action features extracts the features of a six-dimensional data sequence segment that is synchronous with a beat signal and is composed of the triaxial acceleration signal and the triaxial gyroscope signal, and specifically comprises the following steps:
calculating each dimension of the six-dimensional data sequence fragment to obtain the expectation and variance of each dimension; reconstructing a six-dimensional data sequence segment according to the expectation and the variance and obtaining a data floating proportion matrix; longitudinally dividing and calculating the data floating proportion matrix according to set parameters including the number of the division sections and the division scale to obtain a data floating proportion description sequence; obtaining basic action characteristics marked by corresponding movement beats according to expectation and variance of each dimension of the six-dimensional data sequence segment and a data floating proportion description sequence; the number of the segmentation sections is used for setting the number of the sections longitudinally segmented by the data floating proportion matrix, the segmentation scale is used for setting the data length of each dimension segmentation section of the data floating proportion matrix, and the number of the segmentation sections is set to be 3-10 sections.
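The steps above can be sketched as follows. One assumption is made explicit: the "data floating proportion" entry is taken to be (x − E)/V per point, since its exact construction is not spelled out here; the function and variable names are illustrative, not the patent's.

```python
def action_feature(unit_data, section_num, section_measure):
    """unit_data: 6 rows (accX, accY, accZ, gyroX, gyroY, gyroZ) x l columns.

    For each dimension: expectation E, variance term V, and the rough-shape
    sequence of the (assumed) floating-proportion row (x - E) / V.
    Returns [E, V, shape...] concatenated over the 6 dimensions."""
    feature = []
    for row in unit_data:
        l = len(row)
        E = sum(row) / l
        V = (sum((x - E) ** 2 for x in row) / l) ** 0.5
        corrected = [(x - E) / V for x in row]   # assumed floating proportion
        shape = []
        for i in range(1, section_num + 1):
            lo = (i - 1) * section_measure       # 0-based start of block i
            # the last block absorbs any leftover columns
            hi = i * section_measure if i < section_num else l
            shape.append(sum(corrected[lo:hi]) / (hi - lo))
        feature.extend([E, V] + shape)
    return feature
```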
The specific implementation mode comprises the following steps:
six-dimensional data sequence segment: unitData ∈ R^{6×l}; length of the six-dimensional data sequence segment: l;
expectation and variance for each dimension of the six-dimensional data sequence fragment:
accXE = (1/l) Σ_{j=1}^{l} unitData(1, j),  accYE = (1/l) Σ_{j=1}^{l} unitData(2, j),  accZE = (1/l) Σ_{j=1}^{l} unitData(3, j),

accXV = ((1/l) Σ_{j=1}^{l} (unitData(1, j) − accXE)²)^{1/2},  accYV = ((1/l) Σ_{j=1}^{l} (unitData(2, j) − accYE)²)^{1/2},  accZV = ((1/l) Σ_{j=1}^{l} (unitData(3, j) − accZE)²)^{1/2},

gyroXE = (1/l) Σ_{j=1}^{l} unitData(4, j),  gyroYE = (1/l) Σ_{j=1}^{l} unitData(5, j),  gyroZE = (1/l) Σ_{j=1}^{l} unitData(6, j),

gyroXV = ((1/l) Σ_{j=1}^{l} (unitData(4, j) − gyroXE)²)^{1/2},  gyroYV = ((1/l) Σ_{j=1}^{l} (unitData(5, j) − gyroYE)²)^{1/2},

gyroZV = ((1/l) Σ_{j=1}^{l} (unitData(6, j) − gyroZE)²)^{1/2},
wherein:
accX indicates that the quantity relates to the X-axis component of the three-dimensional acceleration data;
accY indicates that the quantity relates to the Y-axis component of the three-dimensional acceleration data;
accZ indicates that the quantity relates to the Z-axis component of the three-dimensional acceleration data;
gyroX indicates that the quantity relates to the X-axis component of the three-dimensional gyroscope data;
gyroY indicates that the quantity relates to the Y-axis component of the three-dimensional gyroscope data;
gyroZ indicates that the quantity relates to the Z-axis component of the three-dimensional gyroscope data;
accXE denotes the expectation of the X-axis component of the three-dimensional acceleration data;
the expectations and variances of the other axis components are denoted in the same way as accXE above and are not repeated.
Obtaining the data floating proportion matrix unitDataCorrection:
first, the number of segmentation sections, sectionNum, and the segmentation scale, sectionMeasure, are set;
subsequently, unitDataCorrection is chunked along the row direction: the widths of the first sectionNum − 1 blocks are all equal to the segmentation scale, while the width of the last block does not necessarily reach the segmentation scale exactly.
For the first line of data of unitDataCorrection, the following calculation is made:
for i = 1, …, sectionNum − 1,

accXRoughShape(i) = (1/sectionMeasure) × Σ_{j=(i−1)·sectionMeasure+1}^{i·sectionMeasure} unitDataCorrection(1, j);
for i = sectionNum,

accXRoughShape(i) = (1/(l − (sectionNum − 1)·sectionMeasure)) × Σ_{j=(i−1)·sectionMeasure+1}^{l} unitDataCorrection(1, j);
accXRoughShape(i) represents the shape description sequence, on the X axis, of the data floating proportion matrix corresponding to the three-dimensional acceleration data sequence segment; the other shape description sequences are represented in the same way as accXRoughShape(i) and are not repeated here.
Similar processing is then performed for the other rows of unitDataCorrection.
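The piecewise averaging described by the two formulas above can be sketched as follows (an illustrative Python sketch; `unit_row` stands for one row of unitDataCorrection, and the 0-based slicing mirrors the 1-based summation bounds):

```python
def rough_shape(unit_row, section_num, section_measure):
    """Piecewise mean of one row of the data floating proportion matrix.

    The first section_num-1 blocks each span section_measure samples;
    the last block takes whatever remains of the row (its width need
    not equal section_measure). Mirrors accXRoughShape(i) above.
    """
    l = len(unit_row)
    shape = []
    for i in range(1, section_num + 1):
        start = (i - 1) * section_measure           # 1-based (i-1)*sectionMeasure + 1
        end = i * section_measure if i < section_num else l
        block = unit_row[start:end]
        shape.append(sum(block) / len(block))
    return shape
```

For instance, `rough_shape(row, 3, 2)` on a six-sample row averages samples in pairs, yielding three shape values.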
Finally, the basic action feature actionFeature of the motion is obtained:
<math>
<mrow>
<mi>actionFeature</mi>
<mo>=</mo>
<mfenced open='(' close=')'>
<mtable>
<mtr>
<mtd>
<mi>accXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>;</mo>
</mrow>
</math>
wherein:
accXRoughShape represents the shape description sequence of the X-axis component of the three-dimensional acceleration data;
accYRoughShape represents the shape description sequence of the Y-axis component of the three-dimensional acceleration data;
accZRoughShape represents the shape description sequence of the Z-axis component of the three-dimensional acceleration data;
gyroXRoughShape represents the shape description sequence of the X-axis component of the three-dimensional gyroscope data;
gyroYRoughShape represents the shape description sequence of the Y-axis component of the three-dimensional gyroscope data;
gyroZRoughShape represents the shape description sequence of the Z-axis component of the three-dimensional gyroscope data;
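The assembly of actionFeature from the per-axis statistics can be sketched as follows (a hypothetical helper; the patent specifies only the ordering of the components, not a concrete data layout):

```python
def build_action_feature(axis_stats):
    """Stack (E, V, rough-shape sequence) for each of the six axes
    into one flat feature vector, in the order used above:
    accX, accY, accZ, gyroX, gyroY, gyroZ.

    axis_stats maps an axis name to a tuple (E, V, rough_shape_list).
    """
    order = ["accX", "accY", "accZ", "gyroX", "gyroY", "gyroZ"]
    feature = []
    for axis in order:
        e, v, shape = axis_stats[axis]
        feature.extend([e, v, *shape])
    return feature
```

The resulting vector has 6 × (2 + sectionNum) entries, matching the column vector above.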
S205, judging whether the extracted basic action features are similar to the feature standard of the basic action; if so, go to step S206, and if not, go to step S209. The implementation of S205 is described in detail as follows:
When the program has extracted a group of signal change features from the vector-value data sequence segment generated by a basic action of the motion, it compares the extracted basic action features with the feature standard of the current basic action, and thereby judges whether the basic action represented by this group of signal change features belongs to the same type of action as the previous basic actions. Let actionFeature be a basic action feature of the motion extracted by the program:
<math>
<mrow>
<mi>actionFeature</mi>
<mo>=</mo>
<mfenced open='(' close=')'>
<mtable>
<mtr>
<mtd>
<mi>accXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
</mtable>
</mfenced>
</mrow>
</math>
And setting actionFeatureStd as a basic action feature standard for the motion;
<math>
<mrow>
<mi>actionFeatureStd</mi>
<mo>=</mo>
<mfenced open='(' close=')'>
<mtable>
<mtr>
<mtd>
<mi>accXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
</mtable>
</mfenced>
</mrow>
</math>
accXRoughShape represents the shape description sequence of the X-axis component of the three-dimensional acceleration data;
accYRoughShape represents the shape description sequence of the Y-axis component of the three-dimensional acceleration data;
accZRoughShape represents the shape description sequence of the Z-axis component of the three-dimensional acceleration data;
gyroXRoughShape represents the shape description sequence of the X-axis component of the three-dimensional gyroscope data;
gyroYRoughShape represents the shape description sequence of the Y-axis component of the three-dimensional gyroscope data;
gyroZRoughShape represents the shape description sequence of the Z-axis component of the three-dimensional gyroscope data;
Set a threshold for acceleration data shape similarity, accRoughShapeDifferThreshold, with 0.1 ≤ accRoughShapeDifferThreshold ≤ 0.3;
set a threshold for gyroscope data shape similarity, gyroRoughShapeDifferThreshold, with 0.2 ≤ gyroRoughShapeDifferThreshold ≤ 0.4;
set a feature similarity threshold, featureSimilarityEvaluateThreshold ∈ {3, 4};
initialize the feature similarity counter featureSimilarityEvaluate to 0.
The specific algorithm comprises the following steps:
A. A shape difference description is generated from the shapes of the basic action feature actionFeature and the basic action feature standard actionFeatureStd on each component:
<math>
<mrow>
<mi>actionFeatureDiffer</mi>
<mo>=</mo>
<mfenced open='(' close=')'>
<mtable>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accXRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accYRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accZRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroXRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroYRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroZRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>.</mo>
</mrow>
</math>
wherein:
accXRoughShapeDiffer represents the shape difference between the shape description sequences of the X-axis components of the two three-dimensional acceleration data;
accYRoughShapeDiffer represents the shape difference between the shape description sequences of the Y-axis components of the two three-dimensional acceleration data;
accZRoughShapeDiffer represents the shape difference between the shape description sequences of the Z-axis components of the two three-dimensional acceleration data;
gyroXRoughShapeDiffer represents the shape difference between the shape description sequences of the X-axis components of the two three-dimensional gyroscope data;
gyroYRoughShapeDiffer represents the shape difference between the shape description sequences of the Y-axis components of the two three-dimensional gyroscope data;
gyroZRoughShapeDiffer represents the shape difference between the shape description sequences of the Z-axis components of the two three-dimensional gyroscope data;
The following calculation is made:
when 1 ≤ i ≤ sectionNum,
accXRoughShapeDiffer(i)=actionFeature.accXRoughShape(i)-actionFeatureStd.accXRoughShape(i);
The algorithms for the other dimensions are analogous and are not repeated here.
B. Calculating the expectation and variance of the shape difference description on each component:
<math>
<mrow>
<mi>accXRoughShapeDifferE</mi>
<mo>=</mo>
<mfrac>
<mn>1</mn>
<mrow>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</mfrac>
<msubsup>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mrow>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msubsup>
<mi>accXRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>;</mo>
</mrow>
</math>
<math>
<mrow>
<mfenced open='' close=''>
<mtable>
<mtr>
<mtd>
<mi>accXRoughShapeDifferV</mi>
<mo>=</mo>
</mtd>
</mtr>
<mtr>
<mtd>
<mroot>
<mrow>
<mfrac>
<mn>1</mn>
<mrow>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</mfrac>
<msubsup>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mrow>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msubsup>
<msup>
<mrow>
<mo>(</mo>
<mi>accXRoughShapeDiffer</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>-</mo>
<mi>accXRoughShapeDifferE</mi>
<mo>)</mo>
</mrow>
<mn>2</mn>
</msup>
</mrow>
<mn>2</mn>
</mroot>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>;</mo>
</mrow>
</math>
The expectations and variances of the shape differences on the other dimensions are computed in the same way and are not repeated here.
C. Judging the degree of similarity between the extracted basic action features and the feature standard of the basic action of the motion:
if accXRoughShapeDifferV ≤ accRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if accYRoughShapeDifferV ≤ accRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if accZRoughShapeDifferV ≤ accRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if gyroXRoughShapeDifferV ≤ gyroRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if gyroYRoughShapeDifferV ≤ gyroRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if gyroZRoughShapeDifferV ≤ gyroRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if featureSimilarityEvaluate ≥ featureSimilarityEvaluateThreshold, the program considers the extracted basic action features similar to the feature standard of the basic action of the motion;
if featureSimilarityEvaluate < featureSimilarityEvaluateThreshold, the program considers the extracted basic action features dissimilar to the feature standard of the basic action of the motion.
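Steps A through C can be sketched in Python as follows (the threshold defaults are illustrative values within the stated ranges; `feature` and `standard` are assumed to map axis names to rough-shape lists):

```python
import math

def shape_differ_v(feature_shape, std_shape):
    """Variance (square-root form) of the element-wise shape
    difference between a feature and the standard, per steps A-B."""
    differ = [f - s for f, s in zip(feature_shape, std_shape)]
    e = sum(differ) / len(differ)
    return math.sqrt(sum((d - e) ** 2 for d in differ) / len(differ))

def is_similar(feature, standard, acc_thr=0.2, gyro_thr=0.3, vote_thr=4):
    """Step C: one vote per axis whose shape-difference variance is
    within its threshold; similar if the votes reach vote_thr."""
    votes = 0
    for axis in ["accX", "accY", "accZ"]:
        if shape_differ_v(feature[axis], standard[axis]) <= acc_thr:
            votes += 1
    for axis in ["gyroX", "gyroY", "gyroZ"]:
        if shape_differ_v(feature[axis], standard[axis]) <= gyro_thr:
            votes += 1
    return votes >= vote_thr
```

An identical feature and standard yield zero difference variance on every axis and are therefore judged similar.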
S206, storing the extracted basic action features in the queue of basic action features, and generating a new basic action feature standard from the existing basic action features and the extracted features.
The implementation of S206 is described in detail as follows:
If the program finds that the extracted basic action features are similar to the current feature standard of the basic action, it regenerates a new basic action feature standard from the extracted basic action features and the current feature standard of the basic action.
Let actionFeature be the basic action feature of the motion extracted by the program, and let actionFeatureStd be the basic action feature standard of the motion;
the algorithm is specifically as follows:
when 1 ≤ i ≤ sectionNum,
the algorithms for the other dimensions are as described above and will not be repeated here.
actionFeatureStd.accXRoughShape is the new basic action feature standard.
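The source elides the exact regeneration formula for the new standard; one plausible choice, assumed here purely for illustration, is the element-wise average of the extracted feature and the current standard:

```python
def update_standard(feature_shape, std_shape):
    """Regenerate the feature standard from the extracted feature and
    the current standard. NOTE: the patent text does not state the
    exact formula; an element-wise average is an assumption made
    here for illustration only."""
    return [(f + s) / 2.0 for f, s in zip(feature_shape, std_shape)]
```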
S207, judging whether the number of basic action features in the basic action feature queue reaches the set requirement; if so, go to step S208; otherwise, go to step S212.
S208, calculating the statistical characteristics of the basic action of the motion from the queue of basic action features, and storing them in flash memory as the description of the basic action of the motion.
The statistical analysis of the motion's basic action feature queue specifically comprises:
based on the feature extraction from the six-dimensional data sequence segments, a plurality of basic action features of a certain motion are obtained, forming the feature queue of the motion's basic action; the expectation and variance of the sequence formed by the data at each corresponding position of the basic action features are calculated; the generated expectations and variances form a knowledge point of the basic action of a certain motion; and a plurality of knowledge points form the knowledge base.
The specific implementation of extracting the knowledge point of a basic action from the basic action feature sequence comprises the following steps:
Set the basic action feature number actionFeatureNum, giving a plurality of basic action features: {actionFeature(k)}, 1 ≤ k ≤ actionFeatureNum,
Wherein:
<math>
<mrow>
<mi>actionFeature</mi>
<mrow>
<mo>(</mo>
<mi>k</mi>
<mo>)</mo>
</mrow>
<mo>=</mo>
<mfenced open='(' close=')'>
<mtable>
<mtr>
<mtd>
<mi>accXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>accZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>accZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroXV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroYV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZE</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<mi>gyroZV</mi>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mi>gyroZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>;</mo>
</mrow>
</math>
A knowledge point for this action can then be generated by statistical calculation:
<math>
<mrow>
<mi>actionKnowledge</mi>
<mo>=</mo>
<mfenced open='(' close=')'>
<mtable>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>accXEE</mi>
<mo>,</mo>
<mi>accXEV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>accXVE</mi>
<mo>,</mo>
<mi>accXVV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mrow>
<mo>(</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>,</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>V</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>accYEE</mi>
<mo>,</mo>
<mi>accYEV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>accYVE</mi>
<mo>,</mo>
<mi>accYVV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mrow>
<mo>(</mo>
<mi>accYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>,</mo>
<mi>accYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>V</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>accZEE</mi>
<mo>,</mo>
<mi>accZEV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>accZVE</mi>
<mo>,</mo>
<mi>accZVV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mrow>
<mo>(</mo>
<mi>accZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>,</mo>
<mi>accZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>V</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>gyroXEE</mi>
<mo>,</mo>
<mi>gyroXEV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>gyroXVE</mi>
<mo>,</mo>
<mi>gyroXVV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mrow>
<mo>(</mo>
<mi>gyroXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>,</mo>
<mi>gyroXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>V</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>gyroYEE</mi>
<mo>,</mo>
<mi>gyroYEV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>gyroYVE</mi>
<mo>,</mo>
<mi>gyroYVV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mrow>
<mo>(</mo>
<mi>gyroYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>,</mo>
<mi>gyroYRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>V</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>gyroZEE</mi>
<mo>,</mo>
<mi>gyroZEV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<mrow>
<mo>(</mo>
<mi>gyroZVE</mi>
<mo>,</mo>
<mi>gyroZVV</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
<mtr>
<mtd>
<msub>
<mrow>
<mo>{</mo>
<mrow>
<mo>(</mo>
<mi>gyroZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>,</mo>
<mi>gyroZRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>V</mi>
<mo>)</mo>
</mrow>
<mo>}</mo>
</mrow>
<mrow>
<mn>1</mn>
<mo>≤</mo>
<mi>i</mi>
<mo>≤</mo>
<mi>sec</mi>
<mi>tionNum</mi>
</mrow>
</msub>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>;</mo>
</mrow>
</math>
wherein,
<math>
<mrow>
<mi>accXEE</mi>
<mo>=</mo>
<mfrac>
<mn>1</mn>
<mi>actionFeatureNum</mi>
</mfrac>
<msubsup>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mi>actionFeatureNum</mi>
</msubsup>
<mi>actionFeature</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>accXE</mi>
<mo>;</mo>
</mrow>
</math>
<math>
<mrow>
<mi>accXEV</mi>
<mo>=</mo>
<mroot>
<mrow>
<mfrac>
<mn>1</mn>
<mi>actionFeatureNum</mi>
</mfrac>
<msubsup>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mi>actionFeatureNum</mi>
</msubsup>
<msup>
<mrow>
<mo>(</mo>
<mi>actionFeature</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>accXE</mi>
<mo>-</mo>
<mi>accXEE</mi>
<mo>)</mo>
</mrow>
<mn>2</mn>
</msup>
</mrow>
<mn>2</mn>
</mroot>
<mo>;</mo>
</mrow>
</math>
<math>
<mrow>
<mi>accXVE</mi>
<mo>=</mo>
<mfrac>
<mn>1</mn>
<mi>actionFeatureNum</mi>
</mfrac>
<msubsup>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mi>actionFeatureNum</mi>
</msubsup>
<mi>actionFeature</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>accXV</mi>
<mo>;</mo>
</mrow>
</math>
<math>
<mrow>
<mfenced open='' close=''>
<mtable>
<mtr>
<mtd>
<mi>accXVV</mi>
<mo>=</mo>
</mtd>
</mtr>
<mtr>
<mtd>
<mroot>
<mrow>
<mfrac>
<mn>1</mn>
<mi>actionFeatureNum</mi>
</mfrac>
<msubsup>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mi>actionFeatureNum</mi>
</msubsup>
<msup>
<mrow>
<mo>(</mo>
<mi>actionFeature</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>accXV</mi>
<mo>-</mo>
<mi>accXVE</mi>
<mo>)</mo>
</mrow>
<mn>2</mn>
</msup>
</mrow>
<mn>2</mn>
</mroot>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>;</mo>
</mrow>
</math>
When 1 ≤ k ≤ sectionNum, there is
<math>
<mrow>
<mfenced open='' close=''>
<mtable>
<mtr>
<mtd>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>k</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>=</mo>
</mtd>
</mtr>
<mtr>
<mtd>
<mfrac>
<mn>1</mn>
<mi>actionFeatureNum</mi>
</mfrac>
<msubsup>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mi>actionFeatureNum</mi>
</msubsup>
<mi>actionFeature</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>k</mi>
<mo>)</mo>
</mrow>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>,</mo>
</mrow>
</math>
<math>
<mrow>
<mfenced open='' close=''>
<mtable>
<mtr>
<mtd>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>k</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>V</mi>
<mo>=</mo>
</mtd>
</mtr>
<mtr>
<mtd>
<mroot>
<mrow>
<mfrac>
<mn>1</mn>
<mi>actionFeatureNum</mi>
</mfrac>
<munderover>
<mi>Σ</mi>
<mrow>
<mi>i</mi>
<mo>=</mo>
<mn>1</mn>
</mrow>
<mi>actionFeatureNum</mi>
</munderover>
<msup>
<mrow>
<mo>(</mo>
<mi>actionFeature</mi>
<mrow>
<mo>(</mo>
<mi>i</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>k</mi>
<mo>)</mo>
</mrow>
<mo>-</mo>
<mi>accXRoughShape</mi>
<mrow>
<mo>(</mo>
<mi>k</mi>
<mo>)</mo>
</mrow>
<mo>.</mo>
<mi>E</mi>
<mo>)</mo>
</mrow>
<mn>2</mn>
</msup>
</mrow>
<mn>2</mn>
</mroot>
</mtd>
</mtr>
</mtable>
</mfenced>
<mo>;</mo>
</mrow>
</math>
The algorithms for the other dimensions are as described above and will not be described further herein.
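The position-wise expectation/variance computation over the feature queue (e.g. accXEE, accXEV, and the per-section (E, V) pairs above) can be sketched as:

```python
import math

def knowledge_point(values):
    """(expectation, variance) pair over the values found at one
    position of the feature queue, e.g. accXE across all
    actionFeature(k). Variance is in square-root form, as above."""
    n = len(values)
    e = sum(values) / n
    v = math.sqrt(sum((x - e) ** 2 for x in values) / n)
    return e, v

def build_knowledge(feature_queue):
    """Position-wise (E, V) pairs over a queue of flat feature
    vectors; each column of the queue yields one knowledge entry."""
    return [knowledge_point(col) for col in zip(*feature_queue)]
```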
S209, the basic action suspicion counter is incremented by 1.
S210, judging whether the basic action suspicion counter reaches the suspicion threshold; if so, go to step S211; if not, go to step S212.
S211, sliding pre-processing: the queue of basic action features is emptied, the basic action feature standard is cleared, and the basic action suspicion counter is reset to zero.
S212, sliding processing: the unprocessed sensor data overwrite the processed data, newly acquired sensor data fill the remainder of the buffer queue, the first data queue and the second data queue are regenerated from the new buffered data queue, and the process goes to step S202.
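Step S212's sliding can be sketched as follows (the buffer layout and the `processed_count` parameter are assumptions for illustration; the patent does not fix a concrete queue implementation):

```python
from collections import deque

def slide(buffer, processed_count, new_samples, capacity):
    """Drop the already-processed samples from the front of the
    buffer, append newly acquired samples at the back, and cap the
    buffer at its fixed capacity."""
    buf = deque(buffer, maxlen=capacity)
    for _ in range(min(processed_count, len(buf))):
        buf.popleft()
    buf.extend(new_samples)
    return list(buf)
```

After the slide, the first and second data queues would be regenerated from the returned buffer, as step S212 describes.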
Fig. 3 is a block diagram of the intelligent wearable device according to the present invention. The invention provides an intelligent wearable device, for example a sports wristband, comprising a feature extraction module 301, a motion description library 303 and a sensing module 304. The feature extraction module 301 may establish the motion description library 303, by means of the sensing module 304, using the feature extraction method described above, where the motion description library 303 includes descriptions of at least one basic action of a motion. It is clear to one skilled in the art that the modules described herein can be implemented in hardware, or by means of software plus a necessary general hardware platform.
Referring to fig. 4, a feature extraction module embodiment of an intelligent wearable device is provided, which generally comprises: a first unit 401, configured to correspondingly implement the function of step S101 in fig. 1; a second unit 402, configured to correspondingly implement the function of step S102 in fig. 1; a third unit 403, configured to correspondingly implement the function of step S103 in fig. 1; a fourth unit 404, configured to correspondingly implement the function of step S104 in fig. 1; and a fifth unit 405, configured to correspondingly implement the function of step S105 in fig. 1. It will be clear to a person skilled in the art that the modules and/or units described herein may be implemented in hardware, or by means of software plus a necessary general hardware platform.
The invention provides a general algorithm for discovering periodic signals and extracting information, which comprises: searching for similar vector-value signal segments in a six-dimensional vector-value sequence formed by synchronized three-axis acceleration signals and three-axis gyroscope signals, extracting the features of those vector-value signal segments, and finally computing the corresponding statistical characteristics from a plurality of vector-value signal features. The algorithm is not limited to use on a wristband and can be embedded in many other systems.
It should be understood that the above embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same, and those skilled in the art can modify the technical solutions described in the above embodiments, or make equivalent substitutions for some technical features; and such modifications and substitutions are intended to be included within the scope of the appended claims.