CN103892840A - Intelligent wearing device and method for extracting human body motion features

Info

Publication number
CN103892840A
Authority
CN
China
Prior art keywords
motion
data queue
basic
basic action
Prior art date
Legal status
Granted
Application number
CN201410164185.XA
Other languages
Chinese (zh)
Other versions
CN103892840B (en)
Inventor
夏波
王志伟
Current Assignee
Shenzhen Love Technology Co Ltd
Original Assignee
SHENZHEN DEKAIRUI TECHNOLOGY Co Ltd
Priority date
Filing date
Publication date
Application filed by SHENZHEN DEKAIRUI TECHNOLOGY Co Ltd
Priority to CN201410164185.XA
Publication of CN103892840A
Application granted
Publication of CN103892840B
Active legal status (current)
Anticipated expiration

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to the field of measuring regular human body motion, and in particular to an intelligent wearing device and a method for extracting human body motion features. The intelligent wearing device comprises a feature extraction module, which includes a preprocessing unit, a first searching unit, a second searching unit, a feature extraction unit and a motion description generation unit. The preprocessing unit converts the three-dimensional gyroscope components in a collected data queue into a first data queue and the three-dimensional acceleration components into a second data queue; the first searching unit searches the first data queue for the start and end positions of a specific motion segment; the second searching unit searches the second data queue for the beat chain within the specific motion segment and provides an average description of the beat chain; and the feature extraction unit extracts basic action features. With this device and method, the algorithm can be effectively simplified so as to reduce the requirements on computing resources and power consumption.

Description

Intelligent wearable device and human motion feature extraction method
Technical Field
The present invention relates to an apparatus and a method for measuring regular human body motion, and more particularly to an apparatus and a method for extracting the features of regular motion by using sensors worn on the human body.
Background
In today's society, the quickening pace of life and growing work pressure have left more and more people in a sub-healthy state. People therefore pay increasing attention to their health and take various measures to improve it, such as adjusting their work and rest rhythm, eating reasonably, and exercising moderately. Among these measures, exercise is particularly important: proper exercise raises the body's metabolic level, shapes a good posture and helps to relieve negative emotions. With the development of science and technology, a series of electronic products for monitoring exercise have appeared. For example, Chinese patent CN200710097593.8 discloses a wristwatch-type acceleration sensing module for measuring the amount of exercise, comprising a microprocessor, an acceleration sensor, a timer, a database of step length versus hand-swing acceleration, and a display. The acceleration sensor senses the number of hand swings and the hand-swing acceleration of the user while moving, and the timer records the movement time. The microprocessor compares the received hand-swing acceleration with the curve of step length versus hand-swing acceleration stored in the database to obtain the corresponding step length, and then calculates the movement distance and speed from the step length, the number of hand swings and the movement time. Such products generally allow a fairly accurate measurement of the time spent, the distance covered and the energy consumed by a user during walking, running, swimming, climbing and similar activities. However, before using such a device, the user is often required to set its monitoring content manually so that the device can measure the intended movement accurately. The user easily forgets to switch the monitoring content, which makes the exercise data inaccurate; moreover, frequently having to set the monitoring content by hand feels troublesome, which results in a poor user experience. With the continuous development of sensing technology, nine-axis sensing modules integrating a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer have become commercially available. For example, US2012/0323520 discloses the use of machine learning and automatic recognition techniques in an intelligent wearable device to capture and analyze regular human body movements and report the amount of motion to the user. The adoption of such intelligent techniques, however, increases the demand on the computing power of the device and, correspondingly, on its power consumption.
Smart wearable devices such as motion wristbands are constrained by their small size: computing capacity is limited and the battery must last as long as possible. How to simplify the algorithms so as to reduce the demands on computing resources and power consumption, while improving the user experience as much as possible, has therefore always been a goal of ongoing effort.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide an intelligent wearable device and a method for extracting human body motion features that can effectively simplify the algorithm so as to reduce the requirements on computing resources and power consumption.
The technical solution adopted by the invention to solve this technical problem is a method for extracting human motion features, which comprises, in sequence, the following steps:
preprocessing, including converting the three-dimensional gyroscope component in the acquired data queue into a one-dimensional first data queue for measuring the activity amplitude of the human body, and converting the three-dimensional acceleration component in the acquired data queue into a one-dimensional second data queue for measuring the change beat of the periodic motion of the human body;
processing to find out the beginning and the end of a specific motion segment in the first data queue;
processing to find out the beat chain in the specific motion segment in the second data queue, and providing average description of the beat chain;
extracting basic action characteristics;
judging whether the number of extracted and stored basic motion features has reached the set required number; if so, generating the description of the basic motion of the motion; otherwise, performing sliding processing, returning to the step of finding the start and end of the specific motion segment in the first data queue, and continuing the loop.
The technical solution adopted by the invention to solve this technical problem also includes an intelligent wearable device comprising a module for completing the feature extraction of human motion, the module comprising:
the preprocessing unit is used for converting the three-dimensional gyroscope component in the acquired data queue into a one-dimensional first data queue for measuring the activity amplitude of the human body and converting the three-dimensional acceleration component in the acquired data queue into a one-dimensional second data queue for measuring the change beat of the periodic motion of the human body;
a first searching unit, configured to perform processing for searching for the start and end of a specific motion segment in the first data queue;
the second searching unit is used for searching the beat chain in the specific motion segment in the second data queue and providing average description of the beat chain;
a feature extraction unit for extracting a basic motion feature; and
a motion description generation unit for judging whether the number of extracted and stored basic motion features has reached a set required number and, if so, generating a description of the basic motion of the motion; otherwise, performing the sliding process and returning to the first searching unit to continue the loop.
The invention has the advantage that the data queue of the three-dimensional gyroscope components is converted into a one-dimensional first data queue measuring the amplitude of human activity, from which the specific motion segment is found; the data queue of the three-dimensional acceleration components is converted into a one-dimensional second data queue measuring the beat of the periodic motion of the human body, from which the beat chain of the specific motion segment is found. On this basis the basic motion features are extracted until the description of the basic motion of the motion has been generated, after which recognition can be carried out on the obtained description. In this way the algorithm can be effectively simplified, reducing the requirements on computing resources and power consumption.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a schematic diagram of a feature extraction method of the present invention.
Fig. 2 is a flowchart of a feature extraction method according to an embodiment of the present invention.
Fig. 3 is a block diagram of the intelligent wearable device according to the present invention.
Fig. 4 is a block diagram of a feature extraction module according to an embodiment of the present invention.
Detailed Description
The preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of a feature extraction method of the present invention. The invention provides a human motion feature extraction method, which sequentially comprises the following steps:
s101, preprocessing is carried out, wherein the preprocessing comprises the steps of converting three-dimensional gyroscope components (three-axis gyroscope signals) in the acquired data queue into a one-dimensional first data queue for measuring the activity amplitude of the human body, and converting three-dimensional acceleration components (three-axis accelerometer signals) in the acquired data queue into a one-dimensional second data queue for measuring the change beat of the periodic motion of the human body;
s102, searching the beginning and the end of a specific motion segment in the first data queue;
s103, searching the beat chain in the specific motion segment in the second data queue, and providing average description of the beat chain;
s104, extracting basic action characteristics;
S105, judging whether the number of extracted and stored basic motion features has reached the set required number; if so, generating the description of the basic motion of the motion; otherwise, performing sliding processing, returning to the step of finding the start and end of the specific motion segment in the first data queue, and continuing the loop.
In the present invention, the specific motion segment refers to the motion segment with the longest span among the plurality of motion segments found in the first data queue.
In the present invention, the description of the basic motion of the motion includes a mean value and a mean square value of each dimensional component of the vector value group of the basic motion features.
In the present invention, the processing of converting the three-dimensional gyroscope components in the collected data queue into a one-dimensional first data queue measuring the amplitude of human activity is as follows: for each sequence point, the three-dimensional gyroscope components of each historical point are first averaged respectively, and the resulting values are then summed and averaged.
In the present invention, the processing of converting the three-dimensional acceleration components in the acquired data queue into a one-dimensional second data queue measuring the beat of the periodic motion of the human body is as follows: for each sequence point, the three-dimensional acceleration components are summed.
Specifically, step S101 further includes:
the processor continuously collects synchronous data from the gyroscope and the accelerometer, and stores the collected data in a FIFO data queue, namely the windows data, so that the windows data is filled with the data; it should be noted that the length of the FIFO data queue is selected in relation to the sampling frequency of the sensor, for example, the length of the FIFO data queue should be no less than 200 for a sampling frequency of 25Hz, and the length of the FIFO data queue should be no less than 400 for a sampling frequency of 50Hz, in other words, one FIFO data queue can hold about 8 seconds of sampled data. By such design, for the situation that the period of regular motion of a general human body is about 1 second and the maximum time generally does not exceed 1.5 seconds, a motion segment waveform data containing a plurality of beats (waveform periods) can be captured by a FIFO data queue.
Constructing a 1-dimensional data queue motionLevel reflecting the activity amplitude of the human body by using a 3-dimensional gyroscope component in a six-dimensional FIFO data queue windowData, and specifying a critical value of motion and stillness of the human body as motionLevelthreshold equal to 1 for data in the motionLevel. If a certain piece of data in motionLevel is higher than motionLevelthreshold, the person is considered to be in motion in the period of time; otherwise, the person is considered to be stationary for that period of time;
and constructing a 1-dimensional beat data queue motionClock by using a 3-dimensional accelerometer component in a six-dimensional FIFO data queue windowData, wherein the change beat of the motionClock reflects the change beat of the periodic motion of the human body.
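By way of illustration only, the construction of motionLevel and motionClock from the six-axis FIFO queue could be sketched as follows. This is a minimal Python sketch; the sample layout (ax, ay, az, gx, gy, gz), the use of the gyroscope vector magnitude, and the running average over all historical points are assumptions drawn from the description of step S201 below, not a literal transcription of the implementation.

```python
import math

def preprocess(window_data):
    """Convert the six-axis FIFO queue windowData into the 1-D queues
    motionLevel (activity amplitude) and motionClock (motion beat).

    window_data: list of samples, each assumed to be (ax, ay, az, gx, gy, gz).
    """
    motion_level = []
    motion_clock = []
    running_sum = 0.0  # running sum of gyroscope vector magnitudes
    for k, (ax, ay, az, gx, gy, gz) in enumerate(window_data, start=1):
        # average the gyroscope magnitude over the current and all historical points
        running_sum += math.sqrt(gx * gx + gy * gy + gz * gz)
        motion_level.append(running_sum / k)
        # sum the acceleration components to form the beat queue
        motion_clock.append(ax + ay + az)
    return motion_level, motion_clock
```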
Step S102 further includes searching the current motion activity index queue according to the motion activity index threshold motionLevelthreshold, recording the start position and the end position of each small segment (i.e. the sequence number of the corresponding data queue) whose activity index exceeds motionLevelthreshold, and searching a small segment with the largest sequence span from these small segments as the specific motion segment to be processed.
Step S103 further includes processing according to the start position and end position of the specific motion segment obtained in step S102 and the beat data queue motionClock obtained in step S101. Besides the average description of the beat chain described above, the processing result of the beat chain also provides the number of beats and the start and end positions of the individual beats. A beat chain average description refers to an average description of the plurality of beats in the beat chain. More specifically, the average description of the plurality of beats is obtained by feature extraction based on wavelet analysis. In this embodiment, the feature extraction based on wavelet analysis includes, for each sequence point in each beat, a normalization of the deviation relative to the variance. In this embodiment, a two-level clustering analysis is adopted for processing the beat chain: the first-level clustering follows the idea of the C-means algorithm and mainly uses a difference comparison technique in the classification process, while the second-level clustering also follows the idea of the C-means algorithm and mainly uses a similarity comparison technique in the classification process.
In step S104, wavelet analysis is performed on each dimensional component of the vector value group; by reducing the computational dimensionality of the wavelet analysis in this way, the computational load can be greatly reduced. A similarity comparison based on basic action features is employed, and the basic action feature standard is dynamically updated as the basic action feature queue is filled. The similarity comparison between the basic action features and the basic action feature standard is performed under a suspicion mechanism: when the degree of suspicion reaches a specified level, the basic action feature queue and the basic action feature standard are cleared. The similarity comparison between the basic action features and the basic action feature standard uses a multi-layer comparison, which includes a similarity comparison between the average descriptions of the beat chains.
In step S105, the set required number is, for example, 20.
Further preferred embodiments of the present invention are: the process of searching for the moving segment and the static segment in the first data queue specifically comprises the following steps:
firstly, setting the serial number of a motion segment as 0 and the serial number of a static segment as 0;
then, sequentially comparing the data of the first data queue with the human motion and rest critical values according to the sequence from front to back:
for the 1 st element of the first data queue, when the value of the 1 st element is found to be more than or equal to the critical value of the motion and the rest of the human body, the number of the motion segment is added by 1, the number 1 is stored as the starting position of the motion segment pointed by the number of the motion segment, the value of the 2 nd element is searched next, and if the value of the 2 nd element is found to be more than or equal to the critical value of the motion and the rest of the human body, the exit is carried out; if the value of the 2 nd element is found to be smaller than the critical value of the motion and the stillness of the human body, the number 1 is stored as the end position of the motion segment indicated by the motion segment number, the length of the motion segment is calculated and stored, the number of the stillness segment is added by 1, and the number 2 is stored as the start position of the stillness segment indicated by the stillness segment number; when the value of the 1 st element is found to be smaller than the critical value of the motion and the stillness of the human body, the number of the stillness section is added by 1, the number 1 is stored as the starting position of the stillness section pointed by the number of the stillness section, the value of the 2 nd element is searched, and if the value of the 2 nd element is found to be smaller than the critical value of the motion and the stillness of the human body, the operation is exited; if the value of the 2 nd element is found to be more than or equal to the critical value of the motion and the stillness of the human body, the number 1 is stored as the end position of the stillness section pointed by the number of the stillness section, the length of the stillness section is calculated and stored, the number of the motion section is added by 1, and the number 2 is stored as the start position of the motion section pointed by the number of the motion section;
for the nth element of the first data queue, the nth element is an element between the 1 st element and the last element, when the value of the nth element is found to be greater than or equal to the critical value of human motion and stillness and the value of the (n + 1) th element is found to be less than the critical value of human motion and stillness, the number n is stored as the end position of the motion segment pointed by the motion segment number, the length of the motion segment is calculated and stored, the number of the stillness segment is added by 1, and the number n +1 is stored as the start position of the stillness segment pointed by the stillness segment number; when the value of the nth element is found to be smaller than the critical value of the motion and the stillness of the human body and the value of the (n + 1) th element is found to be larger than or equal to the critical value of the motion and the stillness of the human body, storing the number n as the end position of the stillness section pointed by the number of the stillness section, calculating and storing the length of the stillness section, adding 1 to the number of the motion section, and storing the number n +1 as the start position of the motion section pointed by the number of the motion section;
for the last element of the first data queue, when the value of the last element is found to be more than or equal to the critical value of the motion and the stillness of the human body, storing the number of the last element as the end position of the motion segment pointed by the number of the motion segment, and calculating and storing the length of the motion segment; and when the value of the last element is found to be smaller than the critical value of the motion and the stillness of the human body, storing the number of the last element as the end position of the stillness section pointed by the number of the stillness section, and calculating and storing the length of the stillness section.
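A condensed Python sketch of this segment search is given below for illustration; it records only the motion segments and returns the one with the largest span (the static-segment bookkeeping of the preceding paragraphs is omitted, and the default threshold of 1 follows the motionLevelthreshold value given above):

```python
def find_specific_motion_segment(motion_level, threshold=1.0):
    """Scan the 1-D activity queue and return [start, end] (1-based, inclusive)
    of the motion segment with the largest span, or None if there is none."""
    motion_segments = []
    in_motion = False
    for idx, value in enumerate(motion_level, start=1):
        if value >= threshold:
            if not in_motion:                  # a new motion segment begins here
                motion_segments.append([idx, idx])
                in_motion = True
            else:
                motion_segments[-1][1] = idx   # extend the current motion segment
        else:
            in_motion = False                  # a static segment (not recorded here)
    if not motion_segments:
        return None
    return max(motion_segments, key=lambda seg: seg[1] - seg[0])
```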
Further preferred embodiments of the present invention are: the processing of searching for the information of the beat chain in the specific motion segment in the second data queue adopts a two-level clustering analysis, in which the first-level clustering uses a C-means algorithm classifying by difference comparison and the second-level clustering uses a C-means algorithm classifying by similarity comparison.
Further preferred embodiments of the present invention are: the calculation steps (i.e. the implementation process of the second-level clustering) of the feature extraction of the beat signals generated by the repetitive motion of the human body are as follows:
Given a beat $\{a_i\}_{1\le i\le n}$ of the repetitive motion of the human body,

calculate the beat expectation and variance:

$E=\frac{1}{n}\sum_{i=1}^{n}a_i$,  $V=\sqrt{\frac{1}{n}\sum_{i=1}^{n}(a_i-E)^2}$;

construct $\{b_i\}_{1\le i\le n}$ with $b_i=\frac{a_i-E}{V}$;

set the number of segmentation sections sectionNum and the segmentation scale sectionMeasure (the data length of each section), and make the following calculations:

when $i=1,\dots,\mathrm{sectionNum}-1$,

$s_i=\frac{1}{\mathrm{sectionMeasure}}\sum_{j=(i-1)\times \mathrm{sectionMeasure}+1}^{i\times \mathrm{sectionMeasure}}b_j$;

when $i=\mathrm{sectionNum}$,

$s_i=\frac{1}{n-(i-1)\times \mathrm{sectionMeasure}}\sum_{j=(\mathrm{sectionNum}-1)\times \mathrm{sectionMeasure}+1}^{n}b_j$;

then $\{E,V,\{s_i\}_{1\le i\le \mathrm{sectionNum}}\}$ is called the feature of $\{a_i\}_{1\le i\le n}$, where $\{s_i\}_{1\le i\le \mathrm{sectionNum}}$ is the shape feature of $\{a_i\}_{1\le i\le n}$.
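A compact sketch of this beat feature calculation follows. Since the original formula image for the segmentation scale is not reproduced here, the sketch assumes sectionMeasure is taken as n // sectionNum, and that the beat is not constant so V > 0:

```python
import math

def beat_feature(a, section_num=5):
    """Return (E, V, shape) for a beat sequence a, following the formulas above."""
    n = len(a)
    E = sum(a) / n
    V = math.sqrt(sum((x - E) ** 2 for x in a) / n)   # assumed non-zero for a real beat
    b = [(x - E) / V for x in a]                      # normalized deviation sequence
    section_measure = n // section_num                # assumed segmentation scale
    shape = []
    for i in range(1, section_num + 1):
        start = (i - 1) * section_measure
        end = i * section_measure if i < section_num else n   # last section keeps the remainder
        block = b[start:end]
        shape.append(sum(block) / len(block))
    return E, V, shape
```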
Further preferred embodiments of the present invention are: the calculation steps of the similarity comparison between the human body repetitive motion beat signals are as follows:
set the tempo A to
Figure BDA0000494563250000101
The shape is characterized in that
Figure BDA0000494563250000102
Beat B is
Figure BDA0000494563250000103
The shape is characterized in that
Figure BDA0000494563250000104
Definition di}1≤i≤sectionNumWherein
Figure BDA0000494563250000105
A similarity threshold, simiaritythreshold, is set and calculated as follows:
<math> <mrow> <msup> <mi>E</mi> <mi>d</mi> </msup> <mo>=</mo> <mfrac> <mn>1</mn> <mrow> <mi>sec</mi> <mi>tionNum</mi> </mrow> </mfrac> <msubsup> <mi>&Sigma;</mi> <mn>1</mn> <mrow> <mi>sec</mi> <mi>tionNum</mi> </mrow> </msubsup> <msub> <mi>d</mi> <mi>i</mi> </msub> <mo>,</mo> </mrow> </math>
<math> <mrow> <msup> <mi>V</mi> <mi>d</mi> </msup> <mo>=</mo> <msup> <mrow> <mo>(</mo> <mfrac> <mn>1</mn> <mrow> <mi>sec</mi> <mi>tionNum</mi> </mrow> </mfrac> <msubsup> <mi>&Sigma;</mi> <mrow> <mi>i</mi> <mo>=</mo> <mn>1</mn> </mrow> <mrow> <mi>sec</mi> <mi>tionNum</mi> </mrow> </msubsup> <msup> <mrow> <mo>(</mo> <msub> <mi>d</mi> <mi>i</mi> </msub> <mo>-</mo> <msup> <mi>E</mi> <mi>d</mi> </msup> <mo>)</mo> </mrow> <mn>2</mn> </msup> <mo>)</mo> </mrow> <mfrac> <mn>1</mn> <mn>2</mn> </mfrac> </msup> <mo>,</mo> </mrow> </math>
taking the similarity threshold not less than 0.1 and not more than 0.3,
when V isdWhen the value is less than or equal to micromeritythreshold, { ai}1≤i≤nAnd { bi}1≤i≤nSimilarly;
when V isdWhen > micromeritythreshold, { ai}1≤i≤nAnd { bi}1≤i≤nAre not similar.
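The similarity comparison above reduces to a few lines of code; the sketch below assumes the two beats have already been reduced to their shape features (for example by the beat_feature sketch above), and picks 0.2 as the threshold simply because it lies inside the 0.1-0.3 range stated above:

```python
def beats_similar(shape_a, shape_b, similarity_threshold=0.2):
    """Decide whether two beats are similar by comparing their shape features."""
    m = len(shape_a)                                   # sectionNum
    d = [sa - sb for sa, sb in zip(shape_a, shape_b)]  # component-wise shape difference
    e_d = sum(d) / m
    v_d = (sum((x - e_d) ** 2 for x in d) / m) ** 0.5
    return v_d <= similarity_threshold                 # True: similar; False: not similar
```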
Fig. 2 is a flowchart of a feature extraction method according to an embodiment of the present invention. It substantially comprises the following steps:
S201, preprocessing: for each three-dimensional gyroscope component a three-dimensional vector is formed, its length is added to the lengths of all historical vectors preceding it in the buffer data queue, and the average is calculated; the result is used as the value at the corresponding position in the first data queue, thereby constructing the one-dimensional first data queue reflecting the human motion amplitude. For each three-dimensional acceleration component a three-dimensional vector is formed and its components are summed; the result is used as the value at the corresponding position in the second data queue, thereby constructing the one-dimensional second data queue reflecting the human activity beat.
S202, finding out the starting position and the ending position of the specific motion segment in the first data queue, which comprises the following steps:
firstly, setting the serial number of a motion segment as 0 and the serial number of a static segment as 0;
then, sequentially comparing the data of the first data queue with the human motion and rest critical values according to the sequence from front to back:
for the 1 st element of the first data queue, when the value of the 1 st element is found to be more than or equal to the critical value of the motion and the rest of the human body, the number of the motion segment is added by 1, the number 1 is stored as the starting position of the motion segment pointed by the number of the motion segment, the value of the 2 nd element is searched next, and if the value of the 2 nd element is found to be more than or equal to the critical value of the motion and the rest of the human body, the exit is carried out; if the value of the 2 nd element is found to be smaller than the critical value of the motion and the stillness of the human body, the number 1 is stored as the end position of the motion segment indicated by the motion segment number, the length of the motion segment is calculated and stored, the number of the stillness segment is added by 1, and the number 2 is stored as the start position of the stillness segment indicated by the stillness segment number; when the value of the 1 st element is found to be smaller than the critical value of the motion and the stillness of the human body, the number of the stillness section is added by 1, the number 1 is stored as the starting position of the stillness section pointed by the number of the stillness section, the value of the 2 nd element is searched, and if the value of the 2 nd element is found to be smaller than the critical value of the motion and the stillness of the human body, the operation is exited; if the value of the 2 nd element is found to be more than or equal to the critical value of the motion and the stillness of the human body, the number 1 is stored as the end position of the stillness section pointed by the number of the stillness section, the length of the stillness section is calculated and stored, the number of the motion section is added by 1, and the number 2 is stored as the start position of the motion section pointed by the number of the motion section;
for the nth element of the first data queue, the nth element is an element between the 1 st element and the last element, when the value of the nth element is found to be greater than or equal to the critical value of human motion and stillness and the value of the (n + 1) th element is found to be less than the critical value of human motion and stillness, the number n is stored as the end position of the motion segment pointed by the motion segment number, the length of the motion segment is calculated and stored, the number of the stillness segment is added by 1, and the number n +1 is stored as the start position of the stillness segment pointed by the stillness segment number; when the value of the nth element is found to be smaller than the critical value of the motion and the stillness of the human body and the value of the (n + 1) th element is found to be larger than or equal to the critical value of the motion and the stillness of the human body, storing the number n as the end position of the stillness section pointed by the number of the stillness section, calculating and storing the length of the stillness section, adding 1 to the number of the motion section, and storing the number n +1 as the start position of the motion section pointed by the number of the motion section;
for the last element of the first data queue, when the value of the last element is found to be more than or equal to the critical value of the motion and the stillness of the human body, storing the number of the last element as the end position of the motion segment pointed by the number of the motion segment, and calculating and storing the length of the motion segment; and when the value of the last element is found to be smaller than the critical value of the motion and the stillness of the human body, storing the number of the last element as the end position of the stillness section pointed by the number of the stillness section, and calculating and storing the length of the stillness section.
S203, searching for the information of the beat chain in the specific motion segment in the second data queue according to the start position and end position found in the first data queue; the information includes the feature information of the beat waveform, the number of beats, and the start and end positions of each beat.
The processing of searching for the information of the beat chain in the specific motion segment in the second data queue adopts a two-level clustering analysis, in which the first-level clustering uses a C-means algorithm classifying by difference comparison and the second-level clustering uses a C-means algorithm classifying by similarity comparison.
S204, according to the start position and end position of each beat in the information of the beat chain, extracting from the sensor data buffer queue the waveform features of the triaxial accelerometer signal segment and the triaxial gyroscope signal segment that are synchronous with the beat, to serve as the basic action features; if the basic action features are extracted for the first time, they are stored as the first element of the feature queue and used as the feature standard of the basic action.
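For illustration, the per-beat slicing of the sensor buffer described in S204 might look like the sketch below (the sample layout and the 1-based, inclusive beat positions are assumptions); the feature extraction applied to each resulting segment is sketched later in this section, after the actionFeature formula:

```python
def segment_for_beat(window_data, start, end):
    """Cut the six-axis samples synchronous with one beat out of the sensor buffer.

    window_data: list of (ax, ay, az, gx, gy, gz) samples; start and end are the
    1-based, inclusive beat positions found in step S203.
    Returns unitData as 6 rows (accX, accY, accZ, gyroX, gyroY, gyroZ) of length l.
    """
    samples = window_data[start - 1:end]              # samples belonging to this beat
    return [list(axis) for axis in zip(*samples)]     # transpose to one row per axis
```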
The extraction of the basic action features extracts the features of a six-dimensional data sequence segment, synchronous with the beat signal, composed of the triaxial acceleration signal and the triaxial gyroscope signal, and specifically comprises the following steps:
calculating each dimension of the six-dimensional data sequence fragment to obtain the expectation and variance of each dimension; reconstructing a six-dimensional data sequence segment according to the expectation and the variance and obtaining a data floating proportion matrix; longitudinally dividing and calculating the data floating proportion matrix according to set parameters including the number of the division sections and the division scale to obtain a data floating proportion description sequence; obtaining basic action characteristics marked by corresponding movement beats according to expectation and variance of each dimension of the six-dimensional data sequence segment and a data floating proportion description sequence; the number of the segmentation sections is used for setting the number of the sections longitudinally segmented by the data floating proportion matrix, the segmentation scale is used for setting the data length of each dimension segmentation section of the data floating proportion matrix, and the number of the segmentation sections is set to be 3-10 sections.
The specific implementation mode comprises the following steps:
Six-dimensional data sequence segment: $\mathrm{unitData}\in R^{6\times l}$; length of the six-dimensional data sequence segment: $l$;
Expectation and variance of each dimension of the six-dimensional data sequence segment:

$\mathrm{accXE}=\frac{1}{l}\sum_{j=1}^{l}\mathrm{unitData}(1,j)$, $\mathrm{accYE}=\frac{1}{l}\sum_{j=1}^{l}\mathrm{unitData}(2,j)$, $\mathrm{accZE}=\frac{1}{l}\sum_{j=1}^{l}\mathrm{unitData}(3,j)$,

$\mathrm{accXV}=\left(\frac{1}{l}\sum_{j=1}^{l}(\mathrm{unitData}(1,j)-\mathrm{accXE})^2\right)^{\frac{1}{2}}$, $\mathrm{accYV}=\left(\frac{1}{l}\sum_{j=1}^{l}(\mathrm{unitData}(2,j)-\mathrm{accYE})^2\right)^{\frac{1}{2}}$, $\mathrm{accZV}=\left(\frac{1}{l}\sum_{j=1}^{l}(\mathrm{unitData}(3,j)-\mathrm{accZE})^2\right)^{\frac{1}{2}}$,

$\mathrm{gyroXE}=\frac{1}{l}\sum_{j=1}^{l}\mathrm{unitData}(4,j)$, $\mathrm{gyroYE}=\frac{1}{l}\sum_{j=1}^{l}\mathrm{unitData}(5,j)$, $\mathrm{gyroZE}=\frac{1}{l}\sum_{j=1}^{l}\mathrm{unitData}(6,j)$,

$\mathrm{gyroXV}=\left(\frac{1}{l}\sum_{j=1}^{l}(\mathrm{unitData}(4,j)-\mathrm{gyroXE})^2\right)^{\frac{1}{2}}$, $\mathrm{gyroYV}=\left(\frac{1}{l}\sum_{j=1}^{l}(\mathrm{unitData}(5,j)-\mathrm{gyroYE})^2\right)^{\frac{1}{2}}$, $\mathrm{gyroZV}=\left(\frac{1}{l}\sum_{j=1}^{l}(\mathrm{unitData}(6,j)-\mathrm{gyroZE})^2\right)^{\frac{1}{2}}$,
wherein:
accX represents that the quantity of the accX is related to the X-axis component of the three-dimensional acceleration data;
accY represents that the quantity of the accY is related to the Y-axis component of the three-dimensional acceleration data;
accZ represents that the quantity of the accZ is related to the Z-axis component of the three-dimensional acceleration data;
gyroX represents that the quantity it is in is related to the X-axis component of the three-dimensional gyroscope data;
gyroY represents that the quantity in which it is located is related to the Y-axis component of the three-dimensional gyroscope data;
gyroZ represents that the quantity it is in is related to the Z-axis component of the three-dimensional gyroscope data;
accXE represents the expectation of the X-axis component of the three-dimensional acceleration data;
The expectations and variances of the other axis components are denoted analogously to accXE above and are not described one by one.
Obtaining a data floating proportion matrix unitDataCorrect:
$$\mathrm{unitDataCorrection}=\begin{pmatrix}\frac{\mathrm{unitData}(1,1)-\mathrm{accXE}}{\mathrm{accXV}} & \cdots & \frac{\mathrm{unitData}(1,l)-\mathrm{accXE}}{\mathrm{accXV}}\\ \vdots & \ddots & \vdots\\ \frac{\mathrm{unitData}(6,1)-\mathrm{gyroZE}}{\mathrm{gyroZV}} & \cdots & \frac{\mathrm{unitData}(6,l)-\mathrm{gyroZE}}{\mathrm{gyroZV}}\end{pmatrix};$$
First, the number of segmentation sections sectionNum and the segmentation scale sectionMeasure are set. Then unitDataCorrection is divided into blocks along the row direction: the width of each of the first sectionNum−1 blocks equals the segmentation scale, while the width of the last block does not necessarily reach the segmentation scale.
For the first line of data of unitDataCorrection, the following calculation is made:
when $i=1,\dots,\mathrm{sectionNum}-1$,

$\mathrm{accXRoughShape}(i)=\frac{1}{\mathrm{sectionMeasure}}\sum_{j=(i-1)\times \mathrm{sectionMeasure}+1}^{i\times \mathrm{sectionMeasure}}\mathrm{unitDataCorrection}(1,j)$;

when $i=\mathrm{sectionNum}$,

$\mathrm{accXRoughShape}(i)=\frac{1}{l-(\mathrm{sectionNum}-1)\times \mathrm{sectionMeasure}}\sum_{j=(i-1)\times \mathrm{sectionMeasure}+1}^{l}\mathrm{unitDataCorrection}(1,j)$;
Here accXRoughShape(i) denotes the shape description sequence of the data floating proportion matrix corresponding to the X-axis of the three-dimensional acceleration data sequence segment; the other shape description sequences are denoted in the same way as accXRoughShape(i) and are not described one by one.
Similar processing is performed for the data of the other rows of unitDataCorrection.
And finally obtaining the basic motion characteristic actionFeature of the motion:
$$\mathrm{actionFeature}=\begin{pmatrix}\mathrm{accXE}\\ \mathrm{accXV}\\ \{\mathrm{accXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{accYE}\\ \mathrm{accYV}\\ \{\mathrm{accYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{accZE}\\ \mathrm{accZV}\\ \{\mathrm{accZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroXE}\\ \mathrm{gyroXV}\\ \{\mathrm{gyroXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroYE}\\ \mathrm{gyroYV}\\ \{\mathrm{gyroYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroZE}\\ \mathrm{gyroZV}\\ \{\mathrm{gyroZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\end{pmatrix};$$
wherein:
accXRoughShape represents a shape description sequence of the X-axis component of the three-dimensional acceleration data shape;
accYRoughShape represents a shape description sequence of the Y-axis component of the three-dimensional acceleration data shape;
accZRoughShape represents a shape description sequence of the Z-axis component of the three-dimensional acceleration data shape;
gyroXRoughShape represents a shape description sequence of the X-axis component of the three-dimensional gyroscope data shape;
gyroYRoughShape represents a shape description sequence of a Y-axis component of a three-dimensional gyroscope data shape;
gyroZRoughShape represents a shape description sequence of the Z-axis component of the three-dimensional gyroscope data shape;
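The whole basic-action-feature computation above can be sketched as follows. As before, sectionMeasure = l // sectionNum and a non-zero per-axis variance are assumptions, and the result is returned as one (E, V, roughShape) triple per axis rather than as a single stacked vector:

```python
def extract_action_feature(unit_data, section_num=5):
    """Build the basic action feature from a 6 x l segment unitData.

    unit_data: 6 rows (accX, accY, accZ, gyroX, gyroY, gyroZ), each of length l.
    Returns a list of (E, V, rough_shape) tuples, one per axis.
    """
    l = len(unit_data[0])
    section_measure = l // section_num                    # assumed segmentation scale
    feature = []
    for row in unit_data:
        E = sum(row) / l
        V = (sum((x - E) ** 2 for x in row) / l) ** 0.5   # assumed non-zero
        corrected = [(x - E) / V for x in row]            # one row of unitDataCorrection
        rough_shape = []
        for i in range(1, section_num + 1):
            start = (i - 1) * section_measure
            end = i * section_measure if i < section_num else l
            block = corrected[start:end]
            rough_shape.append(sum(block) / len(block))
        feature.append((E, V, rough_shape))
    return feature
```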
S205, judging whether the extracted features of the basic action are similar to the feature standard of the basic action; if so, going to step S206, and if not, going to step S209. The implementation of S205 is described in detail as follows:
When the program has extracted a group of signal change features from the vector value data sequence segment generated by a basic action of the motion, it compares the extracted basic action features with the current feature standard of the basic action, and thereby judges whether the basic action represented by this group of signal change features and the previous basic actions belong to the same type of action. Let actionFeature be the basic action feature of the motion extracted by the program:
$$\mathrm{actionFeature}=\begin{pmatrix}\mathrm{accXE}\\ \mathrm{accXV}\\ \{\mathrm{accXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{accYE}\\ \mathrm{accYV}\\ \{\mathrm{accYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{accZE}\\ \mathrm{accZV}\\ \{\mathrm{accZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroXE}\\ \mathrm{gyroXV}\\ \{\mathrm{gyroXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroYE}\\ \mathrm{gyroYV}\\ \{\mathrm{gyroYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroZE}\\ \mathrm{gyroZV}\\ \{\mathrm{gyroZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\end{pmatrix}$$
Let actionFeatureStd be the basic action feature standard of the motion:
$$\mathrm{actionFeatureStd}=\begin{pmatrix}\mathrm{accXE}\\ \mathrm{accXV}\\ \{\mathrm{accXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{accYE}\\ \mathrm{accYV}\\ \{\mathrm{accYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{accZE}\\ \mathrm{accZV}\\ \{\mathrm{accZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroXE}\\ \mathrm{gyroXV}\\ \{\mathrm{gyroXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroYE}\\ \mathrm{gyroYV}\\ \{\mathrm{gyroYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \mathrm{gyroZE}\\ \mathrm{gyroZV}\\ \{\mathrm{gyroZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\end{pmatrix}$$
accXRoughShape represents a shape description sequence of the X-axis component of the three-dimensional acceleration data shape;
accYRoughShape represents a shape description sequence of the Y-axis component of the three-dimensional acceleration data shape;
accZRoughShape represents a shape description sequence of the Z-axis component of the three-dimensional acceleration data shape;
gyroXRoughShape represents a shape description sequence of the X-axis component of the three-dimensional gyroscope data shape;
gyroYRoughShape represents a shape description sequence of a Y-axis component of a three-dimensional gyroscope data shape;
gyroZRoughShape represents a shape description sequence of the Z-axis component of the three-dimensional gyroscope data shape;
Set a threshold for the acceleration data shape similarity, accRoughShapeDifferThreshold, with 0.1 ≤ accRoughShapeDifferThreshold ≤ 0.3;
set a threshold for the gyroscope data shape similarity, gyroRoughShapeDifferThreshold, with 0.2 ≤ gyroRoughShapeDifferThreshold ≤ 0.4;
set a feature similarity threshold featureSimilarityEvaluateThreshold ∈ {3, 4};
initialize the feature similarity counter featureSimilarityEvaluate to 0.
the specific algorithm comprises the following steps:
A. The shape difference description is generated from the shapes of the basic action feature actionFeature and the basic action feature standard actionFeatureStd on each component; the shape difference description is

$$\begin{pmatrix}\{\mathrm{accXRoughShapeDiffer}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \{\mathrm{accYRoughShapeDiffer}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \{\mathrm{accZRoughShapeDiffer}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \{\mathrm{gyroXRoughShapeDiffer}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \{\mathrm{gyroYRoughShapeDiffer}(i)\}_{1\le i\le \mathrm{sectionNum}}\\ \{\mathrm{gyroZRoughShapeDiffer}(i)\}_{1\le i\le \mathrm{sectionNum}}\end{pmatrix}.$$
wherein:
accXRoughShapeDiffer represents the shape difference of the shape description sequence of the X-axis component of the two three-dimensional acceleration data;
accYRoughShapeDiffer represents a shape difference of shape description sequences of Y-axis components of two three-dimensional acceleration data;
accZRoughShapeDiffer represents a shape difference of shape description sequences of Z-axis components of two three-dimensional acceleration data;
gyroXRoughShapeDiffer represents the shape difference of the shape description sequence of the X-axis component of the two three-dimensional gyroscope data;
gyroYRoughShapeDiffer represents a shape difference of a shape description sequence of a Y-axis component of two three-dimensional gyroscope data;
gyroZRoughShapeDiffer represents a shape difference of shape description sequences of Z-axis components of two three-dimensional gyroscope data;
the following calculations were made:
when $1\le i\le \mathrm{sectionNum}$,
accXRoughShapeDiffer(i)=actionFeature.accXRoughShape(i)-actionFeatureStd.accXRoughShape(i);
The other components are calculated in the same way and are not repeated here.
B. Calculate the expectation and variance of the shape difference on each component:
$$\mathrm{accXRoughShapeDifferE}=\frac{1}{\mathrm{sectionNum}}\sum_{i=1}^{\mathrm{sectionNum}}\mathrm{accXRoughShapeDiffer}(i);$$

$$\mathrm{accXRoughShapeDifferV}=\sqrt{\frac{1}{\mathrm{sectionNum}}\sum_{i=1}^{\mathrm{sectionNum}}\bigl(\mathrm{accXRoughShapeDiffer}(i)-\mathrm{accXRoughShapeDifferE}\bigr)^{2}};$$
The expectation and variance of the shape difference on the other components are calculated in the same way and are not repeated here.
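For illustration only, the following minimal Python sketch carries out steps A and B for a single component; the function name, the use of plain lists, and the treatment of one component at a time are assumptions made for the example rather than part of the described method.

    import math

    def shape_differ_stats(feature_shape, std_shape):
        # feature_shape / std_shape: shape description sequences of one component,
        # e.g. actionFeature.accXRoughShape and actionFeatureStd.accXRoughShape.
        section_num = len(feature_shape)
        # Step A: per-section shape difference.
        differ = [feature_shape[i] - std_shape[i] for i in range(section_num)]
        # Step B: expectation and variance of the difference, as defined by the
        # formulas above.
        differ_e = sum(differ) / section_num
        differ_v = math.sqrt(sum((d - differ_e) ** 2 for d in differ) / section_num)
        return differ, differ_e, differ_v

Repeating this for the six components yields the six variances used in step C below.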
C. Judge the degree of similarity between the extracted basic action feature and the basic action feature standard:
If accXRoughShapeDifferV ≤ accRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if accYRoughShapeDifferV ≤ accRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if accZRoughShapeDifferV ≤ accRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if gyroXRoughShapeDifferV ≤ gyroRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if gyroYRoughShapeDifferV ≤ gyroRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1;
if gyroZRoughShapeDifferV ≤ gyroRoughShapeDifferThreshold, featureSimilarityEvaluate is incremented by 1.
If featureSimilarityEvaluate ≥ featureSimilarityEvaluateThreshold, the program considers the extracted basic action feature to be similar to the basic action feature standard;
if featureSimilarityEvaluate < featureSimilarityEvaluateThreshold, the program considers the extracted basic action feature to be dissimilar to the basic action feature standard.
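A minimal sketch of the step C decision, assuming the six variances from step B are already available; the default thresholds are merely example values taken from the ranges stated above.

    def feature_similar(acc_differ_v, gyro_differ_v,
                        acc_threshold=0.2, gyro_threshold=0.3,
                        similarity_threshold=4):
        # acc_differ_v / gyro_differ_v: variances of the shape differences on the
        # X, Y and Z acceleration / gyroscope components (three values each).
        feature_similarity_evaluate = 0
        feature_similarity_evaluate += sum(1 for v in acc_differ_v if v <= acc_threshold)
        feature_similarity_evaluate += sum(1 for v in gyro_differ_v if v <= gyro_threshold)
        return feature_similarity_evaluate >= similarity_threshold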
S206. Store the extracted basic action feature in the basic action feature queue, and generate a new basic action feature standard from the existing basic action features and the extracted feature.
The implementation of S206 is described in detail as follows:
if the program finds that the extracted basic action features are similar to the current feature standard of the basic action, the program regenerates a new basic action feature standard by using the extracted basic action features and the current feature standard of the basic action.
Let actionFeature be the extracted basic action feature of the motion and actionFeatureStd be the current basic action feature standard of the motion;
the algorithm specifically comprises the following steps:
actionFeatureStd.accXE = (1/2)(actionFeature.accXE + actionFeatureStd.accXE);
actionFeatureStd.accXV = (1/2)(actionFeature.accXV + actionFeatureStd.accXV);
for 1 ≤ i ≤ sectionNum,
actionFeatureStd.accXRoughShape(i) = (1/2)(actionFeature.accXRoughShape(i) + actionFeatureStd.accXRoughShape(i));
The other components are updated in the same way and are not repeated here.
The updated actionFeatureStd (including actionFeatureStd.accXRoughShape and the corresponding fields of the other components) is the new basic action feature standard.
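The S206 update therefore amounts to averaging the freshly extracted feature with the current standard field by field. A minimal sketch, assuming each feature is stored as a dictionary mapping field names (accXE, accXV, accXRoughShape, and so on) to scalars or shape sequences; this representation is an assumption of the example.

    def update_feature_std(action_feature, action_feature_std):
        new_std = {}
        for key, std_value in action_feature_std.items():
            feature_value = action_feature[key]
            if isinstance(std_value, list):
                # Shape description sequence: average element by element.
                new_std[key] = [(a + b) / 2 for a, b in zip(feature_value, std_value)]
            else:
                # Scalar field such as an expectation or a variance.
                new_std[key] = (feature_value + std_value) / 2
        return new_std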
S207. Judge whether the number of basic action features in the basic action feature queue has reached the set requirement; if so, go to step S208, otherwise go to step S212.
S208. Calculate the statistical features of the basic action of the motion from the basic action feature queue of the motion, and store them in flash memory as the description of the basic action of the motion.
The statistical analysis of the basic action feature queue of the motion specifically comprises the following steps:
based on feature extraction from the six-dimensional data sequence segments, a number of basic action features of a certain motion are obtained and form the basic action feature queue of the motion; the expectation and variance of the sequence formed by the data at each corresponding position across the basic action features are calculated; the resulting expectations and variances form a knowledge point of the basic action of the motion; and a number of such knowledge points form the knowledge base.
The specific implementation mode for extracting the knowledge points of the basic actions from the basic action feature sequence comprises the following steps:
The basic action feature number actionFeatureNum is set, giving a plurality of basic action features {actionFeature(k)}, 1 ≤ k ≤ actionFeatureNum.
Wherein:
$$\mathrm{actionFeature}(k)=\begin{pmatrix}
\mathrm{accXE}\\
\mathrm{accXV}\\
\{\mathrm{accXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\
\mathrm{accYE}\\
\mathrm{accYV}\\
\{\mathrm{accYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\
\mathrm{accZE}\\
\mathrm{accZV}\\
\{\mathrm{accZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\
\mathrm{gyroXE}\\
\mathrm{gyroXV}\\
\{\mathrm{gyroXRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\
\mathrm{gyroYE}\\
\mathrm{gyroYV}\\
\{\mathrm{gyroYRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}\\
\mathrm{gyroZE}\\
\mathrm{gyroZV}\\
\{\mathrm{gyroZRoughShape}(i)\}_{1\le i\le \mathrm{sectionNum}}
\end{pmatrix};$$
A knowledge point for this basic action can then be generated by statistical calculation; the knowledge point is
$$\begin{pmatrix}
(\mathrm{accXEE},\ \mathrm{accXEV})\\
(\mathrm{accXVE},\ \mathrm{accXVV})\\
\{(\mathrm{accXRoughShape}(i).E,\ \mathrm{accXRoughShape}(i).V)\}_{1\le i\le \mathrm{sectionNum}}\\
(\mathrm{accYEE},\ \mathrm{accYEV})\\
(\mathrm{accYVE},\ \mathrm{accYVV})\\
\{(\mathrm{accYRoughShape}(i).E,\ \mathrm{accYRoughShape}(i).V)\}_{1\le i\le \mathrm{sectionNum}}\\
(\mathrm{accZEE},\ \mathrm{accZEV})\\
(\mathrm{accZVE},\ \mathrm{accZVV})\\
\{(\mathrm{accZRoughShape}(i).E,\ \mathrm{accZRoughShape}(i).V)\}_{1\le i\le \mathrm{sectionNum}}\\
(\mathrm{gyroXEE},\ \mathrm{gyroXEV})\\
(\mathrm{gyroXVE},\ \mathrm{gyroXVV})\\
\{(\mathrm{gyroXRoughShape}(i).E,\ \mathrm{gyroXRoughShape}(i).V)\}_{1\le i\le \mathrm{sectionNum}}\\
(\mathrm{gyroYEE},\ \mathrm{gyroYEV})\\
(\mathrm{gyroYVE},\ \mathrm{gyroYVV})\\
\{(\mathrm{gyroYRoughShape}(i).E,\ \mathrm{gyroYRoughShape}(i).V)\}_{1\le i\le \mathrm{sectionNum}}\\
(\mathrm{gyroZEE},\ \mathrm{gyroZEV})\\
(\mathrm{gyroZVE},\ \mathrm{gyroZVV})\\
\{(\mathrm{gyroZRoughShape}(i).E,\ \mathrm{gyroZRoughShape}(i).V)\}_{1\le i\le \mathrm{sectionNum}}
\end{pmatrix};$$
wherein,
$$\mathrm{accXEE}=\frac{1}{\mathrm{actionFeatureNum}}\sum_{i=1}^{\mathrm{actionFeatureNum}}\mathrm{actionFeature}(i).\mathrm{accXE};$$

$$\mathrm{accXEV}=\sqrt{\frac{1}{\mathrm{actionFeatureNum}}\sum_{i=1}^{\mathrm{actionFeatureNum}}\bigl(\mathrm{actionFeature}(i).\mathrm{accXE}-\mathrm{accXEE}\bigr)^{2}};$$

$$\mathrm{accXVE}=\frac{1}{\mathrm{actionFeatureNum}}\sum_{i=1}^{\mathrm{actionFeatureNum}}\mathrm{actionFeature}(i).\mathrm{accXV};$$

$$\mathrm{accXVV}=\sqrt{\frac{1}{\mathrm{actionFeatureNum}}\sum_{i=1}^{\mathrm{actionFeatureNum}}\bigl(\mathrm{actionFeature}(i).\mathrm{accXV}-\mathrm{accXVE}\bigr)^{2}};$$
For 1 ≤ k ≤ sectionNum,

$$\mathrm{accXRoughShape}(k).E=\frac{1}{\mathrm{actionFeatureNum}}\sum_{i=1}^{\mathrm{actionFeatureNum}}\mathrm{actionFeature}(i).\mathrm{accXRoughShape}(k),$$

$$\mathrm{accXRoughShape}(k).V=\sqrt{\frac{1}{\mathrm{actionFeatureNum}}\sum_{i=1}^{\mathrm{actionFeatureNum}}\bigl(\mathrm{actionFeature}(i).\mathrm{accXRoughShape}(k)-\mathrm{accXRoughShape}(k).E\bigr)^{2}};$$
The other components are calculated in the same way and are not repeated here.
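As an illustration of the statistical calculation in S208, the sketch below builds the expectation/variance pairs over the basic action feature queue; the dictionary representation of a feature and the helper names are assumptions of the example, and extending the field lists to all six components is mechanical.

    import math

    def mean_and_deviation(values):
        e = sum(values) / len(values)
        v = math.sqrt(sum((x - e) ** 2 for x in values) / len(values))
        return e, v

    def knowledge_point(feature_queue, scalar_keys, shape_keys):
        # feature_queue: queue of basic action features, each a dictionary;
        # scalar_keys e.g. ("accXE", "accXV"), shape_keys e.g. ("accXRoughShape",).
        point = {}
        for key in scalar_keys:
            point[key] = mean_and_deviation([f[key] for f in feature_queue])
        for key in shape_keys:
            section_num = len(feature_queue[0][key])
            point[key] = [mean_and_deviation([f[key][i] for f in feature_queue])
                          for i in range(section_num)]
        return point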
S209. The basic action suspicion counter is incremented by 1.
S210. Judge whether the basic action suspicion counter has reached the suspicion threshold; if so, go to step S211, otherwise go to step S212.
S211. Sliding pretreatment: empty the basic action feature queue, clear the basic action feature standard, and reset the basic action suspicion counter to zero.
S212. Sliding processing: overwrite the processed data with the unprocessed sensor data, fill the rest of the buffer queue with newly acquired sensor data, regenerate the first data queue and the second data queue from the new buffered data queue, and go to step S202.
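S212 is essentially a sliding-window update of the sensor buffer. A minimal sketch, in which the regeneration of the first and second data queues from the new buffer is left to the preprocessing described earlier; the function name and arguments are assumptions of the example.

    def slide_buffer(buffer, processed_len, new_samples):
        # Keep the sensor samples that have not been processed yet, then fill the
        # rest of the buffer with newly acquired samples so its length is unchanged.
        kept = buffer[processed_len:]
        new_buffer = kept + new_samples[:len(buffer) - len(kept)]
        return new_buffer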
Fig. 3 is a block diagram of the intelligent wearable device according to the present invention. The invention provides an intelligent wearable device, for example a sports wristband, comprising a feature extraction module 301, a motion description library 303 and a sensing module 304. Using the sensing module 304 and the feature extraction method described above, the feature extraction module 301 may establish the motion description library 303, which includes descriptions of at least one basic action of a motion. It is clear to one skilled in the art that the modules described herein can be implemented in hardware, or by means of software plus a necessary general hardware platform.
Referring to fig. 4, a feature extraction module embodiment of an intelligent wearable device is provided, which generally comprises: a first unit 401, configured to correspondingly implement the function of step S101 in fig. 1; a second unit 402, configured to correspondingly implement the function of step S102 in fig. 1; a third unit 403, configured to correspondingly implement the function of step S103 in fig. 1; a fourth unit 404, configured to correspondingly implement the function of step S104 in fig. 1; and a fifth unit 405, configured to correspondingly implement the function of step S105 in fig. 1. It will be clear to a person skilled in the art that the modules and/or units described herein may be implemented in hardware, or by means of software plus a necessary general hardware platform.
The invention provides a general algorithm for discovering periodic signals and extracting information: search for similar vector-valued signal segments in the six-dimensional vector-valued sequence formed by the synchronous three-axis acceleration signals and three-axis gyroscope signals, extract the features of those vector-valued signal segments, and finally compute the corresponding statistical features from a plurality of such vector-valued signal features. The algorithm is not limited to wristband applications and can be embedded in many other systems.
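Read as a whole, the flow can be outlined as below; this is only a sketch of the loop described in the specification, every helper supplied through the steps object stands for one of the procedures defined above, and required_num and suspicion_threshold are illustrative values.

    def extract_basic_action_description(sensor_buffer, steps,
                                         required_num=10, suspicion_threshold=3):
        # steps is any object providing: preprocess, find_segment, find_beats,
        # extract_feature, similar, update_std, knowledge_point and slide.
        feature_queue, feature_std, suspicion = [], None, 0
        while True:
            first_queue, second_queue = steps.preprocess(sensor_buffer)
            segment = steps.find_segment(first_queue)            # specific motion segment
            beats = steps.find_beats(second_queue, segment)      # beat chain
            feature = steps.extract_feature(sensor_buffer, beats)
            if feature_std is None or steps.similar(feature, feature_std):
                feature_queue.append(feature)                    # S206
                if feature_std is None:
                    feature_std = feature                        # first feature becomes the standard
                else:
                    feature_std = steps.update_std(feature, feature_std)
                if len(feature_queue) >= required_num:           # S207
                    return steps.knowledge_point(feature_queue)  # S208
            else:
                suspicion += 1                                   # S209
                if suspicion >= suspicion_threshold:             # S210, S211
                    feature_queue, feature_std, suspicion = [], None, 0
            sensor_buffer = steps.slide(sensor_buffer)           # S212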
It should be understood that the above embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same, and those skilled in the art can modify the technical solutions described in the above embodiments, or make equivalent substitutions for some technical features; and such modifications and substitutions are intended to be included within the scope of the appended claims.

Claims (26)

1. A method for extracting characteristics of human motion is characterized by sequentially comprising the following steps:
preprocessing, including converting the three-dimensional gyroscope component in the acquired data queue into a one-dimensional first data queue for measuring the activity amplitude of the human body, and converting the three-dimensional acceleration component in the acquired data queue into a one-dimensional second data queue for measuring the change beat of the periodic motion of the human body;
processing to find out the beginning and the end of a specific motion segment in the first data queue;
processing to find the beat chain in the specific motion segment in the second data queue, and providing an average description of the beat chain;
extracting basic action characteristics;
judging whether the number of extracted and stored basic action features has reached the set required number, and if so, generating the description of the basic action of the motion; otherwise, performing the sliding processing, returning to the processing of finding the beginning and end of the specific motion segment in the first data queue, and continuing in a loop.
2. The method of extracting human motion features according to claim 1, wherein: the processing of finding the chain of beats in the particular motion segment in the second data queue also provides the number of beats and the start and end of multiple beats.
3. The method of extracting human motion features according to claim 2, wherein: the average description of the beat chain refers to the average description of a plurality of beats in the beat chain, and the average description of the plurality of beats is obtained by adopting the feature extraction based on wavelet analysis.
4. The method of extracting human motion features according to claim 3, wherein: the feature extraction based on wavelet analysis comprises: for each sequence point in each beat, performing normalization of the deviation relative to the variance.
5. The method of extracting human motion features according to claim 1, wherein: the processing of searching the beat chain in the specific motion segment in the second data queue adopts a secondary clustering analysis method.
6. The method of extracting human motion features according to claim 5, wherein: the first-level clustering of the second-level clustering analysis method adopts a C-means algorithm idea, and a difference comparison technology is mainly adopted in the classification process.
7. The method of extracting human motion features according to claim 6, wherein: the second-level clustering of the second-level clustering analysis method adopts a C-means algorithm idea, and a similarity comparison technology is mainly adopted in the classification process.
8. The method of extracting human motion features according to claim 1, wherein: the extraction processing of the basic action characteristics adopts wavelet analysis on each dimensional component of the vector numerical value group.
9. The method of extracting human motion features according to claim 1, wherein: the extraction processing of the basic action features adopts similarity comparison based on the basic action features and basic action feature standards, and the basic action feature standards are dynamically updated as the basic action feature queue is filled.
10. The method of extracting human motion features according to claim 9, wherein: the similarity comparison between the basic action features and the basic action feature standard is performed under a suspicion mechanism, and when the degree of suspicion reaches a specified level, the basic action feature queue and the basic action feature standard are emptied.
11. The method of extracting human motion features according to claim 9, wherein: the similarity comparison of the basic action characteristics and the basic action characteristic standard adopts multi-layer comparison.
12. The method of extracting human motion features according to claim 11, wherein: the multi-level comparison includes a similarity comparison between the average descriptions of the beat chains.
13. The method of extracting human motion features according to claim 1, wherein: the specific motion segment refers to the longest span among the plurality of motion segments found in the first data queue.
14. The method of extracting human motion features according to claim 1, wherein: the description of the basic motion of the motion includes a mean value and a mean square value of each dimensional component of the vector value set of the respective basic motion features.
15. The method of extracting human motion features according to claim 1, wherein: the processing of converting the three-dimensional gyroscope components in the acquired data queue into a one-dimensional first data queue for measuring the human body activity amplitude is as follows: for each three-dimensional gyroscope component, a three-dimensional vector is formed; its length is added to the lengths corresponding to all history vectors arranged before it in the buffered data queue, and the result is averaged; the calculation result is used as the value at the corresponding position in the first data queue.
16. The method of extracting human motion features according to claim 1, wherein: the processing of converting the three-dimensional acceleration component in the acquired data queue into a one-dimensional second data queue for measuring the change beat of the periodic motion of the human body is as follows: and summing the three-dimensional acceleration components for each sequence point.
17. The method of extracting human motion features according to claim 1, wherein: the process of generating the description of the basic motion of the motion includes: and calculating the statistical characteristics of the basic motion of the motion by using the motion basic motion characteristic queue, and storing the statistical characteristics in a flash memory as the description of the basic motion of the motion.
18. The method of extracting human motion features according to claim 1, wherein: the processing for searching the beat chain in the specific motion segment in the second data queue includes: and searching the information of the beat chain in the specific motion segment in the second data queue according to the starting position and the ending position searched in the first data queue, wherein the information comprises the characteristic information of the beat waveform, the number of beats, and the starting position and the ending position of each beat.
19. The method of extracting human motion features according to claim 18, wherein: the extraction processing of the basic action features comprises the following steps: and extracting waveform characteristics of the triaxial accelerometer signal segment and the triaxial gyroscope signal segment which are synchronous with the beat from the sensor data buffer queue according to the starting position and the ending position of each beat of the information of the beat chain as basic action characteristics, and if the basic action characteristics are extracted for the first time, storing the basic action characteristics as a first element of the characteristic queue and using the basic action characteristics as a characteristic standard of the basic action.
20. The method of extracting human motion features according to claim 1, wherein: the processing for extracting the basic motion features further comprises: and judging whether the characteristics of the basic action extracted currently are similar to the characteristic standards of the basic action.
21. The method of extracting human motion features according to claim 20, wherein: the judging whether the currently extracted feature of the basic action is similar to the feature standard of the basic action comprises: subtracting the waveform description of the basic action feature standard from the waveform description of the current basic action feature to obtain a difference; calculating the expectation of the waveform difference in each dimension; calculating the variance of the waveform difference in each dimension; and judging: if the number of dimensions in which the variance of the waveform difference is less than a fixed value exceeds a set number, the two waveform features are considered similar, otherwise they are not similar.
22. The method of extracting human motion features according to claim 20, wherein: if the result of judging whether the feature of the basic action extracted currently is similar to the feature standard of the basic action is similar, the extraction processing of the feature of the basic action further comprises: storing the extracted basic action characteristics in a queue of basic action characteristics; and generating a new feature standard of the basic action by using the existing basic action features and the extracted features.
23. The method of extracting human motion features according to claim 20, wherein: if the result of judging whether the currently extracted feature of the basic action is similar to the feature standard of the basic action is not similar, the suspicion count is accumulated, and it is then judged whether the basic action feature suspicion counter has reached the suspicion threshold; if so, the sliding pretreatment is performed first and then the sliding processing, and if not, the sliding processing is performed directly.
24. The method for extracting features of human motion according to claim 1 or 23, wherein: the sliding process comprises the following steps: covering the processed data with the unprocessed sensor data, and filling the buffer queue with the newly acquired sensor data; and regenerating the first data queue and the second data queue by using the new buffer data queue.
25. The method of extracting human motion features according to claim 23, wherein: the sliding pretreatment comprises: emptying the queue of basic action features, clearing the basic action feature standard, and clearing the basic action suspicion counter.
26. An intelligent wearable device, characterized by comprising a module for extracting features of human body motion, the module comprising:
the preprocessing unit is used for converting the three-dimensional gyroscope component in the acquired data queue into a one-dimensional first data queue for measuring the activity amplitude of the human body and converting the three-dimensional acceleration component in the acquired data queue into a one-dimensional second data queue for measuring the change beat of the periodic motion of the human body;
a first searching unit, configured to perform processing for searching for the start and end of a specific motion segment in the first data queue;
the second searching unit is used for searching for the beat chain in the specific motion segment in the second data queue and providing an average description of the beat chain;
a feature extraction unit for extracting a basic motion feature; and
a motion description generation unit for performing a judgment process of whether the number of the extracted and stored basic motion features reaches a set required number, if so, generating a description of the basic motion of the motion; otherwise, the sliding process is performed, and the process returns to the first search unit to perform the loop process.
CN201410164185.XA 2014-03-06 2014-04-22 The feature extracting method of a kind of Intelligent worn device and human motion Active CN103892840B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410164185.XA CN103892840B (en) 2014-03-06 2014-04-22 The feature extracting method of a kind of Intelligent worn device and human motion

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201410083087.3 2014-03-06
CN201410083087 2014-03-06
CN201410164185.XA CN103892840B (en) 2014-03-06 2014-04-22 The feature extracting method of a kind of Intelligent worn device and human motion

Publications (2)

Publication Number Publication Date
CN103892840A true CN103892840A (en) 2014-07-02
CN103892840B CN103892840B (en) 2015-11-18

Family

ID=50984654

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410164185.XA Active CN103892840B (en) 2014-03-06 2014-04-22 The feature extracting method of a kind of Intelligent worn device and human motion

Country Status (1)

Country Link
CN (1) CN103892840B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040186695A1 (en) * 2003-03-07 2004-09-23 Seiko Epson Corporation Body motion detection device, pitch meter, wristwatch-type information processing device, method for controlling thereof, control program, and storage medium
CN101242879A (en) * 2005-09-02 2008-08-13 本田技研工业株式会社 Motion guide device, and its control system and control program
CN1931090A (en) * 2005-09-16 2007-03-21 万威科研有限公司 System and method for measuring gait kinematics information
CN101294979A (en) * 2007-04-27 2008-10-29 陈侑郁 Wrist watch type acceleration sensing module for measuring amount of exercise
US20120323520A1 (en) * 2011-06-20 2012-12-20 Invensense, Inc. Motion determination

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104383674B (en) * 2014-10-21 2017-01-25 小米科技有限责任公司 Counting method and device used for intelligent wearing equipment as well as intelligent wearing equipment
CN105760646A (en) * 2014-12-18 2016-07-13 中国移动通信集团公司 Method and device for activity classification
CN105075886A (en) * 2015-07-27 2015-11-25 河南科技大学 Dairy cow automatic feeding device and corresponding intelligent feeding system
CN105104291A (en) * 2015-07-27 2015-12-02 河南科技大学 Dairy cow motion state judging method and corresponding intelligent feeding method
CN105075886B (en) * 2015-07-27 2018-01-02 河南科技大学 A kind of milk cow automatic feeding device and corresponding intelligent feeding systems
CN105549737A (en) * 2015-12-09 2016-05-04 上海斐讯数据通信技术有限公司 Method and intelligent device for recording exercise times and exercise arm band
CN105617638A (en) * 2015-12-25 2016-06-01 深圳市酷浪云计算有限公司 Badminton racket swinging movement recognizing method and device
CN105617638B (en) * 2015-12-25 2019-04-05 深圳市酷浪云计算有限公司 Badminton racket swing action identification method and device
CN107289966A (en) * 2016-03-30 2017-10-24 日本电气株式会社 Method and apparatus for counting step number
CN106372673A (en) * 2016-09-06 2017-02-01 深圳市民展科技开发有限公司 Apparatus motion identification method
CN106730627A (en) * 2016-12-01 2017-05-31 上海长海医院 Foot recovers moving electron pin ring
CN109446914A (en) * 2018-09-28 2019-03-08 中山乐心电子有限公司 The method, apparatus and intelligent wearable device of detection movement accuracy

Also Published As

Publication number Publication date
CN103892840B (en) 2015-11-18

Similar Documents

Publication Publication Date Title
CN103892840B (en) The feature extracting method of a kind of Intelligent worn device and human motion
CN103908259B (en) The monitoring of a kind of Intelligent worn device and human motion and recognition methods
CN106289309B (en) Step-recording method and device based on 3-axis acceleration sensor
Brajdic et al. Walk detection and step counting on unconstrained smartphones
US9700241B2 (en) Gait analysis system and method
Jensen et al. Classification of kinematic swimming data with emphasis on resource consumption
US20110276304A1 (en) Determining energy expenditure of a user
Susi et al. Accelerometer signal features and classification algorithms for positioning applications
Zhang et al. Human activity recognition based on time series analysis using U-Net
RU2719945C2 (en) Apparatus, method and system for counting the number of cycles of periodic movement of a subject
EP3090684A1 (en) Pedometer and method for analyzing motion data
EP3090685A1 (en) Pedometer and method for analyzing motion data
US10264997B1 (en) Systems and methods for selecting accelerometer data to store on computer-readable media
Ashry et al. An LSTM-based descriptor for human activities recognition using IMU sensors
US11540748B2 (en) Method and system for gait detection of a person
EP3269303A1 (en) Method for determining the type of motion activity of a person and device for implementing same
Bajpai et al. Quantifiable fitness tracking using wearable devices
CN109758154B (en) Motion state determination method, device, equipment and storage medium
Luqian et al. Human activity recognition using time series pattern recognition model-based on tsfresh features
Yang et al. Comparing cross-subject performance on human activities recognition using learning models
CN112487902B (en) Exoskeleton-oriented gait phase classification method based on TCN-HMM
Wei et al. Unsupervised race walking recognition using smartphone accelerometers
Andrić et al. Sensor-based activity recognition and performance assessment in climbing: A review
Eskaf et al. Aggregated activity recognition using smart devices
Chawla et al. Using Machine Learning Techniques for User Specific Activity Recognition.

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190305

Address after: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Patentee after: Shenzhen Yingdekang Technology Co., Ltd.

Address before: Room 1509, Coastal Times East Block, 12069 Shennan Avenue, Nanshan District, Shenzhen City, Guangdong Province, 518000

Patentee before: SHENZHEN DEKAIRUI TECHNOLOGY CO., LTD.

TR01 Transfer of patent right

Effective date of registration: 20190328

Address after: 518000 Changyi Industrial Plant, No. 1 Huaning Road and Lirong Road, Dalang Street, Longhua District, Shenzhen City, Guangdong Province, 3 buildings and 11 floors

Patentee after: Shenzhen love Technology Co., Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Patentee before: Shenzhen Yingdekang Technology Co., Ltd.

TR01 Transfer of patent right